Observations on the State of Physics and AI Research

Recent Nobel Prize discussions have sparked debates on the “death of physics” and the rise of low-quality research papers in both Physics and AI. Here’s my take on it.

Oct 12, 2024 4 min read

Well, the plot twist in this year’s Nobel Prize has sparked more debates than an Oscars afterparty! Adding to this, Sabine Hossenfelder has released a provocative video claiming that physics is on its deathbed due to a surge of theoretical work like string theory and a flood of “useless” papers churned out by academics. As always, it’s an interesting take from Sabine.


Although I haven’t had much time to blog lately, this video and the debates around it have finally prompted me to write a post I’ve been meaning to write for a while.

The first issue is: can any line of research continue indefinitely without producing results?

Theoretical Inflation

For those unfamiliar, the string theory discussed in the video is the “Dilwale Dulhania Le Jayenge (DDLJ)” of physics — DDLJ being a Bollywood classic that has been screening daily in a Mumbai theater for 27 years. String theory has been playing on the physics stage for over 50 years, generating more than 10,000 research papers without delivering any “happily ever after” moment. Unlike DDLJ’s audience, fans of string theory are still waiting for that moment, while some brilliant minds continue to work on it. So, what’s the issue?

The problem lies in the prolonged focus on such theoretical concepts, which indicates a troubling trend of prioritizing unproven ideas over practical applications. If we look at the last century, physics was the blockbuster hit machine of science. We witnessed discoveries that truly transformed our daily lives — quantum mechanics, particle physics, nanotechnology, and more. In contrast, things seem to have slowed down for physics in the last decade. Sure, we’ve had exciting advances like gravitational waves and quantum computing, but the focus appears to have shifted from practical innovations to theories such as string theory. It’s fascinating stuff, but I can’t help wondering how a research program has survived for over 50 years without experimental proof.

And those 10,000+ papers remind me of the rise of paper garbage, not just in physics but also in the age of AI.

The Paper Factory

Check out this abstract straight out of a future Nolan movie:

Abstract: This paper examines how fine-tuning large language models (LLMs) in the morning or cold weather improves performance. We hypothesize that these conditions enhance CPU thermal efficiency, benefiting Low-Rank Adaptation (LoRA) adapters. Additionally, LLMs are known to perform better in the morning due to reduced cognitive load, as fewer interactions keep the model’s memory fresh compared to evening sessions, a behavior characterized as ‘LLM fatigue.’ When CPUs exit sleep mode, the model is also more refreshed, reflecting the ‘Large Language Brain (LLB) phenomenon,’ which simulates optimal cognitive conditions for knowledge distillation and fine-tuning. This paper provides insights from simulations that evaluate performance enhancements resulting from these environmental conditions.

For most readers, no further explanation is needed on the point I’m trying to make. However, for those who have not ventured into the AI ecosystem, welcome to the world of paper garbage.

While the above abstract is exaggerated, the issue of paper garbage is real and a significant concern in both physics and AI. I regularly come across papers that are just mathematical fiction without real-world testing or practical application, and some are outright ridiculous, often without any peer review. With tools like ChatGPT at our disposal, writing these papers has become even easier (the above abstract was generated by ChatGPT, not by research). In a recent discussion, I highlighted a paper claiming a “novel” method to speed up model inference by using addition instead of multiplication — an idea that ironically originated in an old GCC discussion thread and is no longer as effective on modern CPU architectures.
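For context, the addition-instead-of-multiplication idea is essentially the classic compiler trick known as strength reduction. The sketch below is my own minimal illustration (the function name is hypothetical, and it is not taken from the paper in question): a multiply by a constant rewritten as shifts and adds. On modern CPUs an integer multiply is already cheap, which is why this rarely buys anything today.

```python
def mul_by_10(x: int) -> int:
    """Multiply by 10 using only shifts and adds (strength reduction).

    x * 10 == x * 8 + x * 2 == (x << 3) + (x << 1)
    """
    return (x << 3) + (x << 1)


# Behaves identically to plain multiplication:
assert mul_by_10(7) == 7 * 10
assert mul_by_10(-3) == -30
```

Compilers like GCC have applied this kind of rewrite automatically for decades, which is part of why presenting it as a novel inference optimization rings hollow.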

It’s well known that both physics and AI face significant pressure to publish — for prestige, funding, branding, jobs, and more. But this “publish or perish” mentality, prioritized over serious research work, has created a situation where it’s all too easy to get lost in the garbage. Fantastic papers with meaningful research continue to be published and help the community, but the surge of subpar papers makes it difficult for everyone to cut through the noise and find valuable information, ultimately diluting the genuinely innovative work.

And finally: string theory vs. loop quantum gravity

For those who have watched the video and are intrigued, here are some interesting links to other articles and debates: