The pursuit of Artificial General Intelligence is not merely an aspiration; it is an active "race to the frontier," characterized by models continuously "leapfrogging each other," as articulated by Denny Fish, Tech Research Head and Portfolio Manager at Janus Henderson. This profound insight underpins the current dynamic in the AI landscape, signaling a sustained, aggressive phase of development and investment that shows no signs of abating. Fish’s commentary on CNBC's 'Money Movers' offered a trenchant analysis of the AI race, the implications for chip stocks, and the tangible realities governing the expansion of critical AI infrastructure.
Fish’s perspective is rooted in a consistent view of the evolving AI ecosystem. He observes major players like Google, Meta, Anthropic, OpenAI, and Microsoft all fixated on the distant, yet increasingly visible, goal of AGI. This shared objective fuels an incessant cycle of innovation. Each new model release, from successive versions of ChatGPT and Gemini to rivals such as Anthropic's Claude and xAI's Grok, demonstrates significant advancements, validating the belief that progress will continue in a series of dramatic, competitive leaps. This iterative improvement, far from being a transient phenomenon, is a fundamental characteristic of the current AI paradigm, driving both technological breakthroughs and significant capital expenditure.
The remarkable performance of key industry players and the consistent evolution of AI models underscore a crucial point: "scaling laws are firmly intact and we are still in an aggressive phase of GPU, TPU accelerator demand as we go through this training phase before we even get to the knee of the curve of inference." This statement from Fish highlights that the foundational principles governing AI model performance—where increased computational power and data lead to disproportionately better results—are holding strong. The implication for founders and investors is clear: the demand for specialized hardware designed for AI workloads, particularly GPUs and TPUs, will remain exceptionally high as companies continue to pour resources into training ever-larger and more complex models.
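The scaling laws Fish invokes are typically expressed in the research literature as empirical power laws. One commonly cited form (a sketch of the published "Chinchilla"-style relation, not a formulation from Fish's remarks) ties a model's training loss to its parameter count and training data:

```latex
L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

Here $L$ is the model's loss, $N$ the number of parameters, $D$ the number of training tokens, $E$ an irreducible loss floor, and $A$, $B$, $\alpha$, $\beta$ empirically fitted constants. The practical upshot matches Fish's point: as long as loss keeps falling predictably with more compute and data, labs have a direct incentive to keep buying accelerators and training larger models.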
Indeed, a cessation of this rapid advancement would be a far greater concern than any perceived market slowdown. If the models stopped leapfrogging each other, Fish argues, that would be genuine cause for concern, because it would suggest the scaling laws are beginning to break down. Such a scenario, currently unseen, would signify a fundamental bottleneck in AI progress, potentially dampening the long-term investment thesis.
Related Reading
- Meta’s Google AI Chip Talks Reshape Cloud and Silicon Battleground
- AI Market Maturation Driven by Broadening Competition
Despite the aggressive pace of technological advancement and investment, Fish emphasizes that the expansion of AI infrastructure is not without its inherent limitations. Unlike the relatively unconstrained build-out of fiber optic networks or cell towers during the dot-com era, the current AI infrastructure boom faces "natural governors." These include the physical constraints of data center construction, the complexities of obtaining permits, and crucially, the challenge of securing sufficient power to run these energy-intensive facilities. This makes the current build-out a "really prolonged" endeavor, distinct from past speculative bubbles.
The need for specialized, powerful hardware like GPUs and TPUs further complicates this picture. These are "short-lived assets," meaning they have a relatively rapid obsolescence cycle due to continuous innovation. They cannot be deployed until the physical data center shell is complete and powered, creating a bottleneck that ensures sustained demand. Fish notes that for "every GPU that comes off of TSMC's foundry... there're ten people in line for it, and we are just getting started at putting these short-lived assets into data centers." This intense, persistent demand, coupled with the real-world constraints on supply and deployment, creates a unique investment landscape where foundational infrastructure remains a critical, high-value component of the AI revolution. The market may take periodic breaths, but the underlying trajectory of infrastructure demand remains robust, driven by the enduring promise of AI.

