The AI Infrastructure Buildout is Still Early and Investors Are Stuck on Training


“I think that this can be one of those rare cases when both companies can win. Both companies can have massive advertising businesses within AI, adjacent to commerce.” This assertion by Gene Munster, Managing Partner at Deepwater Asset Management, cuts against the prevailing zero-sum narrative dominating the AI investment landscape, particularly concerning the rivalry between Alphabet (Google) and OpenAI. The conventional wisdom often pits the established search behemoth against the disruptive startup, but Munster argues that the revolution is expanding the total addressable market so significantly that coexistence and mutual growth are highly probable outcomes.

Munster, speaking on CNBC's Closing Bell, provided a sharp analysis of the current state of the AI trade, highlighting not only the competitive dynamics in consumer-facing applications like advertising but also the fundamental misconceptions investors hold regarding the backend infrastructure necessary to power the AI economy. His commentary serves as a crucial check for founders and VCs focused on where the true long-term value creation resides.

The idea that both Google and OpenAI can thrive in the advertising space stems from an observed shift in consumer behavior. As large language models become integrated into daily workflows, consumers are increasingly seeking information through conversational interfaces—whether via a traditional search bar enhanced by AI or a dedicated chatbot. This expansion of information seeking, Munster posits, means the total time spent searching for information is increasing, offering new real estate for monetization. The fact that the interviewer used an LLM to select jewelry for her outfit highlights the immediacy and breadth of this application shift, demonstrating that AI is moving beyond purely technical tasks into consumer commerce and lifestyle curation—areas ripe for contextual advertising.

This broadening application layer relies on an infrastructure buildout that is still in its nascent stages. According to Munster, the AI trade has successfully navigated the first two checkpoints of the year—positive guidance from TSMC driven by accelerating AI demand, and encouraging commentary from Nvidia at CES. The next critical hurdle, expected in late January and early February, will be the capital expenditure (CapEx) guidance from the hyperscalers (Google, Microsoft, Amazon). Expectations for hyperscaler CapEx growth have already inched up from 30% to 34%, and Munster anticipates they may hit 40% or more, indicating a deep commitment to the necessary hardware foundation.

Yet, a deeper, more structural misconception persists among investors regarding this infrastructure buildout. The current narrative often centers on training—the expensive, computationally intensive process of building and refining the large language models themselves. Munster argues that this focus misses the forest for the trees, noting that investors are “stuck in the mentality that AI is about training.”

The long-term economic reality, driven by the sheer volume of daily usage, points overwhelmingly toward inference. Inference is the process of running trained models to generate results—the actual "thinking" and application of intelligence by the AI. Munster cites Deepwater's internal research and industry data to underscore the massive scale difference: "That inference market last year in 2025 was five times as big as the training market." Looking forward, he expects this disparity to widen dramatically.

The intelligence of AI is not in the training, but in the application. As millions of users, businesses, and specialized agents begin to rely on these models for daily tasks, the computational cycles dedicated to inference will dwarf those dedicated to training. Munster emphasizes the scale of this future demand, stating that "Inference will be hundreds of thousands of times bigger than training; intelligence is in thinking, not in the training." This disparity suggests that companies focused on optimizing inference—the real-time deployment and scaling of AI—are positioned for explosive growth far beyond those merely focused on model creation.

This infrastructure dynamic has immediate implications for market structure. While the "Mag 7" (the largest technology companies) have dominated the initial AI run-up, the sheer scale of the required infrastructure investment, combined with the broadening need for specialized inference chips and services, suggests that the AI opportunity will trickle down. Munster maintains his prediction that small-cap companies focused on the infrastructure and application layers will begin to outperform their large-cap counterparts, driven by strong earnings revisions as the inference market accelerates. The market strength seen recently in companies like AMD, following positive supply chain news, is an early indicator of this broadening participation in the infrastructure supply chain.

For founders and VCs, the takeaway is clear: the AI gold rush is shifting from digging for gold (training) to selling shovels and processing tools (inference). The next phase of massive value creation will be found in the companies that enable the affordable, scalable, and ubiquitous deployment of AI intelligence across every facet of technology and commerce, confirming that the AI infrastructure buildout is far from complete.