"I do not believe we're in an AI bubble today," declared Gavin Baker, Managing Partner and CIO of Atreides Management, setting a provocative tone for his discussion with David George, General Partner at a16z. This assertion, delivered at a16z's Runtime event, anchored a sharp analysis of the current AI boom, differentiating it starkly from past tech exuberance and illuminating the fierce competitive dynamics shaping its future. The conversation delved into the macro economics of AI, the trillion-dollar infrastructure buildout, and the profound implications for founders, investors, and the global economy.
David George opened with startling statistics that, on the surface, might suggest a bubble: a projected $3-4 trillion in data center capacity to be added in the US over the next five years, dwarfing the 40-year investment in the US Interstate Highway System. OpenAI alone has secured commitments amounting to a trillion dollars in deals. Yet, this infrastructure expansion is met with an equally staggering demand, evidenced by Google's reported 150x increase in tokens processed over just 17 months.
Baker quickly drew a crucial distinction from the dot-com era. The year 2000 was defined by "dark fiber": a staggering 97% of laid fiber optic cable sat unused, a monument to speculative overbuilding. Today, however, there are "no dark GPUs." This simple phrase encapsulates a fundamental difference: the current surge in AI infrastructure is not speculative; it is being consumed as fast as it can be produced. GPUs are literally "melting" from intense usage, a testament to the insatiable demand for AI compute. So far, return on invested capital for the largest public companies spending heavily on GPUs has risen by 10 percentage points. This time, the investment is being actively utilized, yielding tangible returns.
The battle for AI dominance, according to Baker, is an existential fight for Big Tech, centered on controlling the full stack from infrastructure to applications. He posited that Nvidia's most formidable competitor isn't another chipmaker like AMD or Broadcom, but Google. Google's proprietary TPU chip, coupled with its internal AI labs like DeepMind and Gemini, gives it an unparalleled strategic advantage. "Google owns the TPU chip," Baker stressed, highlighting how this vertical integration allows Google to optimize its AI stack, driving innovation and efficiency. This control fosters a powerful data flywheel: more users generate more data, which improves models, which attracts more users—a virtuous cycle that solidifies competitive advantage. For companies like Google and Meta, winning this race is not merely about market share; it is about survival.
While the infrastructure layer is experiencing unprecedented growth and intense competition, the application layer of AI is still in its nascent stages. Baker likened the current moment to the early internet, when Netscape Navigator was a groundbreaking browser but Google and Facebook were yet to be conceived. "It's just very early," he stated, reminding the audience that the true killer applications and dominant players have likely not yet emerged. This early stage also suggests a fundamental shift in business models. Unlike traditional SaaS, which boasts 80-90% gross margins, AI's compute-intensive nature will lead to structurally lower gross margins. However, Baker argued, this doesn't preclude AI companies from being great businesses. He drew parallels to the cloud transition, in which companies shifted from high-margin on-premise software to lower-margin cloud services and ultimately thrived. For application SaaS companies, embracing lower gross margins as a "mark of success" – an indication of actual AI usage – will be crucial. The future will increasingly pivot to outcome-based pricing, a model in which value is measured by results rather than access or usage, and inefficiencies in current advertising models will be squeezed out.

