OpenAI is reportedly planning to use Google Cloud to meet its growing demand for compute power, a surprising move given that Google is a direct competitor in AI. The report, discussed on CNBC’s Squawk on the Street, highlights the unusual but increasingly common partnerships forming in the high-stakes AI race. As Sarah Eisen noted, it’s a “frenemy” dynamic: competitors partnering out of necessity.
The core driver of this alliance is the insatiable need for computational capacity. Training and running advanced AI models like ChatGPT requires massive infrastructure, and even companies with deep pockets are struggling to keep up. OpenAI, which already relies on Microsoft’s Azure, is now broadening its strategy to avoid dependency on a single provider and to secure access to diverse hardware, such as Google’s TPUs.
For Google, providing compute to OpenAI may seem counterintuitive, especially since ChatGPT competes with Google’s Bard and threatens its core search business. But the economics of cloud computing override those concerns, at least for now. Google Cloud’s priority is monetizing its infrastructure, even if that means powering a rival’s products. This reflects a broader reality in which cloud providers operate more like utilities, prioritizing revenue over rivalry at the infrastructure layer.
This partnership also signals a deeper shift in how competition works in AI. It’s no longer just about winning with better products; it’s about securing the infrastructure to build them. The result is a strange coexistence: companies competing at the application layer while collaborating at the infrastructure layer. It underscores the complexity of the current AI ecosystem, where pragmatic alliances matter as much as innovation.
For startups and VCs, this is both encouraging and cautionary. It confirms the growing demand for AI capabilities, but it also exposes the enormous costs and compute requirements needed to stay competitive. Only well-funded players with access to top-tier infrastructure can compete at the frontier. The future of AI isn’t just about building smarter models; it’s about winning the compute arms race.
