OpenAI CFO: 'Vertical Wall of Demand' for AI Compute

OpenAI CFO Sarah Friar describes an overwhelming 'vertical wall of demand' for AI compute, citing it as a key constraint on the company's growth.

Image: Bloomberg Tech segment with host and guest discussing OpenAI's demand for AI compute. Credit: Bloomberg Technology.

In a recent Bloomberg Tech segment, OpenAI's Chief Financial Officer, Sarah Friar, offered a stark assessment of the company's current operational climate: an overwhelming 'vertical wall of demand' for artificial intelligence compute. Friar, a seasoned financial executive with prior roles at companies like Block, Inc. (formerly Square) and Snowflake, brings a deep understanding of scaling high-growth technology ventures. Her insights into OpenAI's trajectory underscore the immense pressure and opportunity facing leading AI development firms.

Friar's perspective comes at a pivotal moment for OpenAI, a company that has rapidly ascended to the forefront of generative AI with products like ChatGPT and DALL-E. The rapid adoption of these tools has created unprecedented demand for the computational resources required to train and operate them. This demand, Friar suggests, is not merely a surge but a continuous, steep incline.

The full discussion can be found on Bloomberg Technology's YouTube channel.

OpenAI CFO Sees 'Vertical Wall of Demand' for Products — Bloomberg Technology

Sarah Friar: Driving Growth at AI's Frontier

Sarah Friar's tenure as CFO at OpenAI positions her as a key figure in managing the financial and operational complexities of one of the world's most influential AI research labs. Her background includes significant experience in scaling technology companies, navigating the challenges of rapid growth, and setting ambitious financial targets. At OpenAI, her role is critical in translating ambitious AI development goals into tangible business outcomes, ensuring the company has the resources to innovate while meeting market demand.

'Stretch Goals' and a 'Vertical Wall of Demand'

Friar revealed that OpenAI sets extremely high internal benchmarks. "Every company I've ever been inside of...always has stretch goals," she stated, adding, "And if you don't have those stretch goals, I feel like, actually, you're not doing your job as a CFO." This philosophy extends to the company's financial objectives: OpenAI consistently aims beyond readily achievable targets to drive maximum performance, even while acknowledging that external factors can constrain those ambitions.

The core of her message, however, revolves around the immense, almost overwhelming, demand for AI compute. Friar described this as a "vertical wall of demand right now" — imagery suggesting demand is rising so steeply that it appears nearly insurmountable, creating a significant challenge for scaling operations. She elaborated that when OpenAI falls short of its internal revenue targets, the constraint is typically not demand but supply of compute. "If we're in places where we're not hitting like targets, at the moment, I would actually say it's lack of compute that often is the thing that's slowing us down to some degree."

Compute as the Bottleneck

Friar's comments highlight a critical constraint in the AI industry: the availability of powerful computing infrastructure. Training large language models and other advanced AI systems requires vast amounts of processing power, typically provided by specialized hardware like GPUs. Companies like Nvidia (NASDAQ:NVDA), a primary supplier of these chips, have seen their demand skyrocket. Friar's statement suggests that despite OpenAI's efforts to secure these resources, the sheer scale of demand from both internal development and external customer use cases is outstripping supply.

The implications of this compute bottleneck are far-reaching. For OpenAI, it means that the pace of developing new models and scaling existing services like ChatGPT could be directly tethered to their ability to acquire more compute. This could influence their strategic partnerships, their investment in hardware, and their approach to model optimization. For the broader AI sector, it signals a potential constraint on innovation and deployment, driving competition for limited computational resources.

Strategic Shifts and Future Outlook

The discussion also touched upon recent strategic adjustments, including reports of OpenAI scaling back plans in some regions due to compute constraints. Friar confirmed that while the specific details of compute deals may evolve, the underlying need for more processing power remains constant. The company is focused on serving its latest models, such as those powering its image generation tools like DALL-E, which also demand substantial computational resources.

Friar's perspective emphasizes a business strategy that is not just about building the best AI models but also about securing the necessary infrastructure to deploy them effectively and meet market appetite. The 'vertical wall of demand' is a powerful metaphor for the current state of AI, where technological capability is rapidly outpacing the physical and computational resources required to realize its full potential. This situation presents both a significant challenge and a clear signal of the immense commercial value and latent demand within the AI space.
