The unprecedented surge in artificial intelligence development is fundamentally reshaping the landscape of global infrastructure, particularly concerning data centers. The prevailing narrative often focuses on the computational prowess of AI models, yet the underlying reality points to a far more terrestrial challenge: power. As Flexential CEO Ryan Mallory articulated on CNBC’s Fast Money, "There is a ton of demand out in the marketplace for this power and network capabilities." This intense demand is not merely theoretical; it is driving a profound re-evaluation of how and where the digital backbone of AI is constructed.
Flexential is a hybrid IT infrastructure provider operating more than 40 data centers across 19 major markets. Speaking with Melissa Lee, Mallory discussed the escalating demand for AI data centers in the U.S. and the industry's strategies for meeting it. His insights highlighted that the bottlenecks impeding AI infrastructure growth are multifaceted, extending beyond simple power generation to encompass the entire energy ecosystem.
A core insight from Mallory's commentary is that the primary constraint for scaling AI data centers is not a lack of power generation itself, but rather the intricate web of power transmission and distribution. While massive capital expenditure is being funneled into AI, the ability to deploy that investment hinges on robust and readily available energy grids. Mallory elaborated, stating, "The bottleneck isn't necessarily right with the power so to speak. We've got a lot of very sophisticated capabilities out in the industry right now looking at power generation, power transmission, and power distribution." This underscores that innovation in energy infrastructure, from smart grids to localized microgrids, is as crucial as advancements in AI chips themselves. The challenge lies in connecting substantial power sources efficiently to where the data centers need to be built, often requiring significant upgrades to existing utility infrastructure.
Another critical insight is the strategic imperative to explore and develop data centers in "non-traditional markets." Historically, data center hubs have congregated in areas like Northern Virginia and California due to favorable connectivity and established infrastructure. However, the sheer scale of AI's power requirements is forcing a geographical diversification. Mallory noted, "You have to look at some of non-traditional markets outside of Northern Virginia or California. There is power out there, and you just have to do your diligence to make sure your teams are executing and finding it." This geographical shift aims to tap into regions with greater power availability and potentially more amenable regulatory environments, distributing the load and fostering new economic centers for digital infrastructure.
The demand for power is so acute that traditional energy sources, specifically natural gas, are experiencing a resurgence in relevance. Despite a global push towards renewables, Mallory affirmed, "Natural gas plants are going to become a critical part of being able to meet and bridge some of these power requirements out there." This pragmatic approach acknowledges the immediate need for reliable, dispatchable power that can scale quickly to meet the colossal energy demands of AI training and inference. While long-term strategies lean towards sustainable sources, the interim reality dictates a reliance on proven, large-scale power generation.
Flexential, with its extensive network and over 330 megawatts of capacity built or under development, is actively navigating this power-intensive landscape. The company observes a significant shift in AI demand from primarily generative and training models, which typically require massive, centralized computing power, towards inference at the edge. This implies a need for smaller, distributed data centers closer to end-users, reducing latency and enabling real-time AI applications. These edge markets, often outside tier-one cities, demand a different infrastructure footprint but still require reliable, localized power solutions.
Concerns regarding water usage in data centers, particularly for cooling, are often overstated, according to Mallory. He addressed this by clarifying, "Water is becoming a misnomer in the data center industry. A lot of the companies are building at a zero WUE or water use efficiency." Modern data center designs and cooling technologies are increasingly sophisticated, minimizing or eliminating reliance on local water supplies. This efficiency in water management helps mitigate environmental impact and addresses a common public relations challenge for data center operators.
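For context on the metric Mallory cites: WUE, formally Water Usage Effectiveness as defined by The Green Grid, divides a facility's annual water consumption (in liters) by its IT equipment energy (in kWh), so a design that draws no water for cooling reports a WUE of zero. The short sketch below illustrates that calculation; the function name and figures are hypothetical examples, not Flexential data.

```python
# Illustrative sketch of the Water Usage Effectiveness (WUE) metric.
# All figures are hypothetical examples, not Flexential data.

def water_usage_effectiveness(annual_water_liters: float, it_energy_kwh: float) -> float:
    """WUE = annual site water consumption (liters) / IT equipment energy (kWh)."""
    return annual_water_liters / it_energy_kwh

# An evaporative-cooled facility consumes water year-round to reject heat.
print(water_usage_effectiveness(1_800_000, 1_000_000))  # 1.8 L/kWh

# A closed-loop or air/liquid-cooled design draws no ongoing water,
# which is what "building at a zero WUE" describes.
print(water_usage_effectiveness(0, 1_000_000))  # 0.0 L/kWh
```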
The exponential growth of AI is creating a persistent gap between data center supply and demand, as illustrated by Goldman Sachs estimates showing U.S. data center power demand, measured in gigawatts, consistently outstripping available supply through 2028. This imbalance creates immense opportunity for companies like Flexential and the broader ecosystem of power generation, transmission, and cooling providers. Strategic alignment with power companies, and the ability to identify and secure sites with adequate energy resources, are becoming critical differentiators in this rapidly expanding sector.