The relentless pursuit of artificial intelligence has inadvertently transformed the humble data center, turning once-commoditized infrastructure into a critical technological bottleneck and a lucrative new frontier for traditional industrial powerhouses. This profound shift, driven largely by Nvidia's accelerated product cycles and the insatiable power demands of AI, is forcing a complete re-architecture of the physical backbone of the digital world.
Andrew Obin, a Senior Industrials Analyst at BofA Securities, recently spoke with CNBC's Fast Money about this underappreciated revolution, highlighting how companies like GE, Honeywell, Eaton, Parker, and Rockwell are now central to the data center boom. His analysis explains why electrical and cooling systems, previously considered afterthoughts, have become highly specialized, high-value components, and where discerning investors should seek opportunities in this evolving infrastructure trade.
Historically, data centers operated on a more conventional scale, typically around 10 megawatts, utilizing standard 12-volt power and off-the-rack HVAC and electrical systems. As Obin explains, "Before, right, the electrical stuff, the cooling stuff, they were basically commoditized." Reliability was paramount, prompting large players to opt for top-tier suppliers, but the underlying technology remained largely undifferentiated. This era treated power and cooling as necessary utilities rather than integral, high-tech components.
Today's landscape is dramatically different. The average data center now clocks in at approximately 100 megawatts, a tenfold increase in power demand. This escalating power density is not merely an incremental change; it necessitates entirely new approaches to design and operation. The cost per megawatt has consequently soared, with Obin noting, "Now an average data center is around 100 megawatts, right? The cost per megawatt is 50 million." This staggering figure underscores the immense capital expenditure now required to build and maintain these hyperscale AI factories.
The catalyst for this transformation is the advanced computing architecture underpinning modern AI, particularly the high-performance GPUs from companies like Nvidia. These powerful chips generate unprecedented levels of heat and require significantly more sophisticated power delivery. Obin emphasizes, "All of a sudden they're starting to do something very different with the rack. So all of a sudden the HVAC system, the cooling system that you had before, is not going to work. All of a sudden the electrical stuff, when they're going to go to the Kyber rack, you will effectively have to rip out all your electrical stuff inside the data center and put in brand new electrical equipment." This isn't an upgrade; it's a wholesale replacement.
This new reality means data centers are shifting from 12-volt systems to higher direct current (DC) architectures, potentially running at 400 or even 800 volts. Such a fundamental change in electrical infrastructure requires specialized switchgear, power distribution units, and uninterruptible power supplies (UPS) that can handle these increased voltages and currents safely and efficiently. Formerly commoditized components are now custom-engineered solutions.
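A quick back-of-the-envelope calculation shows why the voltage jump is unavoidable. The sketch below assumes a hypothetical 100-kilowatt AI rack (the rack power figure is an illustrative assumption, not from the article) and applies Ohm's-law basics: for a fixed power draw, current falls in proportion to voltage, which shrinks conductor sizing and resistive losses.

```python
# Illustrative only: current drawn by a hypothetical 100 kW AI rack
# at different distribution voltages, using I = P / V. Lower current
# means thinner busbars and lower resistive (I^2 * R) heating.
RACK_POWER_W = 100_000  # assumed 100 kW rack; actual designs vary

for volts in (12, 48, 400, 800):
    amps = RACK_POWER_W / volts
    print(f"{volts:>4} V distribution -> {amps:>8,.0f} A per rack")
```

At 12 volts, a rack of that size would need to carry over 8,000 amps; at 800 volts DC, the same power moves at roughly 125 amps, which is why the legacy electrical gear cannot simply be reused.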
Similarly, traditional air cooling is no longer sufficient for the intense heat loads generated by AI-optimized racks. Liquid cooling, including immersion cooling, is becoming essential, leading to a surge in demand for advanced chillers, cooling towers, and sophisticated fluid management systems. These are not minor adjustments but critical engineering feats.
The financial implications are substantial. Out of that $50 million per megawatt data center cost, Obin estimates roughly $1.9 million is allocated to electrical infrastructure and $1.4 million to cooling. If these highly specialized systems fail, the consequences are immediate and catastrophic. "You mess that up," he warns, "40 million dollars worth of chips per megawatt, pretty much burn within 30 seconds." This stark reality elevates the importance of these industrial components from mere utilities to mission-critical technology.
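The cost figures Obin cites can be tallied into a rough per-megawatt budget. In this sketch, only the $50 million total, $1.9 million electrical, $1.4 million cooling, and $40 million chip figures come from the article; the residual "other" bucket (shell, land, networking, and so on) is simply what remains and is a labeling assumption.

```python
# Back-of-the-envelope per-megawatt cost split using the figures
# quoted in the article (all values in millions of USD per MW).
COST_PER_MW = 50.0
chips = 40.0       # compute hardware at risk if power or cooling fails
electrical = 1.9   # switchgear, power distribution, UPS
cooling = 1.4      # chillers, cooling towers, liquid-cooling loops
other = COST_PER_MW - chips - electrical - cooling  # residual (assumed label)

for name, cost in [("chips", chips), ("electrical", electrical),
                   ("cooling", cooling), ("other", other)]:
    share = cost / COST_PER_MW
    print(f"{name:>10}: ${cost:>5.1f}M  ({share:5.1%})")
```

The arithmetic makes Obin's point vivid: electrical and cooling together are under 7 percent of the bill, yet a failure in either can destroy the 80 percent of the budget sitting in chips.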
This shift has created a new class of "winners" among industrial companies. Firms like Eaton, Schneider Electric, and Vertiv, which provide switchgear, power distribution, and UPS systems, are experiencing unprecedented demand for their specialized electrical solutions. SPX Technologies, Daikin Industries, Trane Technologies, and Johnson Controls, leading manufacturers of cooling towers and chillers, are at the forefront of the thermal management revolution. Even generator makers like Caterpillar, Rolls-Royce, and Cummins are seeing increased opportunities as data centers require robust backup power for their expanded operations.
These industrial players are no longer just suppliers; they are integral partners in the AI ecosystem, their innovations directly enabling the next generation of computing. "This has become the technology bottleneck," Obin concludes, "without which Nvidia's plan doesn't work." The AI revolution, while often framed through the lens of chips and software, is equally dependent on the physical infrastructure that powers and cools it. Understanding this interdependence is crucial for anyone navigating the future of technology and investment.

