AI's relentless march forward promises transformative capabilities, but it casts a growing shadow: a prodigious appetite for energy and water. This hidden cost, easily overlooked amid the excitement around innovation, is rapidly becoming a critical concern for infrastructure, sustainability, and the future of the digital economy.
Wall Street Journal Senior Personal Tech Columnist Joanna Stern embarked on an illuminating journey to "Data Center Valley" in Ashburn, Virginia, to demystify the immense energy footprint of artificial intelligence. Her investigation, punctuated by a relatable, if unusual, analogy involving an electric grill and steaks, highlighted the surprising scale of power required to fuel our AI interactions.
By 2028, AI data centers are projected to consume up to 12% of all U.S. electricity, enough to power more than 55 million homes for a year. Every AI prompt, from a simple text query to a complex video generation, is routed to these colossal facilities, where powerful models run a process known as "inference" to interpret the request and generate a response.
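For readers who want to see how the homes-equivalent figure could be reached, the quick calculation below reproduces it; the total U.S. consumption (~4,800 TWh by 2028) and the per-household usage (~10,500 kWh per year) are illustrative assumptions, not figures cited in the piece.

```python
# Rough sanity check of the "12% of U.S. electricity ~= 55 million homes" projection.
# Assumptions (not from the article): ~4,800 TWh of total U.S. electricity use in 2028
# and ~10,500 kWh consumed per average U.S. household per year.

US_ELECTRICITY_TWH_2028 = 4_800      # assumed total U.S. consumption, TWh/year
AVG_HOME_KWH_PER_YEAR = 10_500       # assumed average household usage, kWh/year
AI_SHARE = 0.12                      # article's projected AI data-center share

ai_twh = AI_SHARE * US_ELECTRICITY_TWH_2028             # TWh used by AI data centers
homes_powered = ai_twh * 1e9 / AVG_HOME_KWH_PER_YEAR    # 1 TWh = 1e9 kWh

print(f"AI data centers: ~{ai_twh:.0f} TWh/year")
print(f"Equivalent homes powered for a year: ~{homes_powered / 1e6:.0f} million")
# -> ~576 TWh/year, roughly 55 million homes
```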
Despite the escalating demand, major generative AI companies like Google, Microsoft, and Meta remain tight-lipped about their specific energy consumption figures. Researchers, however, are working to establish benchmarks. Sasha Luccioni, AI and Climate Lead at Hugging Face, emphasized the need "to use a standardized methodology... and compare different models in terms of energy usage."
Stern’s grill-based demonstration offered a tangible sense of individual AI impact. Generating a single image, for instance, required approximately 1.7 watt-hours (Wh) of energy, roughly equivalent to four seconds of grilling a thin steak. A short video, however, could demand anywhere from 20 to 110 Wh. Stern estimated that producing her AI short film, built from about 1,000 video clips, consumed roughly 110,000 Wh (110 kWh). That staggering amount equates to grilling around 478 steaks or powering an average U.S. home for three and a half days. Sam Altman, CEO of OpenAI, has publicly stated that an "average query [on ChatGPT] uses about 0.34 watt-hours."
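The rough math behind those equivalences can be reconstructed as follows; the grill's power draw (~1.5 kW) and the average home's daily usage (~30 kWh) are assumptions made for illustration, not figures stated in the piece.

```python
# Back-of-the-envelope math behind Stern's grill comparison.
# Assumptions (not from the article): the electric grill draws ~1.5 kW and an
# average U.S. home uses ~30 kWh per day; per-steak grill time is inferred
# from the article's own totals.

IMAGE_WH = 1.7            # energy per AI-generated image, Wh (from the article)
CLIP_WH = 110             # upper-bound energy per short video clip, Wh
NUM_CLIPS = 1_000         # clips Stern generated for her short film
STEAKS = 478              # the article's steak equivalent for the whole film
GRILL_KW = 1.5            # assumed grill power draw, kW
HOME_KWH_PER_DAY = 30     # assumed average U.S. home usage, kWh/day

total_wh = CLIP_WH * NUM_CLIPS                                  # ~110,000 Wh = 110 kWh
grill_seconds_per_image = IMAGE_WH / (GRILL_KW * 1000) * 3600   # seconds of grilling
home_days = (total_wh / 1000) / HOME_KWH_PER_DAY                # days of household power
minutes_per_steak = (total_wh / STEAKS) / (GRILL_KW * 1000) * 60

print(f"One image ~= {grill_seconds_per_image:.1f} s of grilling")
print(f"1,000 clips ~= {total_wh / 1000:.0f} kWh, or ~{home_days:.1f} days of home power")
print(f"That implies ~{minutes_per_steak:.0f} minutes of grill time per steak")
# -> ~4 s per image, ~110 kWh total (the article rounds the home equivalent to
#    about three and a half days), and roughly 9 minutes of grilling per steak.
```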
The core of this energy intensity lies in Graphics Processing Units (GPUs), particularly NVIDIA's high-end H100s. The servers built around these chips can cost upwards of $350,000, and the chips are so powerful that they throw off immense heat. Keeping them cool is a monumental task, often involving sophisticated air and liquid cooling systems that themselves consume significant energy and water. While some data centers, like the Equinix facility Stern visited, employ closed-loop water systems to minimize waste, others rely on methods that lose considerable water to evaporation or discharge. The relentless demand for cooling is a substantial part of the operational cost.
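To get a feel for how cooling inflates the bill, here is a minimal sketch built on the industry's standard PUE (Power Usage Effectiveness) and WUE (Water Usage Effectiveness) metrics; the specific values used are illustrative assumptions, as the article does not cite any.

```python
# Illustrative cooling-overhead estimate using standard data-center metrics.
# PUE = total facility energy / IT equipment energy (overhead such as cooling).
# WUE = liters of water consumed per kWh of IT energy (mostly evaporative cooling).
# The values below are assumptions for illustration, not figures from the article.

def facility_footprint(it_kwh: float, pue: float = 1.3, wue_l_per_kwh: float = 1.0):
    """Return (total kWh drawn from the grid, liters of water consumed)."""
    total_kwh = it_kwh * pue                 # cooling and overhead inflate IT energy
    water_liters = it_kwh * wue_l_per_kwh    # evaporative cooling consumes water
    return total_kwh, water_liters

# Example: the ~110 kWh of IT energy behind Stern's 1,000 video clips.
total, water = facility_footprint(110, pue=1.3, wue_l_per_kwh=1.0)
print(f"Facility energy: ~{total:.0f} kWh (including cooling overhead)")
print(f"Water consumed:  ~{water:.0f} liters (evaporative cooling)")
# A closed-loop system like the one Stern visited would consume far less water.
```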
NVIDIA asserts that its newer chips offer a "30 times improvement in energy efficiency" over older models, signaling a positive trajectory. Yet the sheer scale of AI adoption means overall energy consumption will continue its upward trend, and the hardware leaves little margin for error when it comes to heat. As Chris Kimm, SVP Customer Success at Equinix, noted, "Most equipment can run comfortably up to about 90-92 degrees Fahrenheit. Beyond that, it'll power itself off or literally break." The industry thus faces an imperative to balance innovation with environmental stewardship, demanding greater transparency and efficiency from its most powerful tools. The ultimate question remains: is the computational cost of every AI interaction, from trivial video generations to life-saving drug discoveries, truly justified? As Sasha Luccioni aptly suggested, "If people saw how much energy was being used for each silly cat video that they generated, maybe they’d think twice."

