Energy In, Tokens Out

AI's future is constrained by energy, not computing power. Efficiency gains are not keeping pace with AI's immense energy demands, a classic Jevons paradox. This shifts the focus to energy supply and infrastructure: companies must now evaluate their energy capacity with the same rigor as their digital assets. Renewable energy and battery energy storage systems (BESS) are crucial solutions. BESS help data centers manage dynamic energy demand, reduce grid stress, and accelerate interconnection, making robust, intelligently managed power the key to unlocking AI's next phase.

Jun 9, 2025

AI may run on data, but it depends on energy. And today, energy—not compute—is the constraint shaping the future of AI.

OpenAI CEO Sam Altman has echoed this point, noting that advanced AI will consume vastly more power than expected.

Altman is far from alone: Mark Zuckerberg and Elon Musk have shared similar views. Recently, NVIDIA CEO Jensen Huang stressed that the main bottleneck for AI is now electrical power, warning that the industry has become “power-limited” in its ability to scale.

In other words, the future of AI might best be summarized as “energy in, tokens out.” The outputs of AI – those countless tokens of text, images, and insights – ultimately depend on the electricity we can feed into our “AI factories.” Energy, not chips or data, is emerging as the fundamental constraint on AI’s growth.

AI’s Insatiable Hunger for Energy

The recent explosion in AI capabilities has come with an unprecedented rise in compute demand. Training a single large model can draw an astronomical amount of electricity – for example, training OpenAI’s GPT-3 model is estimated to have used nearly 1,300 MWh of electricity (about what 130 U.S. homes consume in a year). And models keep getting bigger and more complex.
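As a sanity check on that comparison, here is a quick back-of-envelope calculation. The household figure is an assumption: average U.S. residential electricity use is roughly 10,500 kWh per year, in line with EIA-reported averages.

```python
# Back-of-envelope check of the GPT-3 comparison above.
# Assumption: an average U.S. household uses ~10,500 kWh/year
# (EIA-reported averages sit between roughly 10,000 and 11,000 kWh).
gpt3_training_kwh = 1_300 * 1_000        # 1,300 MWh expressed in kWh
avg_home_kwh_per_year = 10_500

home_years = gpt3_training_kwh / avg_home_kwh_per_year
print(f"Equivalent to ~{home_years:.0f} U.S. home-years of electricity")
```

That lands at roughly 120–130 home-years, consistent with the "about 130 homes" figure.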

While engineers continually improve chip efficiency (NVIDIA’s Huang notes the industry-wide push for the most energy-efficient architectures), these gains are struggling to keep pace with AI’s growth. Historically, computing efficiency followed Koomey’s Law, with energy efficiency doubling roughly every 1.5 years. But AI usage is accelerating even faster, a classic case of demand outpacing efficiency.
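The arithmetic behind "demand outpacing efficiency" is just compound doubling at two different rates. In the sketch below, the 1.5-year efficiency doubling follows Koomey's Law as cited above; the demand doubling period is an illustrative assumption, not a sourced figure.

```python
# Compare two compound growth curves over the same horizon.
# Efficiency: doubles every 1.5 years (Koomey's Law, per the text).
# Demand: doubling period is an ASSUMPTION chosen for illustration.
def doublings(years: float, doubling_period_years: float) -> float:
    """Growth multiplier after `years` at the given doubling period."""
    return 2 ** (years / doubling_period_years)

years = 6
efficiency_gain = doublings(years, 1.5)    # 2^4 = 16x more compute per joule
demand_growth = doublings(years, 0.75)     # assumed: demand doubles every 9 months

net_energy_multiplier = demand_growth / efficiency_gain
print(f"Over {years} years: {efficiency_gain:.0f}x efficiency, "
      f"{demand_growth:.0f}x demand -> {net_energy_multiplier:.0f}x more energy")
```

Under those assumptions, a 16x efficiency gain is swamped by a 256x rise in demand, leaving total energy use 16x higher: efficiency alone cannot close the gap.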

Every leap in model size or quality requires orders of magnitude more computation, from cloud data centers down to factory-floor devices. The result is that total energy consumption for AI is climbing sharply despite more efficient chips. In economic terms, the Jevons paradox is in full swing: make computing cheaper or more efficient, and we simply end up using a lot more of it.

[Image: the Jevons paradox. Source: Appropedia]

Even the most advanced algorithms and chips are ultimately useless if they can’t draw enough power from the wall. This is a pivotal shift: after decades of focusing on faster processors and bigger data, the cutting edge of AI now runs up against the physical limits of energy supply and infrastructure.

Turning an Energy Constraint into a Catalyst

The writing on the wall is clear: scaling AI without scaling energy will simply not be possible. Forward-looking companies will treat this as a mandate to innovate in how they produce, consume, and manage electricity.

For commercial and industrial firms, preparing for AI’s rising energy tide means evaluating your energy infrastructure with the same seriousness as your digital infrastructure. Do you have sufficient power capacity for planned expansions in automation and IT? How exposed are you to grid instabilities or peak pricing events? And what technologies – like on-site generation, storage, or advanced control systems – can you deploy today to buffer against these risks?

The encouraging news is that the solutions are already emerging, and many are economically attractive. The cost of renewable energy and batteries has plummeted over the past decade, making options like solar panels paired with storage one of the most promising paths to the “radically cheaper solar plus storage” future Altman envisioned.

Battery energy storage systems (BESS) are proving crucial for data centers seeking greater energy flexibility. By providing on-site power reserves, BESS enable data centers to manage their energy demand more dynamically, shifting consumption away from peak grid hours and mitigating the impact of grid instabilities. This flexibility reduces stress on the electrical grid and can accelerate the interconnection process for new and expanding data centers: utilities are more amenable to connecting loads that can actively participate in demand-side management and reduce their peak impact on local infrastructure.
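The peak-shaving behavior described above can be sketched in a few lines: the battery discharges whenever site load exceeds a grid-import cap and recharges with spare headroom when load falls below it. All numbers here are illustrative assumptions, not data from any real facility.

```python
# Minimal peak-shaving sketch. A battery discharges when site load exceeds
# a grid-import cap and recharges when load is below it. All figures are
# illustrative assumptions, not data from any real facility.
def peak_shave(load_mw, cap_mw, battery_mwh, dt_h=1.0):
    """Return grid import per step, honoring the cap while stored energy lasts."""
    soc = battery_mwh                      # state of charge; start fully charged
    grid = []
    for load in load_mw:
        if load > cap_mw:                  # discharge to shave the peak
            discharge = min(load - cap_mw, soc / dt_h)
            soc -= discharge * dt_h
            grid.append(load - discharge)
        else:                              # recharge with spare grid headroom
            charge = min(cap_mw - load, (battery_mwh - soc) / dt_h)
            soc += charge * dt_h
            grid.append(load + charge)
    return grid

# Hypothetical hourly load profile for a small data center (MW):
profile = [8, 8, 9, 12, 15, 14, 11, 9]
shaved = peak_shave(profile, cap_mw=12, battery_mwh=6)
print(max(profile), max(shaved))           # peak grid import drops from 15 to 12 MW
```

In this toy run the battery holds grid import to the 12 MW cap through the 15 MW peak, which is exactly the "reduce their peak impact" behavior utilities reward at interconnection.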

In a very real sense, watts are becoming just as important as bytes. Ensuring plentiful, reliable, and intelligently managed power is the key to unlocking AI’s next chapter. The companies that recognize energy as the fuel of innovation – and proactively secure and streamline that fuel – will be the ones to fully capitalize on AI’s promise.

Posh Energy