The global technology debate has revolved around one topic for several years: the availability of GPUs. The race for Nvidia’s chips, reminiscent of a modern gold rush, is being portrayed as a key front in the AI revolution.
Companies and countries are bidding for supplies of technological ‘picks’, believing that whoever has more of them will dig up the digital wealth of the future. But what if the real limitation is not the availability of tools, but the lack of land on which to mine?
Recent market analyses, including those from Gartner, shed new light on the matter. They suggest that the bottleneck for the development of artificial intelligence is shifting from silicon factories towards a much more mundane piece of infrastructure – power grids.
The problem is no longer a lack of chips; it is increasingly a lack of power, and of space to plug them all in. The battle for the future of AI is no longer just about teraflops, but increasingly about megawatts.
Hungry like AI – the scale of the energy appetite
To understand the scale of the challenge, you need to look into the engine room of this revolution. The process of training a single large-scale language model (LLM), such as those powering advanced chatbots, requires thousands of specialised processors to work for weeks, even months, without a moment’s pause. These are operations with an unimaginable appetite for energy.
The scale of the phenomenon is best illustrated by comparisons. It is estimated that one complex query to generative AI can consume up to ten times more energy than a simple Google search.
Going further, a modern data centre built specifically for AI workloads needs as much electricity as a city of several thousand inhabitants. And more and more such centres are being built. Market data leaves no illusions – spending on AI servers is on track to nearly double the value of this segment, and overall data centre investment is growing at more than 75% year-on-year.
This surge in demand for computing power is translating into an explosion in energy demand that global grids are simply not ready for.
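To make the scale above tangible, the claims can be turned into a rough back-of-envelope calculation. Every input below – the accelerator count, per-chip power draw, training duration, data-centre overhead and household consumption – is an illustrative assumption chosen for the sketch, not a figure reported in the article.

```python
# Back-of-envelope estimate of the energy used by one large training run.
# All inputs are illustrative assumptions, not measured or reported values.

gpus = 10_000            # assumed number of accelerators ("thousands of processors")
watts_per_gpu = 700      # assumed draw per accelerator in watts
days = 60                # assumed run length ("weeks, even months")
pue = 1.2                # assumed data-centre overhead (power usage effectiveness)

hours = days * 24
energy_kwh = gpus * watts_per_gpu * hours * pue / 1000  # total consumption in kWh

household_kwh_per_year = 3_000  # assumed annual consumption of one household
households = energy_kwh / household_kwh_per_year

print(f"{energy_kwh:,.0f} kWh ≈ annual usage of {households:,.0f} households")
```

Even with these deliberately modest assumptions, a single run lands in the tens of gigawatt-hours – the yearly consumption of a few thousand households, which is exactly the ‘small city’ comparison the article draws for a whole AI data centre running such workloads continuously.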
Consequences here and now: Where to plug in all this intelligence?
This energy dilemma already has very real global consequences. Access to cheap, stable and, increasingly, green energy is becoming a key strategic asset. ‘AI energy hubs’ are emerging on the new technological world map.
Scandinavia, with its abundance of hydropower, or the Middle East, investing billions in solar farms, are becoming magnets for major technology players.
At the same time, rising energy costs are becoming one of the main components of the final price of AI-based services. Rising electricity prices will inevitably translate into more expensive access to advanced models, which could deepen the digital and economic divide in the future.
However, the biggest challenge is the physical infrastructure. As analysts aptly point out, the problem is becoming a lack of ‘space to connect servers’. Building new high-voltage lines or transformer substations is a process that takes years.
This pace is out of step with the exponential growth of the technology industry, which has become accustomed to cycles measured in months. The digital revolution is colliding with the brutal realities of civil engineering and energy.
A race against time – How is the industry trying to ‘cool’ AI’s appetite?
However, the technology world is aware of the growing problem and is working intensively on solutions. This race against time is taking place on several fronts. The first is optimisation. Engineers are switching from traditional air cooling of servers to much more efficient liquid cooling. This allows for denser hardware packing and better heat management, although it does not solve the root problem.
The second front is efficiency. A race is underway to design more energy-efficient chips and accelerators, as exemplified by proprietary units being designed by Google or AWS. In parallel, work is being done to optimise the AI models themselves to achieve similar results with less computing power.
The most futuristic front, yet one taken deadly seriously, is energy itself. Giants such as Microsoft are openly investing in fusion energy research and have declared their intention to use small modular reactors (SMRs) to power their future data centre campuses.
This best demonstrates that for industry leaders, providing gigawatts of power has become a priority equal to creating intelligent algorithms.
The AI revolution is entering a crucial phase in which its pace will be dictated not only by Moore’s Law, but also by the laws of physics and the limitations of the material world. The race for supremacy will be won not only by those with the sharpest algorithms, but by those who can supply them with stable, abundant power.
So before we once again ask what else artificial intelligence can do for us, we need to answer a much more down-to-earth question: where do we find an outlet for it?