As artificial intelligence becomes a cornerstone of the digital economy and data centres consume ever more energy, the IT industry is beginning to rethink the foundations of its efficiency. One of the most surprising directions for that transformation may be… deep cold. Literally.
A team of researchers from Forschungszentrum Jülich, RWTH Aachen, EPFL, TSMC and several Japanese universities suggests that CMOS chips – the decades-old technology behind virtually all electronics – can operate far more efficiently at very low temperatures. With the right materials and architectures, the researchers argue, energy savings of up to 80% are possible.
Futuristic as it sounds, the approach answers an increasingly pressing need: a dramatic improvement in the energy efficiency of data centres.
From heat to cold – a paradigm shift
Modern chips are designed to operate at room temperature. They generate significant amounts of heat, which must be dissipated efficiently – a task that itself consumes a great deal of energy. According to the International Energy Agency (IEA), data centre electricity demand could double by 2030, driven mainly by the growth of AI and the cloud.
But what if, instead of managing heat, we simply avoided generating it? Cryogenic computing – computing carried out at temperatures approaching absolute zero – can enable a significant reduction in the voltage needed to switch transistors. And lower voltage means less loss, less heat and more efficiency.
In theory, the lower the temperature, the more ‘disciplined’ electron behaviour becomes and the lower the threshold voltage needed for switching. In practice, the researchers report savings of around 70% at -196°C (77 K, the temperature of liquid nitrogen). In the extreme case of helium cooling down to 4 K, the energy gains reach 80%.
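Two textbook relations give a feel for where those percentages come from: the ideal subthreshold swing of a transistor (how sharply it turns on) scales linearly with absolute temperature, and the dynamic switching energy scales with the square of the supply voltage. The Python sketch below illustrates both effects with assumed values – a back-of-envelope illustration, not a reproduction of the team’s models.

```python
# Back-of-envelope sketch of two textbook effects (not the researchers'
# model): ideal subthreshold swing SS = ln(10) * k*T/q, and dynamic
# switching energy E = C * V^2. All numeric inputs are assumptions.
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # elementary charge, C

def subthreshold_swing_mv_per_decade(temp_k: float) -> float:
    """Ideal subthreshold swing at temperature temp_k, in mV/decade."""
    return math.log(10) * K_B * temp_k / Q_E * 1000

def switching_energy_fj(capacitance_ff: float, vdd_v: float) -> float:
    """Dynamic energy per switching event, E = C * V^2, in femtojoules."""
    return capacitance_ff * vdd_v ** 2  # fF * V^2 = fJ

# How sharply a transistor can switch at each temperature:
for t in (300, 77, 4):
    print(f"{t:>3} K: ideal SS = {subthreshold_swing_mv_per_decade(t):6.2f} mV/dec")
# In real cryogenic CMOS, band-tail effects stop the swing from falling
# all the way to the ideal value - one reason new materials are needed.

# Sharper switching allows a lower supply voltage; with an assumed 1 fF
# load, halving V_dd from 0.8 V to 0.4 V cuts switching energy by 75%.
for vdd in (0.8, 0.4):
    print(f"V_dd = {vdd} V: E = {switching_energy_fj(1.0, vdd):.2f} fJ")
```

The quadratic dependence of energy on voltage is the key: even a modest drop in supply voltage, made possible by sharper switching in the cold, compounds into the headline 70–80% savings.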
Physical limitations and material revolution
The snag is that traditional CMOS chips are not optimised to operate at such low temperatures. Physical phenomena that are negligible under ‘warm’ conditions come to dominate in the cold: band-tail effects, material defects and quantum phenomena such as electron tunnelling.
To achieve the claimed efficiency, it is therefore necessary to change not only the operating conditions of the chips but their very design. The research team points in specific directions: nanowires, silicon-on-insulator (SOI) structures, high-k dielectrics (materials with high electrical permittivity) and narrow-bandgap semiconductors. All with the aim of creating what the researchers call a ‘super-transistor for the cold’.
This is no longer just an architectural adjustment. It is the potential opening of a new chapter in electronics design – adapted not to the engineer’s desk, but to the inside of the cryostat.
From quantum computers to hyperscale
Although cryogenic computing is mainly associated with quantum computers, its potential applications are much broader. Chips optimised for low-temperature operation could find uses in medical imaging, space missions and, most importantly, classic data centres. Where thousands of processors run and megawatts of power are consumed, every percentage point of energy saved translates into millions of dollars a year.
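A rough calculation shows the scale involved. The sketch below uses entirely assumed figures – facility size, electricity price and the 70% savings number quoted earlier – and deliberately sets aside the energy cost of the cryogenic cooling itself, which is the central engineering trade-off.

```python
# Illustrative estimate only - all inputs are assumptions, not figures
# from the study.
facility_power_mw = 20.0        # assumed IT load of a mid-size data centre
electricity_usd_per_kwh = 0.10  # assumed industrial electricity price
energy_saving_fraction = 0.70   # the 70% figure reported at 77 K
hours_per_year = 24 * 365

baseline_cost = facility_power_mw * 1_000 * hours_per_year * electricity_usd_per_kwh
savings = baseline_cost * energy_saving_fraction
print(f"Baseline energy bill: ${baseline_cost / 1e6:.1f}M per year")
print(f"Potential savings:    ${savings / 1e6:.1f}M per year")
# Caveat: liquid-nitrogen or helium cooling consumes energy too; the net
# gain depends on whether the chip-level savings outweigh that overhead.
```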
At the same time, it is worth emphasising that the technology is no longer confined to a purely academic phase. TSMC – the world’s largest contract chip manufacturer, supplying Apple, AMD and Nvidia, among others – is involved in the study. That signals the topic is no longer a scientific curiosity but a viable direction for the semiconductor industry.
What does this mean for the IT market?
Cold chips won’t hit the market tomorrow, but in the long term they could prove to be the answer to several simultaneous challenges: the limits of Moore’s Law, the energy demands of AI and the rising operating costs of data centres. Implementing them will require not only a materials revolution but also a rethink of IT infrastructure design, from cooling systems to the integration of low-temperature electronics.
It is a direction worth watching closely, however: if demand for computing power keeps growing at its current rate, the IT sector will have to reach for every available tool to reduce its energy intensity – even if that means descending to temperatures hitherto known mainly from quantum physics laboratories.