New IT infrastructure: how companies are designing AI environments without energy compromises

Artificial intelligence is changing how companies approach IT infrastructure design: it is no longer just computing power that matters, but also energy efficiency and flexibility. Technologies such as liquid cooling play an increasingly important role in reconciling performance with concerns about operating costs and sustainability.

Performance is not enough for IT infrastructures to meet the demands of generative AI. Increasing workloads, higher power consumption and the need for scalability are forcing a new approach to the design of AI-ready environments.

Companies no longer want ‘more power’ – they want more balance: between computing power, energy efficiency, scalability and operational costs. This shifts the burden of decision-making from pure hardware specifications to a systems approach to IT infrastructure.

AI is changing organisations’ priorities

The increase in the demand for computing power by AI systems is changing the way organisations approach data centre planning. “The development of artificial intelligence is one of the key factors redefining IT architecture and approaches to energy efficiency in data centres. Organisations are increasingly looking for solutions that deliver high performance with reduced energy requirements, while enabling scalability and support for advanced workloads. A key element of this transformation is the use of innovative technologies such as Direct Liquid Cooling (DLC), which is playing an increasingly important role in AI-ready architectures,” says Karolina Solecka, Compute Sales Director at Hewlett Packard Enterprise Poland.

Karolina Solecka, HPE

Until recently, maximum efficiency was the main criterion. Today, companies increasingly want efficiency without energy overload. This is due to both rising energy costs and environmental pressures (ESG). Efficiency is no longer an add-on – it is becoming a requirement.

Infrastructure design starts with cooling

AI not only increases energy demand – above all, it generates heat that conventional methods can no longer dissipate. As computing density rises, traditional air cooling is no longer sufficient.

“AI systems require enormous computing power, which generates significant amounts of heat. Traditional cooling methods, based on air exchange, are no longer effective at high computing densities,” explains the HPE expert.

This means that the physical architecture of the infrastructure – from cabinet spacing to air circulation – must be redesigned. Cooling becomes a starting point rather than an addition to the infrastructure.

DLC: key transformation technology

A solution that is gaining in importance is Direct Liquid Cooling (DLC), which removes heat directly from components such as the CPU and GPU via a liquid coolant. Compared to traditional methods, DLC significantly reduces energy consumption and increases the efficiency of computing environments.

“This is why HPE is investing so heavily in the development of liquid cooling technology, which enables direct heat removal from components such as CPUs and GPUs,” Karolina Solecka emphasises.

Energy savings can be as high as 30-40% compared to air cooling. But that is not all – DLC also allows for a more compact data centre design, which is especially important for companies with limited space or planning edge deployments.

“DLC not only increases energy efficiency, reducing energy consumption by up to 30-40%, but also allows for a more compact data centre design.”
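The quoted 30-40% range can be sanity-checked with simple arithmetic. The sketch below is illustrative only: the IT load, the cooling-overhead ratios for air and liquid cooling, and the energy price are assumed figures, not data from HPE; only the resulting percentage is meant to land in the range cited in the article.

```python
# Back-of-the-envelope comparison of annual cooling energy for air
# cooling vs. DLC. All inputs are assumptions made for this sketch.

HOURS_PER_YEAR = 8760

def annual_cooling_kwh(it_load_kw: float, cooling_overhead: float) -> float:
    """Cooling energy, expressed as a fraction of the IT load, over one year."""
    return it_load_kw * cooling_overhead * HOURS_PER_YEAR

it_load_kw = 500                              # assumed AI cluster IT load
air = annual_cooling_kwh(it_load_kw, 0.40)    # assumed ~40% cooling overhead (air)
dlc = annual_cooling_kwh(it_load_kw, 0.25)    # assumed lower overhead with DLC

saving_pct = (air - dlc) / air * 100
price_eur_per_kwh = 0.15                      # assumed energy price

print(f"Air cooling: {air:,.0f} kWh/year")
print(f"DLC:         {dlc:,.0f} kWh/year")
print(f"Saving:      {saving_pct:.1f}% "
      f"(~EUR {(air - dlc) * price_eur_per_kwh:,.0f}/year)")
```

With these assumed overheads the saving works out to 37.5%, i.e. within the 30-40% band the article quotes; real figures depend heavily on climate, facility design and workload.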

New customer questions: how to optimise rather than maximise

The way customers approach the purchase and deployment of IT infrastructure is also changing. The focus is no longer just on ‘maximum power’, but on sustainability, controlling consumption and optimising TCO (total cost of ownership).

“Customers are increasingly driven not only by maximising performance, but also by energy efficiency and sustainability,” notes the HPE representative.

This shift in priorities is linked to increasing pressure for environmental reporting, but also to real operational needs – companies don’t want to pay for energy they can’t control. They want environments that can be monitored, scaled and optimised for actual usage.

The service model supports an energy-efficient approach

In this context, “as-a-service” models are gaining importance – flexible, billed on the basis of actual resource consumption. Such solutions allow customers to avoid oversizing their environment and thus reduce unnecessary energy and cooling consumption.
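The oversizing argument can be made concrete with a toy cost comparison. Everything below is hypothetical: the monthly demand profile and both unit prices are invented for illustration, and no real “as-a-service” pricing (such as HPE GreenLake's) is modelled. The point is only that provisioning fixed capacity for the peak can cost more than paying a premium per unit actually consumed when AI workloads are bursty.

```python
# Toy comparison: fixed provisioning vs. consumption-based billing for a
# bursty AI workload. Demand profile and prices are invented assumptions.

monthly_gpu_hours_demand = [200, 180, 950, 240, 210, 1100,
                            230, 190, 220, 900, 250, 210]  # bursty training spikes

fixed_capacity = max(monthly_gpu_hours_demand)  # fixed estate must cover the peak
fixed_price = 1.0    # assumed cost per provisioned GPU-hour per month
usage_price = 1.5    # assumed premium per GPU-hour actually consumed

fixed_cost = fixed_capacity * fixed_price * len(monthly_gpu_hours_demand)
usage_cost = sum(monthly_gpu_hours_demand) * usage_price

print(f"Fixed provisioning: {fixed_cost:,.0f}")
print(f"Pay per use:        {usage_cost:,.0f}")
```

Under these assumptions the fixed estate sits mostly idle between training spikes, so pay-per-use comes out cheaper despite the higher unit price – which also means less hardware drawing power and needing cooling.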

“Customers are increasingly driven not only by maximising performance, but also by energy efficiency and sustainability. Technologies such as DLC achieve this balance while providing support for advanced AI workloads. Businesses also appreciate the flexibility of ‘as-a-service’ models, such as HPE GreenLake, which allow infrastructure to adapt to changing business needs while minimising operational costs,” says Solecka.

This not only makes the technology more accessible, but also more efficient – both energetically and financially.

AI-ready architecture from the ground up

The development of artificial intelligence requires the whole approach to IT infrastructure to change. It is no longer about adding faster processors to an existing server room. It is about new design principles that start with energy efficiency, include cooling as an integral part of the system and end with an operating model that allows growth without wasting resources.

“The development of AI technologies requires a new approach to IT infrastructure design. At HPE, we believe that technologies such as liquid cooling are the key to efficient and sustainable data centre development to meet the demands of the future,” Karolina Solecka concludes.
