The New Era of AI and its Impact on Data Centres

With Data Centres Serving as the Critical Infrastructure Supporting the AI Ecosystem, Innovative Solutions Are Needed to Tackle Sustainability Challenges

With digital transformation gaining momentum across sectors and power-intensive AI applications proliferating, global demand for data services is rising rapidly.

The International Energy Agency estimates that data centres account for around 1% of global electricity demand. According to McKinsey, data centre power demand in the US is expected to reach 35 gigawatts by 2030, up from 17 gigawatts in 2022.
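As a rough sense-check of what that trajectory implies, the short calculation below (our own illustration, using only the McKinsey figures cited above) shows that growth from 17 GW in 2022 to 35 GW by 2030 corresponds to a compound annual growth rate of roughly 9-10%.

```python
# Rough sense-check of the McKinsey figures cited above:
# growth from 17 GW (2022) to 35 GW (2030) implies roughly a
# 9-10% compound annual growth rate in data centre power demand.
start_gw, end_gw = 17.0, 35.0
years = 2030 - 2022

cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # ~9.4%
```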

As explained by Marc Garner, SVP Secure Power Europe at Schneider Electric, AI has emerged as a transformative force, changing the way we process, analyse, and utilise data. 

“With the AI market projected to reach a staggering US$407bn by 2027, this technology continues to revolutionise numerous industries, with an expected annual growth rate of 37.3% between 2023 and 2030,” he tells us. 

“The AI market has the potential to grow even more, thanks to the boom in generative AI (Gen AI). 97% of business owners believe that ChatGPT will benefit their organisations, through uses such as streamlining communications, generating website copy, or translating information. But the surge in adoption will undoubtedly require greater investment in infrastructure for AI-powered solutions than ever before.” 

Accommodating the demands of this new AI-powered world brings with it challenges. 

“Data centres serve as the critical infrastructure supporting the AI ecosystem,” Garner says. “Although AI requires large amounts of power, AI-driven data analytics can help bring data centres closer to net zero and play a positive role in tackling the sustainability challenge.”

Here, Garner explores how the key attributes and trends of AI underpin physical infrastructure challenges for data centres in four areas: power, racks, cooling, and software management.

How to tackle increasingly power-hungry AI applications

As Garner explains, power, cooling, racks and physical infrastructure are core to a data centre’s success. 

“Storing and processing data to train machine learning (ML) models and large language models (LLMs) is steadily driving up energy consumption,” he says. “For instance, researchers estimate that training GPT-3 consumed 1,287 megawatt hours of electricity and generated 552 tons of CO2, the equivalent of 123 gasoline-powered passenger vehicles driven for one year. What’s more, data centres are adopting high-density racks that can accommodate a larger number of servers in a smaller space, further driving up power requirements.

“So how do we meet these increased power demands of AI, whilst minimising its impact on the planet? Data centres are continually evolving to accommodate the increased power demands of AI clusters. Improving power distribution systems and energy efficiencies within data centres helps to minimise losses and ensures that power is delivered to servers in the most efficient way possible. As operators design and manage data centres, they must focus on energy-efficient hardware and software, while diversifying power sources to provide the secure and plentiful power AI needs to thrive. 

“Additions such as advanced power distribution units (PDUs), intelligent management and high-efficiency power systems, alongside renewable energy sources, enable data centres to reduce both energy costs and carbon emissions. However, the extreme rack power densities of AI training servers create issues beyond power consumption; cooling, for example, poses its own complex challenges for operators.”
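The GPT-3 figures Garner quotes can be sanity-checked with some simple arithmetic. The snippet below is our own illustration; the per-vehicle figure of roughly 4.6 metric tons of CO2 a year is a widely used US EPA estimate and is an assumption, not a number from the article.

```python
# Quick check of the GPT-3 training figures quoted above
# (1,287 MWh of electricity and 552 tons of CO2), using the widely
# cited US EPA estimate of ~4.6 metric tons of CO2 per typical
# passenger vehicle per year (an assumption, not from the article).
energy_mwh = 1_287
emissions_t = 552
car_t_per_year = 4.6

carbon_intensity = emissions_t * 1_000 / (energy_mwh * 1_000)   # kg CO2 per kWh
equivalent_cars = emissions_t / car_t_per_year

print(f"Implied carbon intensity: {carbon_intensity:.2f} kg CO2 per kWh")
print(f"Equivalent passenger vehicles: {equivalent_cars:.0f}")   # ~120, close to the 123 quoted
```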

The transition from air cooling to liquid cooling is a must to increase sustainability

Today, sustainable and resilient data centre design hinges on effective cooling. The demands that AI places on data centres mean that powering high-density servers requires new cooling methodologies to deliver optimal performance and minimise downtime, Garner says. 

“Although air cooling is commonplace in the industry and will still exist for years to come, a transition from air cooling to liquid cooling will become the preferred and necessary solution for data centres to cope with AI clusters effectively. This is due to traditional air-cooling systems becoming less efficient for high-density setups. 

“Here Direct-to-Chip liquid cooling, where a cooling fluid is circulated through the servers to absorb and dissipate heat, is quickly gaining popularity as a more effective way to handle the concentrated heat generated by AI clusters,” he adds.

“Compared with air cooling, liquid cooling provides many benefits for data centres. From improved processor reliability and performance, to space savings from higher rack densities, to greater thermal inertia from water in the piping, liquid cooling increases energy efficiency, improves power utilisation, and reduces water usage.”
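To give a sense of the scale involved, the sketch below estimates the coolant flow needed to carry heat away from a single high-density rack, using the basic relationship between heat load, flow rate, specific heat and temperature rise. The 80 kW rack power and 10°C coolant temperature rise are illustrative assumptions, not figures from the article.

```python
# Illustrative sizing for direct-to-chip liquid cooling.
# The rack power (80 kW) and coolant temperature rise (10 °C) are
# assumed example figures, not values from the article.
rack_power_w = 80_000          # heat to remove, in watts
delta_t_c = 10.0               # coolant temperature rise across the loop, °C
cp_water = 4_186.0             # specific heat of water, J/(kg·°C)
density_water = 1.0            # kg per litre (approximate)

mass_flow_kg_s = rack_power_w / (cp_water * delta_t_c)        # ~1.9 kg/s
volume_flow_l_min = mass_flow_kg_s / density_water * 60       # ~115 L/min
print(f"Required coolant flow: {mass_flow_kg_s:.2f} kg/s (~{volume_flow_l_min:.0f} L/min)")
```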

Turning the technology on itself

Another way for data centre leaders to cope with the increasing demands of AI is to use the technology to their advantage. 

“Data centres can benefit from using AI-powered automation, data analytics and machine learning to find opportunities for efficiency gains and decarbonisation,” Garner asserts. “By using data insights more efficiently, we can drive new, more sustainable behaviours.

“This process is powered by physical infrastructure and software tools that support data centre design and operation, including data centre infrastructure management (DCIM), electrical power management systems (EPMS), building management systems (BMS) and digital twins. These applications reduce the risk of unexpected behaviour in complex electrical networks and provide a digital replica of the data centre, helping operators identify constrained power and cooling resources and make better-informed layout decisions.”

For instance, Equinix improved the energy efficiency of its data centre by 9% using AI-based cooling, which regulated cooling systems more effectively and reduced their energy consumption.
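As a minimal sketch of the kind of analytics Garner describes, and not a representation of Equinix's or Schneider Electric's actual tooling, the example below fits a simple baseline for cooling power from IT load and outdoor temperature, then flags hours where measured cooling draw runs well above it. All data and variable names are invented for illustration.

```python
import numpy as np

# Minimal sketch of AI-assisted efficiency analysis (illustrative only).
# Fit a simple linear baseline for cooling power from IT load and outdoor
# temperature, then flag hours where measured cooling draw exceeds it.
rng = np.random.default_rng(0)
hours = 24 * 7
it_load_kw = rng.uniform(800, 1_200, hours)           # hourly IT load
outdoor_temp_c = rng.uniform(5, 30, hours)            # outdoor temperature
cooling_kw = 0.3 * it_load_kw + 8 * outdoor_temp_c + rng.normal(0, 20, hours)
cooling_kw[50:60] += 150                               # simulate an inefficient period

# Least-squares baseline: cooling ≈ a*IT_load + b*temperature + c
X = np.column_stack([it_load_kw, outdoor_temp_c, np.ones(hours)])
coeffs, *_ = np.linalg.lstsq(X, cooling_kw, rcond=None)
baseline = X @ coeffs

residual = cooling_kw - baseline
flagged = np.where(residual > 2 * residual.std())[0]
print(f"Hours flagged for excess cooling energy: {flagged.tolist()}")
```

In practice the telemetry would come from DCIM or EPMS systems and the model would be far more capable, but the workflow is the same: learn expected consumption, then surface deviations that point to efficiency gains.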

Achieving more computing power within the same physical footprint

As Garner concludes, what’s clear is that AI applications are escalating power consumption in data centres at a time when they need to become more sustainable. Yet AI also provides the intelligence to design and operate data centres in a smarter, more energy-efficient way and, if deployed correctly, can help the planet’s journey toward net zero.

“By combining the key attributes of data centre physical infrastructure with the efficiency gains of AI, owners, operators and end-users can more effectively manage the power demands of high-density AI clusters while maintaining efficiency, reliability and sustainability,” he says.
