Artificial Intelligence

We power AI - from grid to core

The rapid advances in artificial intelligence (AI) applications have drastically increased the energy demand in data centers. This introduces a paramount challenge: improving the scalability of AI technologies while maintaining environmental responsibility. This article provides an overview of solutions to address this challenge.


The importance of data in Artificial Intelligence (AI) cannot be overstated. AI, and machine learning in particular, relies heavily on vast quantities of data to train its models, spanning a wide range of formats including text documents, images, and sensor readings such as temperature and humidity. By analyzing this data, AI systems identify patterns and relationships, which they then use to make predictions, reach decisions, and generate outputs.

The type of data used to train an AI system is closely tied to the specific task for which the AI is being developed. For instance, a text generation AI system, such as a large language model, requires large amounts of text data to function effectively, whereas predictive analytics around road traffic might rely on sensor data to make accurate predictions. Ultimately, the quality and quantity of the data used by an AI system have a direct impact on its accuracy, reliability, and performance.

Data centers play a critical role as the backbone of Artificial Intelligence: they must process these immense streams of data around the clock. AI, and generative AI in particular, will accelerate this data growth, and this ever-increasing demand requires seamless connectivity, higher bandwidth, wide-area coverage, and substantial computational power.

The rapid advancements in artificial intelligence (AI) applications have drastically increased the power demands within data centers. Training and executing AI models consumes energy because of the computing power needed for the complex calculations of machine learning algorithms. The computing power used to train modern AI models has doubled approximately every 3.4 months since 2012. This exponential growth in computing requirements translates directly into higher energy requirements and increases the overall load on the grid. The more complex the calculations, the more energy is required. This is not just about the power consumed by the processors themselves, but also the infrastructure required to support them, including cooling systems and power supply networks.
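To put the cited doubling time into perspective, the short sketch below projects the compound growth in training compute. It assumes the ~3.4-month doubling figure mentioned above holds steadily; the function name is illustrative, not a reference to any real tool.

```python
# Illustrative only: project the growth in AI training compute,
# assuming the ~3.4-month doubling time cited in the text holds steadily.
def compute_growth(years: float, doubling_months: float = 3.4) -> float:
    """Return the multiplicative growth in training compute after `years`."""
    return 2 ** (years * 12 / doubling_months)

# Over a single year, compute demand grows by roughly 11x.
print(f"Growth after 1 year: {compute_growth(1):.1f}x")
```

At that rate, compute demand grows by more than a factor of ten every year, which is why energy efficiency gains in hardware alone cannot keep pace.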

Simply put, there is no AI without power.

Technical insights

The training of ever larger AI models in huge data centers requires ever more powerful computing capabilities and the clustering of as many as 100,000 processors into one virtual machine. This poses challenges at three levels:

  • Powering modern processors at ever higher load currents and strong transient load steps. We are expecting up to 10,000 amperes per processor within this decade, a tenfold increase from the requirements seen today.
  • Powering AI server racks at power levels beyond 1 MW. This is again a tenfold increase in rack power from the current state of the art.
  • Powering entire data centers at power levels in the GW scale, requiring a different infrastructure and novel ways of power distribution across the data center. Furthermore, as data centers are turning into substantial consumers of electricity, buffering of load profiles and provision of ancillary grid services become a necessity.
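The first of these challenges can be made concrete with a back-of-the-envelope calculation: conduction loss in a power delivery network scales with the square of the current, so a tenfold increase in load current means a hundredfold increase in loss for the same resistance. The resistance value below is a hypothetical figure for illustration, not a measured one.

```python
# Illustrative sketch: why rising processor load currents put pressure on
# the power delivery network (PDN). Conduction loss scales as I^2 * R.
def pdn_loss_watts(current_a: float, resistance_ohm: float) -> float:
    """I^2 * R conduction loss for a given load current and PDN resistance."""
    return current_a ** 2 * resistance_ohm

# Hypothetical PDN resistance of 50 micro-ohms:
r = 50e-6
print(pdn_loss_watts(1_000, r))   # ~1,000 A today    -> 50 W lost
print(pdn_loss_watts(10_000, r))  # 10,000 A projected -> 5,000 W lost (100x)
```

This quadratic scaling is one reason the industry is moving toward lower-resistance delivery paths and higher distribution voltages closer to the processor.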

Read our whitepaper in which we share our insights on some likely scenarios in the future of AI power management. We examine how changes in architecture, quality, efficiency, thermal requirements, and energy availability will shape the landscape. Our analysis aims to provide a clear understanding of the critical trends in this evolving field.

Read the complete document at www.infineon.com/aipredictions

Behind the brilliance of AI lies a computationally and power-intensive process - with a staggering carbon footprint. As we expand AI capabilities, we need to be aware of the massive energy consumption of AI data centers. Additionally, AI data centers require extensive cooling mechanisms to prevent overheating, which often leads to substantial water usage. The water consumption for cooling these massive data centers can rival that of small cities, putting additional pressure on local water resources.


Given these dynamics, we must focus on enhancing energy efficiency. This means for example developing more efficient AI algorithms or optimizing data center infrastructure by implementing innovative and energy efficient power management solutions that significantly reduce power delivery network losses. Addressing these challenges is essential for environmental sustainability and ensuring the economic viability of scaling up AI technologies.

In this episode of Podcast4Engineers, host Peter Balint explores the intricate demands of powering AI with guest Fanny Bjoerk, Director Global Application Marketing for Datacenter Power Distribution. They discuss the exponential rise in power requirements of hyperscale data centers driven by the AI revolution, and the critical role of renewable energy in meeting these demands sustainably. The episode delves into the challenges and solutions for integrating renewable energy sources.


There are many possible solutions to this new challenge. What is key: solutions must cover energy conversion from the grid entering the data center to the core, the AI processor.

Our innovative portfolio of power semiconductors includes solutions ranging from the grid entering the data center to its core, the AI processor. Examples of such applications include top-of-rack switches, power supply units, battery backup units, DC-DC conversion for networking and computing (such as 48 V intermediate bus converters), and protection. Additionally, with our novel power system reliability modeling solution, data centers can maximize power supply reliability and uptime, enabling real-time power supply health monitoring based on dynamic system parameter logging.
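To illustrate why system-level reliability modeling matters in such a chain, the sketch below shows the simplest textbook model: a series system, where the power path works only if every stage works, so stage reliabilities multiply. This is a generic illustration under that independence assumption, not a representation of Infineon's actual modeling solution.

```python
# Illustrative sketch of series-system reliability: the power delivery chain
# (e.g. PSU -> bus converter -> point-of-load stage) works only if every
# independent stage works, so the system reliability is the product.
def series_reliability(stage_reliabilities: list[float]) -> float:
    """System reliability as the product of independent stage reliabilities."""
    r = 1.0
    for stage in stage_reliabilities:
        r *= stage
    return r

# Hypothetical per-stage reliabilities over one year of operation:
print(f"{series_reliability([0.999, 0.998, 0.997]):.4f}")  # 0.9940
```

Even with three highly reliable stages, the chain as a whole is noticeably less reliable than any single stage, which is why continuous health monitoring of every conversion stage pays off.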


Worldwide, energy savings of around 48 TWh could be achieved with various types of these advanced power semiconductors. This corresponds to a reduction of more than 22 million tons of CO₂ emissions, according to Infineon analysis.
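The two figures above are consistent under a common grid-average emission factor. The factor used below (~0.46 kg CO₂ per kWh) is an assumption chosen to reproduce the stated numbers, not a value taken from the Infineon analysis.

```python
# Back-of-the-envelope consistency check of the figures above, assuming a
# grid emission factor of ~0.46 kg CO2/kWh (an assumption, not Infineon data).
savings_twh = 48
emission_factor_kg_per_kwh = 0.46

kwh = savings_twh * 1e9                       # 1 TWh = 1e9 kWh
co2_tons = kwh * emission_factor_kg_per_kwh / 1000  # kg -> metric tons
print(f"{co2_tons / 1e6:.1f} million tons CO2")     # ~22.1 million tons
```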

Looking into the future, there are many technological challenges ahead that we need to address, all while continuously enhancing energy efficiency and performance. We need to bring clean and reliable energy to AI data centers. It is about enabling the sustainable growth of AI technologies in a way that is compatible with our environmental responsibilities. After all, there’s no AI without power. This reality drives us to keep advancing our technologies, ensuring that as AI evolves, our solutions for powering it efficiently and effectively evolve as well.

Let’s partner to build the solutions of the future – Your career at Infineon

At Infineon, we power AI – from the grid to the core. Our cutting-edge semiconductor solutions enable efficient, scalable, and sustainable power delivery for hyperscale computing, datacom, and telecom applications. Ready to join the journey? Find more information on your career in Powering AI here.

But that’s not all. Infineon is an exciting home for AI talents worldwide. Have a look at our latest job openings in Artificial Intelligence, Machine Learning, and Data Science, and many more in related fields.
