Unpacking AI’s energy doomsday narrative.

TLDR:

  • Recent articles from Bloomberg and The Washington Post raise concerns about AI’s energy usage.
  • However, the focus should be on data centers as a whole, not just AI.

AI’s supposed energy apocalypse has been making headlines, with projections from Goldman Sachs and the International Energy Agency (IEA) painting a grim picture of AI’s impact on the power grid. AI models do consume significant amounts of energy, but data centers as a whole were already growing in size and energy usage well before the rise of AI.

The crux of the issue is that most of the growth in data center energy usage occurred before the AI boom. Generative AI tools do add to that consumption, but they remain a small share of data centers’ overall energy usage. Dutch researcher Alex de Vries estimated that by 2027 the AI sector could use up to 134 TWh of electricity per year, roughly 0.5% of global electricity demand.
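
As a quick sanity check on that figure (assuming annual global electricity demand of roughly 27,000 TWh, a ballpark value not given in the article), a few lines of Python confirm the roughly 0.5% share:

    # Back-of-the-envelope check of the de Vries estimate.
    # WORLD_DEMAND_TWH is an assumed ballpark for annual global electricity
    # demand; it is not a figure from the article.
    WORLD_DEMAND_TWH = 27_000
    AI_UPPER_TWH = 134  # de Vries's upper-bound estimate for AI by 2027

    share = AI_UPPER_TWH / WORLD_DEMAND_TWH
    print(f"AI share of global electricity demand: {share:.2%}")  # ~0.50%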

Comparing these numbers to other common uses of electricity, such as PC gaming, shows that AI’s energy consumption is not out of line. Moreover, the projected energy usage of data centers as a whole far exceeds that attributable to AI alone. The key takeaway: while AI does consume energy, it is not the sole culprit behind data centers’ rising energy demands.
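
To make that comparison concrete, here is a minimal sketch using ballpark figures that are assumptions, not numbers from the articles cited: roughly 75 TWh per year is a commonly cited estimate for global PC gaming, and roughly 1,000 TWh per year sits near the upper end of projections for data centers as a whole (including crypto and AI) by the mid-2020s:

    # Illustrative comparison only; the gaming and data center figures below
    # are assumed ballpark values, not numbers from the article.
    ai_2027_twh = 134          # de Vries's upper-bound AI estimate
    pc_gaming_twh = 75         # assumed annual global PC gaming usage
    data_centers_twh = 1_000   # assumed projection for all data centers

    print(f"AI vs. PC gaming: {ai_2027_twh / pc_gaming_twh:.1f}x")                 # ~1.8x
    print(f"AI share of all data centers: {ai_2027_twh / data_centers_twh:.0%}")   # ~13%

On these assumptions, AI’s projected draw is on the same order as PC gaming and only a modest slice of data centers’ total, which is the article’s point.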