- The carbon footprint of server farms used to power generative AI could be problematic.
- AI requires significant amounts of computing power, which consumes electricity at a large scale and results in CO2 emissions.
- In a world where AI is ubiquitous, the environmental impact could be significant.
- A study estimated that training a single large language model produces about 300,000 kg of CO2 emissions.
- The inference phase of AI, where users interact with the technology, also has a carbon footprint.
- Generative tasks in AI, such as text and image generation, are more energy- and carbon-intensive than discriminative tasks.
- Training AI models remains more carbon-intensive than using them for inference: one study estimated that a large AI model would need about 204.5 million inference interactions before its inference emissions matched those of training, doubling its total carbon footprint.
- One hope is that generative AI will decline in popularity, reducing its impact on the planet.
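The training-versus-inference comparison above implies a back-of-the-envelope per-inference figure. As a purely illustrative sketch, and assuming (which the source does not state) that the ~300,000 kg training estimate and the 204.5 million break-even interaction count refer to the same model:

```python
# Illustrative back-of-the-envelope estimate only: the two figures below
# come from different studies and may not describe the same model.
TRAINING_CO2_KG = 300_000            # estimated CO2 from training one large model
BREAK_EVEN_INFERENCES = 204_500_000  # inferences needed to match training emissions

# If inference emissions equal training emissions after that many
# interactions, each interaction accounts for this many grams of CO2.
per_inference_g = TRAINING_CO2_KG * 1_000 / BREAK_EVEN_INFERENCES
print(f"~{per_inference_g:.2f} g CO2 per inference")
```

Under these assumptions, each inference would account for roughly 1.5 g of CO2 — small individually, but significant at the scale of billions of daily interactions.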