- AI’s energy consumption is a growing concern, with data centers accounting for 1 to 1.5 percent of global electricity use.
- The boom in AI could significantly increase this figure. If current trends continue, NVIDIA could ship 1.5 million AI server units per year by 2027.
- These servers could consume at least 85.4 terawatt-hours of electricity annually, more than many small countries.
- AI’s energy-intensive nature is highlighted by the example of Google’s search engine: if every search were served by a generative AI like ChatGPT, Google's energy use would spike, requiring roughly as much power as Ireland.
- Sustainability should be treated as a risk factor for AI, alongside errors, black-box opacity, and discriminatory bias.
- AI’s energy problem has mostly been tackled through hardware optimization, but that approach is nearing physical limits, so algorithmic approaches are being explored instead.
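The 85.4 TWh figure above is consistent with a simple back-of-envelope calculation. A minimal sketch, assuming a per-server draw of about 6.5 kW (roughly an NVIDIA DGX A100 system, a figure not stated in the summary itself) and continuous round-the-clock operation:

```python
# Back-of-envelope estimate of annual AI server electricity use.
# Assumptions (not from the article's text): each server draws ~6.5 kW
# and runs 24/7 at full power.
SERVERS = 1_500_000        # projected annual NVIDIA AI server shipments by 2027
POWER_KW = 6.5             # assumed draw per server (roughly a DGX A100 system)
HOURS_PER_YEAR = 24 * 365  # continuous operation

annual_kwh = SERVERS * POWER_KW * HOURS_PER_YEAR
annual_twh = annual_kwh / 1e9  # 1 TWh = 1e9 kWh

print(f"{annual_twh:.1f} TWh per year")  # → 85.4 TWh per year
```

The real figure is sensitive to the assumed per-server power and utilization, so this should be read as an order-of-magnitude check, not a forecast.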
Read more at: https://www.theverge.com