Artificial Intelligence (AI) is on track to consume more electricity than Bitcoin mining by the end of 2025, potentially accounting for nearly half of global data center energy usage.
Key Findings:
- Rapid Growth in Energy Demand: Research by Alex de Vries-Gao from Vrije Universiteit Amsterdam indicates that AI’s electricity consumption could reach 23 gigawatts (GW) by the end of 2025, comparable to the power usage of the United Kingdom.
- Current Consumption Levels: AI already accounts for up to 20% of data center electricity use.
- Challenges in Data Transparency: Estimating AI’s energy consumption is difficult because tech companies disclose little. De Vries-Gao used a triangulation method, combining chip production data with industry estimates, to approximate AI’s power needs.
- Environmental Concerns: The expansion of AI data centers, particularly in the U.S., has spurred construction of gas and nuclear power plants, raising environmental concerns. Companies like Google and Microsoft report rising carbon footprints linked to AI, though neither breaks out AI-specific figures.
- Efficiency vs. Consumption: Despite advances in AI efficiency, overall energy consumption continues to rise, exemplifying the Jevons paradox: as efficiency lowers the cost of each computation, total usage grows enough to outweigh the savings.
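The triangulation approach described above can be illustrated with a back-of-envelope sketch: multiply deployed accelerator chips by per-chip power, an average utilization factor, and a data center overhead multiplier (PUE). All numbers below are illustrative assumptions, not de Vries-Gao’s actual inputs.

```python
# Back-of-envelope sketch of estimating AI power demand from chip shipments.
# Every figure here is a placeholder assumption for illustration only.

def estimate_ai_power_gw(chips_deployed, watts_per_chip, utilization, pue):
    """Rough AI power draw in gigawatts.

    chips_deployed -- cumulative AI accelerators in service (assumed)
    watts_per_chip -- typical rated power per accelerator, in watts (assumed)
    utilization    -- average fraction of rated power actually drawn (assumed)
    pue            -- power usage effectiveness: cooling/overhead multiplier (assumed)
    """
    total_watts = chips_deployed * watts_per_chip * utilization * pue
    return total_watts / 1e9  # watts -> gigawatts

# Hypothetical inputs: 5 million accelerators at 700 W each,
# 65% average utilization, PUE of 1.2.
print(round(estimate_ai_power_gw(5_000_000, 700, 0.65, 1.2), 2))  # → 2.73
```

The real analysis is more involved (it works backward from chip fabrication capacity and shipment estimates), but the structure of the calculation is the same: hardware counts times power per unit, scaled by utilization and overhead.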
This trend underscores the need for greater transparency in AI energy usage and a shift towards more sustainable AI development practices.
For more details, read the full article on The Verge: AI could consume more power than Bitcoin by the end of 2025.
📢 Disclaimer
This article is based on information from trusted tech news sources. Techwhatif.com summarizes and adapts these insights to make tech news more accessible to our readers. Full credit goes to the original authors and publications. If you're the content owner and have any concerns, please contact us directly for resolution.