
Jul 31, 2024

Navigating The Looming AI Energy Crunch

Posted in categories: robotics/AI, sustainability

Brandon Wang is vice president of Synopsys.

The rapid development of AI has fueled significant growth across the computing industry. But it is also driving a steep rise in energy consumption that is pushing us toward an energy crisis. Current AI models, especially large language models (LLMs), need enormous amounts of power to train and run, and AI queries require far more energy than traditional searches; asking ChatGPT a question, for example, consumes up to 25 times as much energy as a Google search. At current rates of growth, AI is expected to account for up to 3.5% of global electricity demand by 2030, roughly twice the annual electricity consumption of France.
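For a rough sense of scale, here is a back-of-the-envelope check of that projection. It is only a sketch: the global and French consumption figures below are approximate assumptions, not numbers from the article.

```python
# Back-of-the-envelope check of the projection above.
# Assumed figures (approximate, not from the article):
#   - global electricity demand: ~27,000 TWh per year
#   - France's electricity consumption: ~450 TWh per year
GLOBAL_DEMAND_TWH = 27_000
FRANCE_DEMAND_TWH = 450
AI_SHARE_2030 = 0.035  # the article's projected 3.5% share for AI by 2030

ai_demand_twh = AI_SHARE_2030 * GLOBAL_DEMAND_TWH
print(f"Projected AI electricity demand: ~{ai_demand_twh:.0f} TWh/year")
print(f"Multiple of France's consumption: ~{ai_demand_twh / FRANCE_DEMAND_TWH:.1f}x")
```

Under those assumptions, the projection works out to roughly 945 TWh per year, a bit over twice France's consumption, which is consistent with the comparison above.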

We need to address this surge in energy demand urgently, before it becomes unsustainable. If we don't, the impact could threaten both sustainable growth and the widespread adoption of AI technologies themselves. Fortunately, there are a number of pathways toward more energy-efficient AI systems and computing architectures.
