Could the Surge in AI Lead to a Worldwide Energy Crisis?

Saturday, May 17, 2025

The surging energy consumption of AI is becoming a significant issue. The problem is not limited to rising electricity bills: the broader environmental consequences are concerning too, including heavy water use, growing volumes of electronic waste, and a contribution to the very greenhouse gas emissions the world is trying to reduce.

As AI systems become more intricate and increasingly integrated into our daily lives, a critical question arises: can we continue this technological surge without harming the planet?

The computational power required for advanced AI is growing at a staggering rate, reportedly doubling every few months. This is not a gradual increase but an accelerating one, and it threatens to outpace current energy strategies.

To illustrate, AI's impending energy requirements could soon parallel those of whole countries like Japan or the Netherlands, or sizable US states like California. Such figures underline the immense pressure AI might exert on our infrastructure.

The year 2024 saw global energy demand rise by 4.3%, driven significantly by AI development alongside the growth of electric vehicles and expanding industrial activity.

Back in 2022, data centers, AI, and cryptocurrency mining collectively used nearly 2% of the world's electricity: a staggering 460 terawatt-hours (TWh).

By 2024, data centers alone consumed approximately 415 TWh, about 1.5% of global electricity, with usage growing around 12% annually. AI's share was still modest at approximately 20 TWh, or 0.02% of worldwide consumption, but this is expected to increase considerably.

Looking ahead, the projections are alarming. By the close of 2025, AI data centers worldwide might require an additional 10 gigawatts (GW) of power, surpassing the total power capacity of a state like Utah.

By 2026, global data center power consumption could soar to 1,000 TWh, equivalent to Japan's current usage. And by 2027, AI data centers are anticipated to demand 68 GW, similar to California's total power capacity in 2022.

Heading towards the end of this decade, the anticipated figures are even more staggering. Global data center electricity usage is expected to surge to around 945 TWh by 2030, nearly accounting for 3% of global electricity consumption.

Organizations such as OPEC suggest that by then, data center electricity consumption might even triple to 1,500 TWh. Meanwhile, projections from Goldman Sachs indicate a potential 165% increase in global power demand from data centers compared to 2023, particularly with AI-ready centers seeing their requirements quadruple.

There is a possibility that data centers might account for up to 21% of global energy demand by 2030 when considering the energy required to deliver AI services to users.

AI's energy use divides mainly into two parts: the energy needed to train a model, and the energy used to run it once deployed (inference).

Training large-scale models like GPT-4 demands significant energy. For instance, training GPT-3 reportedly used 1,287 megawatt-hours (MWh) of electricity, and GPT-4 is believed to have required around 50 times more.

Though training consumes substantial energy, running these models day to day can account for over 80% of AI's total power usage. Each question asked to ChatGPT is estimated to use roughly ten times the energy of a Google search (approximately 2.9 Wh versus 0.3 Wh).
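The figures above make for a quick back-of-envelope comparison. The following sketch uses only the numbers cited in this article; the daily query volume is a hypothetical assumption for illustration, not a reported figure.

```python
# Back-of-envelope arithmetic from the figures cited above.
GOOGLE_SEARCH_WH = 0.3      # approx. energy per Google search (Wh)
CHATGPT_QUERY_WH = 2.9      # approx. energy per ChatGPT query (Wh)
GPT3_TRAINING_MWH = 1_287   # reported training energy for GPT-3

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"One ChatGPT query ~= {ratio:.1f}x a Google search")

# At a hypothetical 100 million queries per day:
daily_mwh = 100_000_000 * CHATGPT_QUERY_WH / 1e6  # Wh -> MWh
print(f"Daily inference energy: {daily_mwh:.0f} MWh")
print(f"Days of inference to match GPT-3 training: "
      f"{GPT3_TRAINING_MWH / daily_mwh:.1f}")
```

Under that assumed query volume, inference would consume as much energy as GPT-3's entire training run in under a week, which is consistent with the claim that day-to-day operation dominates AI's power usage.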

With the growing popularity of generative AI, there is a rush to create more powerful, and consequently more energy-demanding, data centers.

This raises a fundamental question: can our existing energy systems handle this new demand? We are already navigating through a mix of fossil fuels, nuclear energy, and renewables. To sustain AI's expanding needs sustainably, we require swift advancements and expansion in our energy generation methods.

Renewable sources such as solar, wind, hydro, and geothermal will be critical. In the US, renewables are projected to rise from contributing 23% of energy in 2024 to 27% by 2026.

Major tech companies are making bold commitments. For instance, Microsoft plans to procure 10.5 GW of renewable energy from 2026 to 2030 specifically for its data centers. AI itself could help manage renewable energy more efficiently, potentially reducing energy use significantly by optimizing energy storage and power grids.

However, renewable energy isn't without its challenges. Variability in sunlight and wind poses concerns for data centers which need consistent power supply. Current battery solutions to balance these variations are pricey and space-consuming. Additionally, integrating large renewable installations into current grids can be intricate and time-consuming.

This is why nuclear energy is becoming an attractive option for some, especially as a stable, low-carbon power source for AI's extensive energy demands. It provides reliable, around-the-clock power, which is highly desirable for data centers. There is growing interest in Small Modular Reactors (SMRs) due to their adaptability and enhanced safety features, with companies like Microsoft, Amazon, and Google exploring nuclear options seriously.

Matt Garman from AWS recently emphasized to the BBC that nuclear power is a "great solution" for data centers, describing it as "an excellent source of zero-carbon, continuous power." He highlighted that future energy is a significant focus for AWS.

"We plan for the long term," Garman stated. "We think ahead in our investments. The world will need innovative technologies, and nuclear is likely to be integral, especially looking a decade ahead."

However, the promise of nuclear power comes with complications. Constructing new reactors is time-consuming, costly, and involves navigating complex regulations. Public perception remains wary due to historical incidents, though modern reactors are notably safer.

The rapid evolution of AI creates a misalignment with the lengthy timelines for developing nuclear power facilities. This could lead to a continued reliance on fossil fuels in the interim, which challenges environmental goals. Additionally, positioning data centers near nuclear sites raises concerns about potential impacts on electricity pricing and stability for consumers.

Besides its power consumption, AI's ecological impact extends further. Data centers generate significant heat, requiring large amounts of water for cooling. Typically, about 1.7 liters of water is used for every kilowatt-hour of energy consumed by a data center.

In 2022, Google's data centers reportedly consumed five billion gallons of water, marking a 20% increase from the previous year. Some predictions state each kWh of a data center's usage may require up to two liters for cooling, suggesting that global AI infrastructure could soon be using water comparable to entire nations like Denmark.
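The per-kilowatt-hour water figures quoted above translate into striking annual volumes. The sketch below applies them to a facility of assumed size; the 100 MW average draw is a hypothetical assumption, not a figure from this article.

```python
# Rough cooling-water estimate from the per-kWh figures quoted above.
LITERS_PER_KWH_LOW = 1.7   # typical figure cited in this article
LITERS_PER_KWH_HIGH = 2.0  # upper-end prediction cited in this article

facility_mw = 100          # assumed average draw (hypothetical)
annual_kwh = facility_mw * 1000 * 24 * 365  # MW -> kW, then kWh/year

low = annual_kwh * LITERS_PER_KWH_LOW
high = annual_kwh * LITERS_PER_KWH_HIGH
print(f"Annual cooling water: {low/1e9:.2f} to {high/1e9:.2f} billion liters")
```

Even this single hypothetical facility would use on the order of 1.5 billion liters of water a year, which helps explain how fleet-wide AI infrastructure could approach the water consumption of a small nation.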

The escalating e-waste problem is another consequence. Due to the rapid advancement in AI technology, particularly with specialized equipment like GPUs and TPUs, older systems get discarded more frequently. AI could lead to an annual e-waste total from data centers reaching five million tons by 2030.

Producing AI chips and other essential components for data centers demands natural resources, impacting the environment through mining operations for key minerals like lithium and cobalt.

Even manufacturing a single AI chip requires over 1,400 liters of water and 3,000 kWh of energy. The drive for new hardware in turn pushes for more semiconductor fabrication plants, which often brings further challenges, including increased reliance on gas-fired power.

Moreover, carbon emissions remain a concern. When AI runs on electricity from fossil fuels, it exacerbates the climate crisis. Training a single large AI model can emit as much CO2 as hundreds of households do annually in the US.

Environmental reports from tech giants reveal AI's growing carbon impact. For example, Microsoft’s yearly carbon output increased by about 40% from 2020 to 2023, primarily due to more data centers for AI. Google reported its greenhouse gas emissions rose nearly 50% over five years, with its AI data centers being a significant factor.

Despite this gloomy scenario, innovative solutions could alleviate the situation.

Improving AI algorithms to make them more energy-efficient is crucial. Techniques like "model pruning" (removing redundant parts of a model), "quantization" (using less precise calculations), and "knowledge distillation" (where simpler models learn from complex ones) are being explored. Creating smaller, tailored AI models that perform specific tasks with less power is also key.
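Of the techniques listed above, quantization is the easiest to illustrate. The following is a minimal sketch of symmetric 8-bit linear quantization of a weight array; it is an illustration of the general idea, not the implementation used by any particular framework.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric linear quantization of float weights to int8."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map int8 values back to approximate float weights."""
    return q.astype(np.float32) * scale

w = np.array([0.8, -0.31, 0.05, -1.2], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
print("max abs error:", float(np.abs(w - w_hat).max()))
```

The int8 representation takes a quarter of the memory of float32, and integer arithmetic draws substantially less power per operation, at the cost of a small, bounded rounding error per weight.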

In data centers, strategies like "power capping" (limiting power usage) and "dynamic resource allocation" (real-time allocation of computing power based on availability of clean energy) can significantly enhance efficiency. AI-aware software can also schedule tasks for times when energy is cheaper or cleaner. Moreover, AI technology itself can optimize cooling systems in these centers.
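The "AI-aware" scheduling idea can be sketched simply: given an hourly forecast of grid carbon intensity, run deferrable work in the cleanest contiguous window. The forecast values below are made up for illustration (the midday dip mimics abundant solar generation).

```python
def cleanest_window(forecast, hours_needed):
    """Return the start hour of the contiguous window with the
    lowest total carbon intensity (gCO2/kWh summed over the window)."""
    best_start, best_total = 0, float("inf")
    for start in range(len(forecast) - hours_needed + 1):
        total = sum(forecast[start:start + hours_needed])
        if total < best_total:
            best_start, best_total = start, total
    return best_start

# Hypothetical gCO2/kWh forecast for the next 12 hours
forecast = [450, 430, 400, 320, 250, 180, 160, 190, 280, 360, 420, 440]
start = cleanest_window(forecast, hours_needed=3)
print(f"Schedule the 3-hour job starting at hour {start}")  # hour 5
```

Real schedulers weigh deadlines, electricity prices, and hardware availability alongside carbon intensity, but the core mechanism of shifting flexible load toward cleaner hours is the same.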

Implementing on-device AI can further reduce power needs. By processing AI tasks directly on devices like smartphones, instead of relying heavily on energy-intensive cloud centers, energy consumption can be drastically reduced, emphasizing efficient rather than powerful chip designs.

Regulation plays a critical role. Governments are beginning to see the necessity of holding AI accountable for its environmental and energy influences.

Establishing clear standards for measuring and reporting AI's environmental footprint is vital. Policies encouraging the production of durable and recyclable hardware are essential in reducing e-waste. Systems like energy credit trading could financially incentivize companies to invest in greener AI infrastructure.

Notably, a recent agreement between the United Arab Emirates and the United States to establish the largest AI campus outside the US in the Gulf underscores AI's growing prominence globally. However, it also highlights the urgent need to prioritize energy and environmental concerns in ambitious initiatives.

While AI promises remarkable advancements, its considerable energy requirements pose significant challenges. Projections for future energy use by AI are striking, aligning with consumption levels of entire nations.

To meet this burgeoning demand, a balanced approach utilizing diverse energy sources is necessary. While renewable energy is favorable long-term, it faces challenges in providing consistent power and scaling rapidly. Nuclear energy, particularly SMRs, presents a stable low-carbon alternative gaining traction among major tech companies, but comes with complexities concerning safety, cost, and timelines.

However, electricity isn't the sole concern. The broader environmental impact of AI—including its water usage for cooling, the e-waste generation due to hardware turnover, and the resources consumed in manufacturing—requires holistic consideration to genuinely mitigate its ecological footprint.

Fortunately, promising innovations and ideas are emerging.

Energy-efficient AI algorithm development, smart power management in data centers, AI-aware software load management, and the move towards on-device AI processing are all promising steps towards reduced energy consumption. Furthermore, the growing discourse surrounding AI's environmental impact is fostering important conversations about regulatory measures supporting sustainability.

Successfully tackling AI's energy and environmental challenges necessitates collaborative efforts from researchers, the tech sector, and policymakers, with urgency.

Prioritizing energy efficiency in AI development, investing in sustainable energy solutions, managing lifecycle hardware responsibly, and instigating favorable policies can unlock AI's tremendous potential without jeopardizing our planet.

The race to advance AI must simultaneously strive for sustainability.
