Why Artificial Intelligence Is Not Green—Yet

06.27.2025

By Nurul Rakhimbek, President, Center for Global Civic and Political Strategies

Artificial Intelligence (AI) is often portrayed as something abstract and immaterial, an intangible marvel floating in the “cloud.” That perception is deceptive. Every generated response or algorithmic suggestion lands in a highly material reality: massive data centers that consume vast amounts of electricity, cooling systems that drain freshwater supplies, and mining operations that extract finite natural resources, while much of the electricity powering AI still comes from burning fossil fuels. This means AI indirectly contributes to greenhouse gas emissions, intensifying climate change. AI is far from environmentally neutral. At best, it carries a soft environmental footprint, pink, perhaps. At worst, it bleeds deep red, marked by carbon emissions, resource depletion, and ecological damage.

At the heart of this concern is energy consumption. Training a model like GPT-4 is no trivial feat: it means months of continuous computing on GPU clusters, drawing power comparable to that of small cities. A 2019 study from the University of Massachusetts Amherst estimated that a single large model training run can emit around 284,000 kg of CO₂, roughly the lifetime emissions of five gasoline vehicles. And that figure excludes everything that happens afterward: inference, re-training, scaling, and more. In North America, data center demand doubled in 2023, from 2,688 MW to over 5,300 MW, largely driven by AI. Globally, data centers consumed 460 terawatt-hours in 2022, and projections show that figure could double by 2026, making them among the world’s most voracious energy consumers.

AI’s ingestion of energy continues well after training. Every AI-powered action, whether a voice-assistant request, a real-time sensor reading in an autonomous car, or an image-generation prompt, triggers a burst of computation. A single ChatGPT query, for example, can consume roughly five times more energy than a standard web search. Multiply that by 164 million monthly users, and the result is emissions equivalent to more than 260 transatlantic flights every month.

And that’s just electricity. Cooling these clusters demands water. In 2023, Google consumed 6.1 billion gallons of potable water across its data centers, enough to irrigate roughly 41 golf courses in the dry regions of the U.S. Training GPT‑3 alone is estimated to have required 700,000 liters of water, about the same as manufacturing 320 Tesla EVs.

But the physical toll doesn’t stop there. AI hardware depends on critical minerals such as lithium, cobalt, and nickel, sourced through mining operations that are often environmentally destructive and entangled in complex social and geopolitical challenges, from the Democratic Republic of Congo to Kazakhstan. A single AI chip can require some 1,400 liters of water and 3,000 kWh of electricity in its manufacture alone. And because fast-paced innovation keeps hardware lifespans short, equipment turns over constantly, feeding a mounting e-waste crisis: in 2022 alone, 62 million tonnes of electronic waste were generated, yet only about 25% was properly recycled. Much of the remainder was exported to low-income countries where environmental regulations are weak or poorly enforced, and where improper handling contaminates soil, air, and water.

Another emerging challenge is grid stability. In some regions, AI clusters are pushing local utilities to capacity. In response, some municipalities have begun placing moratoriums on new data center permits, citing concerns that local power grids are reaching their limits, underlining the growing friction between rapid technological expansion and the capacity of public infrastructure.

The ecological impacts of AI infrastructure also extend beyond energy use. The construction and operation of data centers, AI research hubs, and semiconductor fabrication plants require significant land, often resulting in habitat loss, deforestation, and soil degradation. Additionally, the transportation of hardware components and cooling materials—such as water or liquid nitrogen—generates substantial indirect emissions through global supply chains.

Yet all of this unfolds under a cultural ethos of bigger is better: bigger models, more parameters, more compute. And while performance metrics abound (accuracy, speed, benchmark scores), environmental metrics are largely neglected. Few AI benchmarks account for emissions, energy, or water use, leaving these costs invisible to consumers and investors alike.

But AI’s energy use is not fixed. Innovation plays a critical role in reducing resource dependence. Advances in hardware efficiency, improved cooling technologies, and smarter algorithms can lower energy consumption per task. Furthermore, shifting AI’s power supply toward clean energy sources—such as nuclear, renewables, and next-generation natural gas—can significantly mitigate its environmental impact. The first step is clarity: AI today is not green. Not yet. It is pink or red, and growing redder. And unless we rewrite its narrative, our digital future will inherit a scarred planet.
