The Carbon Reckoning: AI's Environmental Footprint Could Exceed All Current Data Centers by 2028

US data centers now consume more than 200 terawatt-hours of electricity annually, with carbon emissions to match. Is AI to blame? Analysts say yes.

Photo by Kouji Tsuru / Unsplash

Artificial intelligence has emerged as one of the most transformative technologies of our time, revolutionizing industries from healthcare to finance. Yet beneath the promise of innovation lies an inconvenient truth: AI's computational demands are creating an unprecedented strain on global energy systems and environmental resources.

As billions of users engage with chatbots, image generators, and AI-powered applications daily, the energy consumption and carbon emissions generated by these systems pose a critical challenge to climate goals.

This report examines the multifaceted environmental impact of AI infrastructure, from the electricity powering data centers to the water cooling server farms and the embodied emissions locked into hardware manufacturing.


The Scale of AI's Energy Consumption

The rise of generative AI has fundamentally altered global energy demand patterns. In 2024, data centers in the United States consumed approximately 200 terawatt-hours of electricity, roughly equivalent to Thailand's annual electricity consumption.

More alarming are projections for the coming years: between 2024 and 2028, AI-specific electricity consumption is expected to surge to between 165 and 326 terawatt-hours annually, a range whose high end would exceed all current US data center electricity use for all purposes combined.

This staggering growth could supply electricity to 22% of American households, yet would be devoted almost entirely to artificial intelligence operations.
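A quick sanity check makes the household comparison concrete. In the sketch below, the high-end projection is from this report, but the per-household consumption and household count are assumed round figures, not sourced numbers:

```python
# Sanity check on the "22% of households" claim.
# AI_TWH comes from the report; the household figures are assumptions.
AI_TWH = 326                 # high-end 2028 AI electricity projection (TWh)
KWH_PER_HOUSEHOLD = 10_800   # assumed average annual US household use (kWh)
US_HOUSEHOLDS = 132_000_000  # assumed number of US households

households_powered = AI_TWH * 1e9 / KWH_PER_HOUSEHOLD  # TWh -> kWh, then divide
share = households_powered / US_HOUSEHOLDS

print(f"{share:.0%} of households")
```

With these assumed inputs the calculation lands within a point or two of the report's 22% figure.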

The computational intensity of modern AI systems cannot be overstated. Generative AI applications require approximately 33 times more energy to complete tasks compared to traditional software.

When a user submits a text prompt to systems like ChatGPT, that seemingly simple interaction triggers massive computational chains across distributed data centers. Google estimates that each median text prompt to its Gemini system consumes 0.24 watt-hours of energy, a figure that has improved dramatically through optimization efforts. However, multiplied across billions of daily queries globally, these individually small demands aggregate into staggering totals.
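The aggregation effect is easy to see with back-of-envelope arithmetic. The per-prompt energy figure below is Google's published estimate; the daily query volume is a hypothetical round number chosen for illustration:

```python
# Back-of-envelope: aggregate tiny per-prompt energy into annual totals.
WH_PER_PROMPT = 0.24           # Google's median Gemini text prompt (Wh)
DAILY_PROMPTS = 1_000_000_000  # hypothetical: one billion prompts per day

daily_kwh = WH_PER_PROMPT * DAILY_PROMPTS / 1_000  # Wh -> kWh
annual_gwh = daily_kwh * 365 / 1_000_000           # kWh -> GWh

print(f"Daily:  {daily_kwh:,.0f} kWh")
print(f"Annual: {annual_gwh:,.1f} GWh")
```

Even at a quarter watt-hour per prompt, a billion daily prompts works out to roughly 240 megawatt-hours per day, on the order of 90 gigawatt-hours per year, before counting image generation, training, or idle capacity.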

The infrastructure supporting this compute is equally demanding. New AI data centers require between 100 and 1,000 megawatts of power, equivalent to the electricity demands of medium-sized cities.

Companies like Google, Microsoft, and Amazon have accelerated their data center expansion programs at historic rates. These facilities operate continuously, consuming power regardless of whether systems are actively processing queries or sitting idle waiting for traffic spikes.

Carbon Emissions and Grid Challenges

The relationship between electricity consumption and carbon emissions depends heavily on how that electricity is generated. This is where regional differences matter profoundly. In the United States, fossil fuels still provide nearly 60% of electricity supply, with natural gas and coal dominating the mix.

While nuclear accounts for approximately 20% and renewables comprise the remainder, the transition toward cleaner energy has not kept pace with AI's explosive growth.

The carbon intensity of AI's electricity footprint reveals a troubling pattern. Large AI models like GPT-3 generated roughly 500 metric tons of carbon dioxide during training alone, equivalent to driving a car from New York to San Francisco approximately 438 times.

Current estimates suggest AI-related emissions exceed 300 million tons annually and are projected to grow significantly throughout this decade. The International Energy Agency estimates that data centers currently account for 1–1.5% of global electricity use and approximately 1% of worldwide energy-related carbon dioxide emissions. Of this, AI is responsible for roughly 15%, a figure that continues rising.

Major technology companies have witnessed dramatic increases in their carbon footprints as AI workloads multiply. Google's emissions rose 48% over the five-year period ending in 2024, while Microsoft's grew 23.4% since 2020, with AI and cloud computing cited as primary drivers.

Companies frequently claim "carbon neutrality" through purchased clean power credits, yet their actual operational emissions in specific regions often remain unreported. This lack of transparency obscures the true environmental impact.

The grid stress from AI infrastructure deployment creates additional problems. When regional utilities cannot supply adequate clean electricity, some have resorted to restarting retired coal plants to meet data center power demands.

In one notable case, Elon Musk's X supercomputing center near Memphis was documented via satellite imagery using dozens of methane gas generators to supplement grid power, raising clean air concerns.


Water Consumption and Environmental Justice

Beyond carbon emissions, AI infrastructure poses a severe water consumption challenge. Data centers require substantial water for cooling servers that generate enormous amounts of waste heat.

US data centers consume approximately 7,100 liters of water for each megawatt-hour of electricity they use. Google's US data centers alone consumed an estimated 12.7 billion liters of fresh water in 2021.
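The per-megawatt-hour figure translates directly into facility-level footprints. In the sketch below, the water intensity comes from this report, while the 100-megawatt facility size and full-utilization assumption are hypothetical:

```python
# Water footprint sketch using the report's 7,100 L/MWh average.
LITERS_PER_MWH = 7_100    # US data-center average (report figure)
FACILITY_MW = 100         # hypothetical mid-size AI data center
HOURS_PER_YEAR = 8_760    # assumes continuous full-power operation

annual_mwh = FACILITY_MW * HOURS_PER_YEAR
annual_liters = annual_mwh * LITERS_PER_MWH

print(f"{annual_liters / 1e9:.1f} billion liters per year")
```

A single 100-megawatt facility at this water intensity would consume on the order of six billion liters a year, roughly half of Google's entire reported 2021 US fleet consumption.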

Projections paint an alarming picture for water resources. Between 2024 and 2030, AI server deployment across the United States could generate annual water footprints ranging from 731 million to 1.125 billion cubic meters, depending on expansion scenarios. In regions already experiencing water stress due to climate change, this additional demand threatens both local communities and agricultural systems.

The problem becomes particularly acute in arid regions where data centers are often sited due to real estate costs and cooling availability, directly competing with human consumption and irrigation needs.

This creates an environmental justice dimension. Ratepayers in areas hosting new data centers often bear increased electricity costs without directly benefiting from AI operations.

A 2024 Virginia legislature estimate suggested average residential ratepayers could face additional charges of $37.50 monthly due to data center energy costs. Communities near infrastructure developments face air quality impacts from power generation and cooling systems, while receiving minimal compensation or community benefit agreements.

The Hardware Manufacturing Footprint

A frequently overlooked aspect of AI's environmental impact involves embodied emissions—the carbon emissions embedded in the manufacture, transportation, and disposal of physical hardware.

The production of graphics processing units (GPUs) represents a significant portion of this footprint. The fabrication process for GPUs is substantially more energy-intensive than for simpler processors, and involves mining toxic materials and processing with hazardous chemicals.

GPU shipments to data centers increased from 2.67 million units in 2022 to 3.85 million in 2023, with further growth expected.

Embodied emissions from data center construction, including concrete, steel, and IT hardware, can represent one-third to two-thirds of a facility's overall lifetime emissions.

Scope 3 greenhouse gas emissions from data centers, encompassing embodied components, demonstrate that operational energy comprises only part of the total climate impact.

Material extraction, component transportation, facility construction, and eventual hardware decommissioning all generate measurable emissions. As operational energy sources become cleaner through renewable energy adoption, embodied emissions become proportionally more significant in the overall carbon calculation.

The e-waste problem compounds as hardware reaches end-of-life. Rapid deployment cycles mean servers and GPUs face obsolescence while still physically functional.

Responsible recycling and recovery of valuable materials requires infrastructure investments that remain inconsistent globally, particularly in developing nations where much e-waste is currently shipped.


Geographic Disparities and Carbon Intensity

The location of AI training and inference operations significantly influences environmental impact. Analysis of 369 generative AI models released between 2018 and 2024 reveals critical geographic patterns. The United States and China host the majority of large-scale AI systems, yet these regions carry carbon intensities of 0.379 and 0.544 kilograms of CO2 per kilowatt-hour, respectively.

In contrast, nations like Sweden and the United Kingdom, with carbon intensities of 0.05 and 0.27 kg/kWh respectively, host minimal large-scale AI development despite having substantially cleaner grids.

This geographic concentration means that higher computational loads occur precisely where grid electricity carries the highest carbon content. Shifting AI training and inference operations to low-carbon regions could dramatically reduce emissions, yet business incentives, infrastructure availability, and geopolitical considerations often override environmental optimization.
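The leverage of siting decisions can be illustrated by applying the regional carbon intensities cited in this report to a single hypothetical workload:

```python
# Same workload, different grids: emissions = energy * grid carbon intensity.
# Intensities (kg CO2/kWh) are from the report; the 1 GWh training run
# is a hypothetical workload chosen for comparison.
INTENSITY = {"US": 0.379, "China": 0.544, "Sweden": 0.05, "UK": 0.27}
TRAINING_KWH = 1_000_000  # hypothetical 1 GWh training run

tons_by_region = {
    region: TRAINING_KWH * kg_per_kwh / 1_000  # kg -> metric tons
    for region, kg_per_kwh in INTENSITY.items()
}

for region, tons in tons_by_region.items():
    print(f"{region}: {tons:,.0f} t CO2")
```

The same gigawatt-hour of training emits roughly ten times more carbon on China's grid than on Sweden's, which is the entire argument for location-aware scheduling.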

China alone generates 54.4% of global AI-related carbon emissions despite not necessarily having the highest computational loads, a disparity driven primarily by regional grid carbon intensity.


Industry Responses and Efficiency Gains

Major technology companies acknowledge AI's sustainability challenges and have implemented various mitigation strategies. Google reports achieving a 33-fold improvement in energy efficiency and a 44-fold reduction in total carbon footprint for median Gemini prompts over a 12-month period, driven by hardware innovations, software optimization, and data center efficiency improvements.

Microsoft has committed to carbon-negative operations by 2030 through renewable energy procurement and carbon offset programs. AWS aims for 100% renewable energy by 2025 and provides sustainability calculation tools for customers.

These improvements stem from innovations across multiple domains. Chipmakers have developed architectures incorporating more memory directly onto chips and hard-wiring basic calculations, achieving efficiency gains reaching 96% in some cases.

Server designs have been optimized to minimize internal data transfers, reducing unnecessary energy consumption. Power usage effectiveness (PUE), the ratio of total facility power to IT equipment power, has improved substantially through better cooling design and operational optimization.
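PUE itself is a simple ratio. A minimal sketch, with hypothetical meter readings, shows how non-computing overhead surfaces in the metric:

```python
# PUE: total facility power divided by IT equipment power.
# A perfectly efficient facility would score 1.0; the gap above 1.0
# is cooling, power delivery, and other overhead.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: 1,200 kW total draw, 1,000 kW of it reaching servers.
print(pue(1_200, 1_000))  # 1.2, i.e. 20% overhead
```

Modern hyperscale facilities report PUE values near 1.1, while older enterprise data centers often sit well above 1.5.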

Corporate power purchase agreements with renewable energy providers represent another strategy. Technology companies had contracted over 35 gigawatts of clean electricity by the end of 2022, with additional agreements announced subsequently.

Google Cloud matches 100% of operational energy with renewable energy purchases, while Azure commits to large-scale renewable procurement to achieve decarbonization targets.

However, these efficiency gains must be contextualized against rapidly expanding absolute demand. Improvements in per-query emissions mean little if total query volumes grow by orders of magnitude, potentially negating or reversing aggregate progress.


Regulatory and Accountability Frameworks

Regulation of AI's environmental impact remains nascent. In early 2024, lawmakers in both chambers of Congress introduced the Artificial Intelligence Environmental Impacts Act, which sought to establish EPA-led studies of AI's environmental footprint and develop measurement standards through NIST.

A January 2025 Executive Order directed the Department of Energy to draft comprehensive reporting requirements for AI data centers covering entire lifecycle impacts, including embodied carbon, water usage, and waste heat management.

The European Union's forthcoming AI Act requires large AI systems to report energy consumption, resource use, and lifecycle impacts. Internationally, the ISO is developing standards for sustainable AI accounting, focusing on energy, water, and materials metrics. However, most existing regulations address broader data center infrastructure rather than AI specifically.

The lack of mandatory disclosure requirements means that transparency about AI's environmental impact remains incomplete. Many technology companies avoid publicly reporting the specific emissions associated with their AI operations, citing proprietary concerns.

This voluntary disclosure system obscures the true scope of the industry's environmental footprint and complicates efforts to establish meaningful climate accountability.


The Path Forward: Mitigation and Sustainability

Achieving sustainable AI development requires multifaceted approaches addressing both technological and policy dimensions. Energy efficiency must remain paramount, with continuing innovation in chip design, software optimization, and data center operations.

Equally critical is accelerating regional grid decarbonization, particularly in areas hosting major AI infrastructure. This requires coordinated investments in renewable energy generation, grid modernization, and energy storage systems to handle AI's dynamic power demands.

Embodied emissions reduction necessitates industry commitments to low-carbon materials for data center construction, modular designs extending hardware lifecycles, and robust recycling infrastructure.

Companies must prioritize developing AI applications that deliver genuine value justifying their energy consumption, while exercising restraint regarding frivolous applications.

Water stress demands location-conscious data center planning, prioritizing regions with water abundance and considering cooling technology alternatives to traditional water-based systems. Environmental justice requires inclusive planning processes engaging affected communities and establishing benefit-sharing mechanisms.

Perhaps most fundamentally, the AI industry must internalize environmental costs within its decision-making calculus. As AI systems become increasingly embedded in global infrastructure, their sustainability cannot remain an afterthought or marketing claim. True decarbonization of AI requires commitment matching the intensity applied to performance optimization.


Fast Facts

How much carbon does a single AI query produce, and is it significant?

Individual AI queries produce measurable but small emissions: around 0.03 grams of CO2 equivalent for a median text prompt to advanced systems like Google's Gemini, based on 2025 estimates, along with roughly five drops of water for cooling. At the individual level the impact appears negligible; the significance comes from aggregation across billions of daily queries.

Why can't AI companies just use more renewable energy to solve the environmental problem?

While renewable energy adoption helps considerably, it addresses only operational emissions, not embodied emissions from manufacturing and construction. Furthermore, renewable energy development cannot keep pace with AI's explosive demand growth.

Is there a genuine tension between AI innovation and climate goals, or can both be achieved simultaneously?

A real tension exists, though not necessarily an unresolvable one. Larger AI models generally deliver superior performance but demand proportionally greater computational resources. Continuing to scale model size without constraint would likely make climate goals unattainable given current energy infrastructure trajectories.