By Alexandra Costo

When we open an app, upload a photo, or stream a video, we rarely think about the infrastructure that makes it possible. The “cloud” may sound weightless, but it depends on vast networks of data centers: industrial facilities packed with servers, storage devices, and cooling systems that run around the clock. These centers now power artificial intelligence (AI), streaming, and digital services central to finance, healthcare, and education.

Rapid advances in AI have sharply increased the volume and complexity of data processed, driving a surge in electricity demand. Training OpenAI’s GPT‑4, for example, is estimated to have used about 50 GWh of electricity, according to MIT Technology Review. Based on Environmental Protection Agency figures for average household use, that is roughly equivalent to the annual electricity consumption of 4,000 U.S. homes. Nor is this scale of energy use an outlier: newer, larger AI models demand far more power than earlier generations and could soon push data‑center energy consumption to unprecedented levels.

Surging AI Demand and Strain on Infrastructure

Between 2010 and 2018, demand for digital services soared as online platforms and cloud‑based tools became widely adopted. Data traffic grew fivefold, yet energy use rose only from 92 to 130 TWh thanks to major efficiency gains. More energy‑efficient servers, virtualization (software that lets multiple tasks run on a single machine), and massive data centers pooling workloads for greater efficiency enabled this surge in activity without a comparable rise in energy use. While modest compared with the traffic growth, 130 TWh is still roughly equal to the annual electricity consumption of a mid‑sized industrialized country such as Argentina, home to around 45 million people.

More recently, between 2018 and 2023, demand surged again, largely due to AI, with energy use reaching 300–380 TWh. AI workloads rely on specialized graphics processing units (GPUs) to train and run large language models; these GPUs draw far more power than traditional servers, run almost continuously, and require heavy cooling. The International Data Corporation estimates total data‑center consumption at 350 TWh in 2023, with AI accounting for about 10% of that. By 2028, total consumption is projected to more than double, and AI’s electricity use is expected to quadruple, making AI the fastest‑growing source of data‑center electricity demand, outpacing services like video streaming, cloud storage, and social media. This growth is concentrated among major tech firms: Microsoft and Google have seen their emissions rise by over 20% and nearly 50%, respectively, since the start of this decade, despite their net‑zero pledges.

In response to surging demand for AI services, governments and corporations are racing to scale up infrastructure. In the U.S., the CHIPS and Science Act, enacted by the Biden administration, directed tens of billions of dollars toward semiconductor manufacturing and high‑performance computing projects such as Frontier and Aurora, which require massive, energy‑intensive facilities. Analysts project that investments like these could triple data‑center power demand by 2030. To meet this surge, the Trump administration issued an executive order in July 2025 fast‑tracking permits for large data centers (defined as projects adding 100 MW or more) and prioritizing “dispatchable” power sources: natural gas and coal plants capable of quickly ramping output. While the order aims to accelerate data‑center construction and opens federal land for new projects, it also shortens environmental reviews, raising concerns about oversight and long‑term ecological impact.

Most of this electricity flows through regional grids such as PJM in the Mid‑Atlantic and ERCOT in Texas, which deliver power from multiple plants to homes and businesses. These grids still source about 60% of their electricity from fossil fuels, according to the Energy Information Administration. Globally, the International Energy Agency’s (IEA) 2025 Energy and AI report projects that fossil fuels will continue to supply over 40% of new data‑center electricity through 2030, while renewables, largely from corporate power‑purchase agreements and on‑site projects, are expected to cover about half of added demand. Even with this growth, data‑center emissions are forecast to peak at around 320 million metric tons of CO₂ by 2030, roughly comparable to the current annual emissions of the global aviation sector. For a sector using 3% of global electricity, that is a disproportionately large footprint. And while aviation has begun implementing decarbonization strategies, such as sustainable aviation fuels and efficiency improvements, similar commitments in the data‑center sector remain uneven. Without a faster shift to cleaner power sources, emissions could exceed that projected 2030 peak or reach it sooner than expected.

Who Bears Responsibility?

New technologies and data centers do not emerge in isolation; their creation and deployment must be financed, permitted, and approved. In the U.S., corporate strategies and earlier federal incentives, such as those under the CHIPS and Science Act and the recent executive order, have accelerated data‑center growth. Although these steps will likely support AI‑driven demand, they will also intensify our dependence on fossil fuels.

The IEA’s 2025 Energy and AI report finds that the United States, Europe, and China together consume about 85% of global data‑center electricity, fueled by concentrated tech sectors, large investment capacity, and early AI adoption. In the United States alone, data‑center usage reached about 180 TWh in 2024, nearly 45% of global consumption and just over 4% of total U.S. electricity demand. This figure is projected to rise by more than 30% to 240 TWh by 2028, reaching about 12% of national power use, roughly comparable to the entire U.S. commercial sector’s electricity demand today, according to the Department of Energy’s 2024 report on U.S. Data Center Energy Use.

Solutions: Reporting, Efficiency, and Clean Energy

Identifying responsibility is straightforward: companies and governments hold power, while communities bear the burden of rising emissions. The potential of AI and cloud computing is equally clear: medical breakthroughs, scientific research, and global communication. The challenge lies in managing the energy demand these benefits create. Governments set permitting rules, provide subsidies, and regulate emissions; corporations decide how to design and operate facilities. Together, these choices shape the sector’s overall footprint and determine how quickly it can decarbonize.

Governments and consumers should push companies to adopt energy‑saving measures by demanding transparent emissions reporting. Many operators are already redesigning facilities with liquid cooling, which uses less energy than traditional air‑based systems, and deploying AI to manage workloads so servers run only when needed. Yet the most impactful—and also the most challenging—solution is expanding clean energy on the grid to meet AI‑driven demand.

Emissions Reporting and Efficient Practices

Reducing the climate impact of data centers begins with understanding where emissions occur. The GHG Protocol groups them into three scopes: Scope 1 (direct emissions from owned sources like backup generators), Scope 2 (indirect emissions from purchased electricity), and Scope 3 (all other indirect emissions, including construction and server manufacturing). Scope 3 can account for about 70% of a company’s total emissions, according to the World Economic Forum, yet disclosure remains inconsistent. Many major data-center operators either omit Scope 3 entirely or report only broad estimates that exclude construction and equipment manufacturing.

Clean‑energy accounting is another challenge. Most companies rely on renewable energy credits (RECs), which measure renewable generation over the course of a year rather than when the power is consumed. This allows companies to claim “100% renewable” even if they draw fossil power during off‑peak hours and offset it later. A stronger alternative is 24/7 carbon‑free energy (CFE), which matches every hour of consumption with carbon‑free generation. By aligning energy use with real‑time renewable supply, CFE can reduce dependence on offsets, drive investment in storage and renewables, and encourage efficiency upgrades.
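The gap between annual REC matching and hourly CFE matching can be made concrete with a toy calculation. The numbers below are purely hypothetical and chosen for illustration; they are not drawn from any real facility:

```python
# Toy illustration: annual REC matching vs. hourly 24/7 CFE matching.
# All figures are hypothetical (six sample hours, MWh).

# Electricity consumed by a data center in each hour.
consumption = [100, 100, 100, 100, 100, 100]
# Carbon-free generation procured in each hour:
# abundant solar at midday, almost none at night.
carbon_free = [0, 20, 250, 250, 60, 20]

# Annual (volumetric) matching: total clean MWh vs. total consumed MWh,
# regardless of when either occurred.
annual_match = min(sum(carbon_free) / sum(consumption), 1.0)

# Hourly (24/7) matching: clean supply only counts in the hour it is produced,
# and surplus in one hour cannot offset a shortfall in another.
hourly_match = sum(min(c, g) for c, g in zip(consumption, carbon_free)) / sum(consumption)

print(f"Annual-matched share: {annual_match:.0%}")   # 100%
print(f"Hourly-matched share: {hourly_match:.0%}")   # 50%
```

With these numbers, the operator can claim “100% renewable” under annual accounting even though half of its consumption, mostly at night, was actually met by whatever the grid supplied at the time.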

Combining hourly CFE tracking with detailed Scope 3 reporting through stronger disclosure rules would give customers and investors a clearer view of which companies are genuinely cutting emissions. Equinix, the world’s largest data‑center operator, offers a model: its 2024 sustainability report discloses 1.4 million metric tons of Scope 3 emissions and sets a goal to reduce them by 90% by 2040. To reach this target, Equinix partners with suppliers to share product‑level carbon data, adopt science‑based targets, and prioritize low‑emission building materials and “dematerialized” site designs that use fewer physical resources.

Efficiency improvements complement these reporting efforts. AI‑based load balancing can cut electricity use by up to 30% by activating servers only when needed and routing tasks to efficient hardware, and many of these upgrades can be retrofitted into existing facilities. Cooling offers similar gains: air‑based systems consume about 40% of facility energy, while liquid cooling and free‑air or water‑cooling approaches remove heat more efficiently, operate at higher temperatures, and allow waste heat to be reused.
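The routing idea above can be sketched as a greedy scheduler that fills the most efficient hardware first and leaves the rest powered down. The server names, per‑task power draws, and capacity limit are all hypothetical, and real schedulers weigh many more factors (latency, locality, thermal limits):

```python
# Minimal sketch of energy-aware workload routing (hypothetical values):
# assign each task to the server that completes it with the least energy,
# so less efficient machines stay idle whenever possible.

servers = [
    {"name": "gpu-new",  "watts_per_task": 120},  # efficient hardware
    {"name": "gpu-old",  "watts_per_task": 300},
    {"name": "cpu-node", "watts_per_task": 450},
]

CAPACITY_PER_SERVER = 4  # hypothetical tasks each server can absorb

def route(tasks: int) -> dict:
    """Greedily fill the most efficient servers first and report total power."""
    plan, energy = {}, 0
    for s in sorted(servers, key=lambda s: s["watts_per_task"]):
        take = min(tasks, CAPACITY_PER_SERVER)
        if take:
            plan[s["name"]] = take
            energy += take * s["watts_per_task"]
            tasks -= take
    return {"assignment": plan, "energy_w": energy}

print(route(6))
# {'assignment': {'gpu-new': 4, 'gpu-old': 2}, 'energy_w': 1080}
```

Spreading the same six tasks evenly across all three servers would instead draw 2 × (120 + 300 + 450) = 1,740 W, so the greedy assignment saves roughly a third, which is the kind of gain the load‑balancing figure above refers to.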

Scaling Clean Energy

Efficiency can cut electricity use, while clean energy ensures remaining demand is low‑carbon. However, adopting stricter energy and emissions standards is politically challenging. Such policies raise short‑term costs for companies and consumers, making them unpopular with voters and industry groups, and implementation is complex: tracking emissions requires inter‑agency coordination and reliable data, while clean‑energy mandates demand major grid upgrades.

AI’s environmental footprint would shrink if renewables supplied more grid power, but this is harder than it sounds. Building new wind and solar projects often takes years due to outdated transmission infrastructure and slow regulatory approvals. Although the Biden administration’s Infrastructure Investment and Jobs Act of 2021 and Inflation Reduction Act of 2022 invested billions in renewables, storage, and grid modernization, recent policy rollbacks have slowed progress in the U.S. In March 2025, the Securities and Exchange Commission voted to drop its defense of climate‑risk and emissions disclosure rules, calling them “costly and unnecessarily intrusive.” Then, in July 2025, the U.S. Congress passed the “One Big Beautiful Bill,” accelerating the expiration of clean‑energy tax credits. These retreats weaken incentives for companies to sign long‑term clean‑power agreements and invest in new infrastructure, and they cast doubt on whether the grid can decarbonize fast enough to meet surging demand.

Even without federal mandates, large universities, municipalities, and corporations can help close the gap by sourcing 24/7 carbon‑free energy or partnering with providers that offer real‑time emissions data. Public awareness and investor pressure can also push companies toward more efficient practices, and widespread adoption of these approaches could accelerate market demand for clean‑energy infrastructure.

Choosing a Sustainable Path

The rise of AI and cloud computing is not inherently unsustainable if supporting infrastructure evolves within environmental limits. Achieving this will require choices made today by governments, companies, institutions, and individuals: modernizing grids, scaling clean energy, and demanding transparency. The question is no longer whether cloud computing will grow, but whether it will grow sustainably.

Reader Question

As AI and cloud computing keep growing, should we slow down expansion to make sure it’s powered by clean energy, or move fast and fix the environmental impact later?
