AI’s growing appetite: How Big Tech’s energy demands could reshape our future

27 August 2025 03:32

Artificial intelligence has earned a reputation as an energy-hungry, water-thirsty technology, with critics warning it could become the next big climate villain.

Reports that a single question to ChatGPT consumes 10 times more electricity than a Google search have added to those fears. But in a recent Washington Post analysis, Michael J. Coren, the paper’s Climate Coach advice columnist, asks whether AI really deserves its image as an environmental monster, and what the data actually tells us.

He draws a comparison to a similar panic five years ago, when a French think tank claimed that streaming a half-hour Netflix show produced as much carbon dioxide as driving four miles. That estimate, widely cited at the time, proved wildly inflated. Subsequent analysis from International Energy Agency experts showed emissions were 25 to 53 times lower. Coren suggests we may be in a similar moment with AI: grappling with headline-grabbing numbers that may not capture the full picture.

The numbers: Falling, but still murky

AI queries do consume more power than conventional search, though estimates vary. In 2023, researcher Alex de Vries calculated that a ChatGPT request used 23–30 times the energy of a Google search — about 7 to 9 watt-hours (Wh), the same as keeping a lightbulb on for an hour.

The International Energy Agency later put the figure at 2.9 Wh, or roughly 10 times a Google search. Most recently, AI research firm Epoch AI suggested just 0.3 Wh per ChatGPT query, and Google confirmed Gemini responses are slightly lower still.

The wide discrepancies highlight both the lack of transparent, independently verified data and the rapid efficiency gains being made. Google claims its AI emissions have fallen by a factor of 44 in just a year.
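The spread between these estimates can be made concrete with a quick calculation. The sketch below expresses each cited figure as a multiple of a conventional Google search; the ~0.29 Wh baseline is not stated in the article but is inferred from the IEA ratio (2.9 Wh ≈ 10 times a search), so treat it as an assumption.

```python
# Per-query energy estimates cited above, expressed as multiples of a
# conventional Google search. The baseline is an assumption inferred from
# the IEA figure (2.9 Wh ~ 10x a search).
GOOGLE_SEARCH_WH = 0.29  # assumed Wh per Google search

estimates_wh = {
    "de Vries (2023)": 8.0,  # midpoint of the 7-9 Wh range
    "IEA": 2.9,
    "Epoch AI": 0.3,
}

for source, wh in estimates_wh.items():
    multiple = wh / GOOGLE_SEARCH_WH
    print(f"{source}: {wh} Wh, about {multiple:.0f}x a Google search")
```

On these assumed numbers, the de Vries midpoint lands inside his own 23–30x range, while the Epoch AI figure implies an AI query costing about the same as a search — which is exactly why the discrepancy matters.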

For individual users, the climate impact is negligible: eight daily AI queries over a year add up to less than 0.003 percent of an American’s annual carbon footprint. Even water consumption, another source of alarm, is minimal compared with agriculture: ChatGPT uses about one-fifteenth of a teaspoon of water per response, while a hamburger requires 660 gallons to produce.
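The "less than 0.003 percent" claim can be roughly checked with back-of-envelope arithmetic. The per-query energy below uses the Epoch AI estimate; the grid carbon intensity and the US per-capita footprint are assumed typical values, not figures from the article.

```python
# Rough check of the "less than 0.003 percent" claim, under stated assumptions.
WH_PER_QUERY = 0.3          # Epoch AI estimate, Wh per ChatGPT query
QUERIES_PER_DAY = 8
GRID_KG_CO2_PER_KWH = 0.4   # assumed average US grid intensity
US_FOOTPRINT_KG = 16_000    # assumed ~16 tonnes CO2 per American per year

annual_kwh = QUERIES_PER_DAY * 365 * WH_PER_QUERY / 1000  # ~0.9 kWh/year
annual_kg = annual_kwh * GRID_KG_CO2_PER_KWH              # ~0.35 kg CO2/year
share = annual_kg / US_FOOTPRINT_KG * 100                 # percent of footprint
print(f"AI queries: {share:.4f}% of the annual footprint")
```

Under those assumptions the result comes out just above 0.002 percent, consistent with the figure Coren cites.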

Where the real problem lies

The problem, Coren emphasises, is not individual use but systemic adoption. AI is being integrated into nearly every aspect of business, governance and daily life — from customer service calls to healthcare to military operations. More advanced use cases, particularly AI-generated video, can be enormously power-intensive: a five-second AI clip consumes nearly 1,000 Wh, equivalent to an e-bike ride of almost 40 miles.
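The e-bike comparison checks out if one assumes a typical consumption of about 25 Wh per mile — a figure not given in the article, added here only to make the equivalence explicit.

```python
# Sanity check on the video-clip comparison: at an assumed e-bike
# consumption of ~25 Wh per mile, 1,000 Wh covers about 40 miles.
CLIP_WH = 1000              # energy for a five-second AI video clip
EBIKE_WH_PER_MILE = 25      # assumed typical e-bike consumption

miles = CLIP_WH / EBIKE_WH_PER_MILE
print(f"Equivalent e-bike range: {miles:.0f} miles")
```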

This surge in demand is driving a rapid expansion of data centers. According to Goldman Sachs, US data centers currently account for about 3 percent of national electricity use but could reach 8 percent by 2030, largely because of AI.

That means utilities must scramble to bring new power online, often defaulting to fossil fuels. In places like Virginia, home to the world’s largest cluster of data centers, residents may face dirtier air and higher utility bills as energy demand soars.

Coren notes that advances in hardware and model design have slashed AI’s per-query energy costs, with MIT’s Vijay Gadepally estimating efficiency tweaks could cut global data center emissions by as much as 80 percent in some cases. Yet history shows that efficiency often leads to greater overall consumption, a phenomenon known as the Jevons Paradox.

Just as improvements in steam engines fueled more coal use in the Industrial Revolution, cheaper AI services may trigger an explosion of new applications, offsetting efficiency gains.

Future powered by AI and fossil fuels?

Industry leaders seem undeterred. OpenAI’s $500 billion “Stargate Project” aims to build up to 20 new data centers, one with a 360-megawatt natural gas plant attached. Microsoft CEO Satya Nadella has predicted AI will become a commodity we can’t get enough of.

OpenAI’s Sam Altman has even claimed that “intelligence too cheap to meter” is within reach — echoing the 1954 nuclear energy promise that electricity would someday be so abundant it wouldn’t need to be billed, a forecast that never came true.

For individuals, Coren concludes, AI is a minor contributor to personal emissions compared with choices about diet, commuting and home energy use.

But collectively, the technology is reshaping global energy demand. The question is not whether AI is more efficient than before — it clearly is — but whether society can expand its use without locking itself into a dirtier energy future.

By Sabina Mammadli

Caliber.Az