

Artificial Intelligence has become the crown jewel of the digital economy. From OpenAI’s ChatGPT to Microsoft’s Copilot and Google’s Gemini, billions of daily interactions are powered by complex algorithms running on massive AI data centres. Yet behind every intelligent output lies a growing web of physical infrastructure — consuming electricity, water and hardware at an accelerating rate. While investors hail AI as a trillion-dollar opportunity, the real costs — measured in megawatts, dollars and environmental strain — are only beginning to surface.
Training and operating modern AI systems are energy-hungry processes. OpenAI’s GPT-4 reportedly required tens of thousands of Nvidia A100 GPUs, consuming roughly 1.5 gigawatt-hours of electricity during training, equivalent to powering 150 average US homes for a year. Even a single ChatGPT prompt consumes about 0.3 watt-hours, nearly ten times that of a Google search. Multiplied across one billion queries daily, that translates to roughly 300 megawatt-hours per day, enough to meet the daily electricity needs of about 10,000 US homes.
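The daily-usage arithmetic above can be sanity-checked with a short back-of-envelope calculation. This is a rough sketch using the article’s own reported figures; the 30 kWh/day per-home figure is an illustrative assumption for an average US household, not a measurement:

```python
# Back-of-envelope check of the daily ChatGPT energy estimate.
# All inputs are the article's reported figures or rough assumptions.

WH_PER_PROMPT = 0.3              # watt-hours per ChatGPT prompt (reported)
PROMPTS_PER_DAY = 1_000_000_000  # ~1 billion queries daily (reported)
HOME_KWH_PER_DAY = 30            # assumed daily use of an average US home

daily_wh = WH_PER_PROMPT * PROMPTS_PER_DAY        # 3e8 Wh
daily_mwh = daily_wh / 1_000_000                  # 300 MWh per day
homes_equivalent = (daily_mwh * 1000) / HOME_KWH_PER_DAY

print(f"{daily_mwh:.0f} MWh/day, roughly {homes_equivalent:,.0f} US homes")
# → 300 MWh/day, roughly 10,000 US homes
```

The figures are internally consistent: 0.3 Wh scaled by a billion queries does come out at 300 MWh, which matches about 10,000 homes at 30 kWh per home per day.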
According to the International Energy Agency, global data centre electricity demand could more than double from 460 terawatt-hours in 2022 to over 1,000 terawatt-hours by 2026, rivalling Japan’s total power consumption. Utilities in the US and Europe warn that surging AI demand may challenge regional grid stability and delay decarbonisation plans.
Behind every teraflop of computation lies another hidden cost: water. A 2023 University of California–Riverside study estimated that training GPT-3 consumed over 700,000 litres of freshwater, roughly the amount used to manufacture 370 BMW cars, or about a quarter of an Olympic-sized swimming pool.
Each user interaction adds to that footprint: roughly 500 millilitres of water for every 20–50 ChatGPT prompts, primarily used for evaporative cooling. In water-stressed regions such as Iowa and Arizona, where Microsoft, Meta and Google operate major AI campuses, rising demand is already prompting higher water tariffs and community concern.
AI facilities differ from conventional cloud data centres in one critical way: density. Racks filled with GPUs draw 50–80 kilowatts each, nearly ten times the power of traditional configurations. Costs therefore scale worse than linearly: hyperscale AI facilities suffer steep efficiency drops, from transformer losses to thermal limits, that make each incremental megawatt more costly to deploy.
The hidden costs extend far beyond electricity. Each Nvidia H100 GPU costs $25,000–$40,000. With hundreds of thousands installed per site and a 2–3-year refresh cycle, hardware replacement alone can exceed $1 billion annually for mega facilities. Retired GPUs and servers require secure recycling, often costing $200–$300 per unit. The cooling demands of 5,000–7,000 litres per megawatt-hour expose operators to drought surcharges and reputational risks.
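The hardware-refresh claim can likewise be checked with rough arithmetic. This sketch uses midpoints of the article’s reported ranges; the fleet size and unit price chosen below are illustrative assumptions, not disclosed figures from any operator:

```python
# Rough check of annual GPU replacement cost for a mega AI facility.
# Inputs are midpoints of the article's reported ranges (illustrative).

GPU_UNIT_COST = 30_000   # USD, midpoint of the $25,000-$40,000 range
GPUS_PER_SITE = 200_000  # "hundreds of thousands installed per site"
REFRESH_YEARS = 2.5      # midpoint of the 2-3-year refresh cycle

fleet_cost = GPU_UNIT_COST * GPUS_PER_SITE   # total fleet outlay
annual_refresh = fleet_cost / REFRESH_YEARS  # annualised replacement cost

print(f"fleet ${fleet_cost/1e9:.1f}B, refresh ${annual_refresh/1e9:.1f}B/yr")
# → fleet $6.0B, refresh $2.4B/yr
```

Even with conservative midpoints, annualised replacement comfortably exceeds the $1 billion figure the article cites.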
Even minor inefficiencies, when scaled across a gigawatt campus, can translate into hundreds of millions of dollars in additional operating and environmental costs. AI’s economic promise must now be matched by accountability — a shift towards Responsible Intelligence.
Data centres should be treated not just as technical assets but as environmental and fiscal entities. Regulators should mandate disclosure of Power Usage Effectiveness (PUE), Water Usage Effectiveness (WUE) and lifecycle carbon data for every major AI model.
Companies must use model compression and knowledge distillation to cut energy per inference by up to 80%, and site new facilities near hydro, solar or nuclear microgrids. Google’s Hamina, Finland, site already repurposes waste heat to warm local homes. Governments, for their part, should extend tax incentives to carbon-neutral facilities, require sustainability disclosures above defined computational thresholds, and educate users to optimise prompts and adopt lightweight assistants, steps that could collectively save gigawatt-hours each year.
Though AI data centres have become the steel mills of the digital age, their exponential costs and externalities reveal a paradox: intelligence that risks outpacing its sustainability. As nations and corporations race towards the 1-gigawatt frontier, success will belong not to those who build the biggest, but to those who can pursue genuinely sustainable intelligence.