Altman Dismisses AI Water Use Myths, Calls for Energy Shift
In a recent talk at a major AI summit in India, OpenAI CEO Sam Altman directly confronted widespread claims about the environmental toll of artificial intelligence. He labeled online assertions that a single ChatGPT query consumes gallons of water as "completely untrue" and "totally insane," attributing such figures to outdated data center cooling methods no longer in common use.
Altman did, however, validate broader concerns about the sector's total energy consumption as global AI use expands. "It's fair to worry about the energy consumption — not per query, but in total," he stated. His proposed solution is a rapid, large-scale transition to nuclear, wind, and solar power to meet this new demand.
The discussion turned to efficiency comparisons between humans and machines. Responding to a question about the energy cost of a ChatGPT query, Altman argued that the typical framing is flawed. He suggested a more equitable measure: the energy a trained AI needs to answer a question versus the energy a human uses to do the same. By that metric, he posited, AI has likely already achieved parity or better, noting the immense biological and evolutionary energy investment required to educate a single person.
With no legal mandate for tech firms to disclose resource use, independent scientists continue to study the true impact of data centers, which have been linked to rising electricity costs in some regions. Altman's core argument reframes the debate: the challenge isn't the efficiency of individual AI tasks, but securing enough clean energy to power the technology's future.
Source: TechCrunch