Sam Altman has pushed back against claims that ChatGPT consumes excessive energy, saying critics often make unfair comparisons.
Speaking on the sidelines of a major AI summit, he argued that “it also takes a lot of energy to train a human.”
During an interview with The Indian Express, Altman addressed concerns about the environmental cost of artificial intelligence.
He said it is unfair to compare the energy required to train an AI model with the energy used for a single human “inference query.” According to him, critics often ignore the years of nurturing, education, and resources required to develop human intelligence.
“It takes a lot of energy to train a human,” Altman remarked, adding that it takes around 20 years of life and all the food consumed during that time before someone becomes capable of independent thinking.
He further said that, in a broader sense, human learning is the result of thousands of years of evolution involving billions of people who learned to survive and develop scientific knowledge.
Water usage claims called “fake”
Altman also dismissed what he described as “totally insane” claims circulating online that OpenAI uses excessive amounts of water to power ChatGPT.
“Water is totally fake,” he said, referring to viral claims that each ChatGPT query consumes large amounts of water.
He explained that while evaporative cooling was previously used in data centers, it is no longer the standard practice. Online claims such as “Don’t use ChatGPT, it’s 17 gallons of water for each query” are inaccurate, he suggested.
How much energy does ChatGPT use?
In June, Altman wrote on X that the average ChatGPT query consumes about 0.34 watt-hours of electricity.
He compared this to the energy used by an oven in just over one second or a high-efficiency lightbulb running for a few minutes.
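The comparison is easy to sanity-check with back-of-envelope arithmetic. The sketch below assumes typical appliance wattages (roughly 1,200 W for an oven element and 10 W for an LED bulb); Altman's post gives only the 0.34 watt-hour figure, so the wattages here are illustrative assumptions.

```python
# Back-of-envelope check of the 0.34 Wh per-query figure.
# Appliance wattages are assumed, not from Altman's post.
QUERY_WH = 0.34       # reported average energy per ChatGPT query

OVEN_W = 1200         # assumed oven element draw, in watts
LED_BULB_W = 10       # assumed high-efficiency LED bulb, in watts

# Energy (Wh) = power (W) x time (h), so time = energy / power.
oven_seconds = QUERY_WH * 3600 / OVEN_W   # seconds of oven use
bulb_minutes = QUERY_WH * 60 / LED_BULB_W # minutes of bulb use

print(f"Oven equivalent: {oven_seconds:.1f} seconds")
print(f"LED bulb equivalent: {bulb_minutes:.1f} minutes")
```

With these assumed wattages, one query works out to about one second of oven use or roughly two minutes of LED lighting, consistent with the comparison Altman made.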
While defending ChatGPT’s efficiency, Altman acknowledged that it is fair to examine the AI industry’s overall energy consumption due to rapid growth in usage.
Push for alternative energy sources
Altman said that concerns over large-scale energy use are one reason why he and other AI industry leaders support alternative energy solutions, including solar, wind, and nuclear power.
Unlike Elon Musk, whose company xAI has explored ambitious ideas such as space-based data centers, Altman expressed skepticism about the feasibility of such concepts within the next decade.
Outside of OpenAI, Altman has invested heavily in nuclear energy. He previously served as chairman of Oklo and has backed Helion, which aims to build what it describes as the world’s first fusion power plant in Washington state.
In the United States, data center energy consumption is becoming a major policy issue.
Last month, Donald Trump said he was working with tech companies to ensure Americans do not face higher electricity bills due to nearby data centers.
Consulting firm McKinsey & Company previously estimated that data centers could account for 14% of total US power demand by 2050, highlighting the scale of the challenge ahead.