Modern artificial intelligence now consumes more electricity than many developed countries. At the India AI Impact Summit in February 2026, OpenAI CEO Sam Altman reframed the conversation about AI energy efficiency by comparing silicon chips to living things. “Humans use a lot of energy too,” Altman said, pointing out that it takes 20 years of life and a great deal of food to “train” a human mind.
Some see this as mere rhetoric, but the comparison raises a key question: What is the real cost of intelligence at a time when power grids are under severe strain? Microsoft, Google, and Meta are all buying private nuclear reactors to cover their “compute energy” needs, and the resulting race for resources has become a major bottleneck. Pricing a single query requires looking at both the human brain and the data center.
The Biological Benchmark: 20 Watts vs. Gigawatt-Hours
Sam Altman’s recent comments at the Express Adda event in New Delhi challenged the conventional framing of AI’s impact. “It’s never fair to compare the energy it takes to train an AI model to the energy it takes for a person to do one query,” Altman said, arguing that a “fair” comparison should include the 20 years of energy a person spends raising a child.
But this comparison glosses over an enormous physical gap. The human brain runs on about 20 watts, roughly the draw of a dim LED bulb.

Training a frontier model like GPT-4, by contrast, is estimated to have consumed more than 50 gigawatt-hours (GWh). For perspective, 50 GWh is enough energy to power about 3,000 human “training cycles” of 20 years each. AI models can process more information than any human, but biology remains the undisputed efficiency champion.
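The back-of-envelope arithmetic behind that comparison can be checked directly. A minimal sketch, assuming round figures that are not from any official source: roughly 100 W for whole-body human metabolism (versus 20 W for the brain alone), the widely cited 50 GWh training estimate, and 20-year “training cycles”:

```python
# Back-of-envelope: GPT-4-scale training vs. human "training cycles".
# Assumed round numbers (illustrative, not official figures):
#   20 W for the brain alone, ~100 W whole-body metabolism, 50 GWh per run.

HOURS_PER_YEAR = 365 * 24      # 8,760 hours
TRAINING_YEARS = 20

def lifetime_energy_gwh(watts: float) -> float:
    """Energy used over a 20-year human 'training cycle', in GWh."""
    watt_hours = watts * TRAINING_YEARS * HOURS_PER_YEAR
    return watt_hours / 1e9    # watt-hours -> gigawatt-hours

TRAINING_GWH = 50.0
body_cycles = TRAINING_GWH / lifetime_energy_gwh(100)   # whole body
brain_cycles = TRAINING_GWH / lifetime_energy_gwh(20)   # brain only

print(f"Whole-body lifetimes per training run: {body_cycles:,.0f}")
print(f"Brain-only lifetimes per training run: {brain_cycles:,.0f}")
```

Whole-body metabolism yields roughly 2,900 lifetimes, consistent with the “about 3,000” figure; counting only the 20-watt brain, the gap widens to roughly 14,000×.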
What is the real cost of energy for an AI query?
Depending on model size, a single generative AI prompt currently consumes between 0.3 and 1.2 watt-hours of electricity. That sounds trivial, roughly equivalent to running an LED bulb for six minutes, but the aggregate effect of billions of daily queries is what drives global demand.
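The six-minute LED equivalence, and how small per-query numbers add up, can be sketched with assumed figures (a 10 W LED bulb, 1 Wh per prompt from the upper-middle of the stated range, and an illustrative 1 billion queries per day, none of which are official statistics):

```python
# Per-query energy vs. an LED bulb, and the aggregate effect at scale.
# Assumed numbers: 1.0 Wh per prompt (within the 0.3-1.2 Wh range),
# a 10 W LED bulb, and an illustrative 1 billion queries per day.

QUERY_WH = 1.0                  # energy per prompt, watt-hours
LED_WATTS = 10.0                # typical LED bulb draw

led_minutes = QUERY_WH / LED_WATTS * 60    # minutes of LED runtime
daily_gwh = 1e9 * QUERY_WH / 1e9           # 1B queries -> GWh per day
annual_gwh = daily_gwh * 365

print(f"One prompt ~ an LED bulb running for {led_minutes:.0f} minutes")
print(f"1B prompts/day ~ {daily_gwh:.0f} GWh/day, {annual_gwh:,.0f} GWh/year")
```

At these assumptions, a billion daily prompts add up to hundreds of GWh per year, which is why the per-query figure alone understates the grid impact.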
The Green-to-Base Load Shift: Why AI Needs Nuclear Power
The surge in energy demand is pushing the tech industry away from intermittent solar and wind power. To sustain 99.999% uptime, data centers need “base load” power that keeps electrons flowing around the clock, and that requirement has triggered a “Nuclear Renaissance” among hyperscalers.
In late 2024, Microsoft signed a 20-year deal with Constellation Energy to restart the Three Mile Island nuclear plant. The facility, renamed the Crane Clean Energy Center, is expected to come online by 2027. The deal marks a shift toward private, “behind-the-meter” energy sources that bypass the public grid.
- Microsoft: Restarting the Crane Clean Energy Center at Three Mile Island.
- Google: Investing in Small Modular Reactors (SMRs) for around-the-clock carbon-free energy.
- Meta: Getting 1,100 MW of nuclear power from the Clinton, Illinois, plant.
The Intelligence ROI: Is AI Energy Worth the Money?
The core of the “Return on Intelligence” (ROI) argument is that AI will pay back its energy debt. Proponents claim AI will discover better materials or even crack commercial nuclear fusion. If a model consumes a terawatt-hour but makes the global grid 10% more efficient, the environment comes out ahead overall.
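The ROI claim can be quantified in sketch form. The figures below are assumptions for illustration, not from the article: roughly 30,000 TWh of annual global electricity generation, a hypothetical model consuming 1 TWh, and a 10% grid-efficiency gain:

```python
# "Return on Intelligence" break-even sketch.
# Assumed round numbers: global electricity generation ~30,000 TWh/year,
# a hypothetical model consuming 1 TWh, and a 10% efficiency gain.

GLOBAL_TWH = 30_000.0      # assumed annual global generation
MODEL_TWH = 1.0            # hypothetical model consumption
EFFICIENCY_GAIN = 0.10     # claimed grid-wide improvement

saved_twh = GLOBAL_TWH * EFFICIENCY_GAIN
payback_ratio = saved_twh / MODEL_TWH

print(f"Energy saved: {saved_twh:,.0f} TWh/year")
print(f"Payback ratio: {payback_ratio:,.0f}x the model's consumption")
```

Under these assumptions the energy saved dwarfs the energy spent by three orders of magnitude, which is the whole-grid leverage the ROI argument depends on.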
Economists counter with the Jevons Paradox: when technology makes a resource more efficient to use, total consumption rises rather than falls, because we find more ways to use it. History suggests that if AI makes electricity cheaper, we will build even bigger data centers, creating a self-reinforcing cycle of demand.
FAQ: How much energy and resources does AI use?
Does one ChatGPT query really use 17 gallons of water?
Sam Altman has called this figure “completely untrue” and “totally insane.” Older data centers used evaporative cooling, but modern facilities rely on closed-loop liquid cooling or “direct-to-chip” methods that consume far less water.
Is AI already as efficient as a person?
Some studies suggest AI is approaching human efficiency at inference (answering a single question). For learning and general reasoning, however, the 20-watt human brain still far outperforms current GPU clusters.
How do data centers affect my power bill?
Rising data-center demand can push up local electricity prices. In some regions, such as Northern Virginia and Dublin, Ireland, data-center load has delayed grid connections for homes and small businesses.
The Final Verdict
As 2030 approaches, the real constraint on AI will not be human creativity but the limited supply of electrons. The era of unlimited scaling is running into the hard wall of thermodynamics.