21/02/2026 · WORLD · 3 min read
OpenAI's Sam Altman Reveals Children Are Just Very Slow AI Models
Tech CEO solves the energy debate by pointing out humans also cost a fortune to train, take twenty years, and still can't find the butter.
Sam Altman has settled the great energy debate by explaining that training a human also uses quite a lot of electricity, food and patience, plus the human sometimes refuses to learn.
Speaking at an event in India, the OpenAI chief noted that people “take about 20 years of life, plus all the food you eat during that time, before you get smart.” The audience nodded. Several parents in the room nodded harder.
The Maths
The comparison is, on paper, compelling:
- One large AI model: several months of training, tens of millions of dollars, a regional power grid’s worth of electricity, and it can summarise the entirety of human knowledge in seconds.
- One human child: approximately 7,300 days of training, roughly 30,000 meals, 14,000 school hours, a minimum of six existential crises about GCSEs, and it still texts “where’s the butter” while standing three feet from the fridge.
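For readers who enjoy checking satire against a calculator, the child's numbers roughly hold up. A quick sketch, using the article's own figures plus one labelled assumption (meals per day is our guess, not Altman's):

```python
# The child's "training budget", back-of-the-envelope.
# All inputs come from the article except meals_per_day, which is assumed.

years = 20
days = round(years * 365.25)     # ≈ 7,305 "days of training"

meals_per_day = 4                # assumption: three meals plus one fridge raid
meals = days * meals_per_day     # ≈ 29,220 meals, close to the article's 30,000

# ~190 school days a year, ~6 hours a day, ~12 years of compulsory schooling
school_hours = 190 * 6 * 12      # ≈ 13,680, near the article's 14,000

print(f"{days:,} days, {meals:,} meals, {school_hours:,} school hours")
```

The figures survive scrutiny, which is more than can be said for the dishwasher loading.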
Industry analysts confirmed that on a pure cost-per-correct-answer basis, the child is “not competitive,” though it does offer certain features the model lacks, such as occasionally saying something that makes you cry at a wedding.
Silicon Valley Takes Notes
Venture capitalists responded enthusiastically. One investor was overheard telling colleagues that “we’ve been thinking about this all wrong,” before pitching a startup that replaces the first eighteen years of childhood with a fine-tuning phase.
A leaked internal memo at one major tech firm reportedly explored whether newborns could be “pre-trained on a larger corpus,” though HR flagged the proposal before it left the ideas channel on Slack.
The Parenting Community Responds
The British public received the news with characteristic calm.
One mother from Swindon told reporters: “If my son is a slow AI model, I’d like to file a bug report. He’s been in training for sixteen years and he still loads the dishwasher like he’s never seen a plate.”
A father from Leeds added: “Twenty years and all the food you eat? Try thirty-four years. He’s back from uni. The model is still in beta.”
The Department for Education issued a short statement clarifying that children are “not, technically, language models,” though it conceded that several Year 9 students do appear to operate on pure pattern-matching with no underlying comprehension.
Energy Comparison: A Closer Look
Experts pointed out one key difference Altman omitted: a human brain runs on roughly 20 watts — about the same as a dim light bulb — while a large training run can consume tens of megawatts for months.
“The biological brain is absurdly efficient,” said one Cambridge neuroscientist. “The problem is that it spends most of its capacity deciding what to have for lunch and replaying embarrassing moments from 2011.”
The AI model, by contrast, uses a colossal amount of energy but can be copied infinitely once trained. The human cannot be copied, though several families at half-term wished otherwise.
What Happens Next
Altman is expected to release a follow-up statement clarifying that the comparison was meant to be illustrative, not literal, and that he does not view children as deprecated hardware.
Meanwhile, schools across England have begun referring to assemblies as “alignment sessions” and lunch breaks as “energy replenishment cycles,” largely because it sounds better on the Ofsted report.
Parents, for their part, remain unimpressed. As one put it: “You spend twenty years training them, a fortune feeding them, and they still ring you to ask how to boil an egg. At least ChatGPT knows how to boil an egg. It just can’t eat one.”