Comment by hdivider
1 year ago
I still find it remarkable how we need such an extreme amount of electrical energy to power large modern AI models.
Compare with one human brain. Far more sophisticated, even beyond our knowledge. What does it take to power it for a day? Some vegetables and rice. Still fine for a while if you supply pure junk food -- it'll still perform.
Clearly we have a long, long way to go in terms of the energy efficiency of AI approaches. Our so-called neural nets come nowhere near the energy efficiency of actual biological neurons.
Food is extremely dense in energy. One food calorie (kcal) is about 1.16 watt-hours. A hamburger is about 490 Wh. An AI model requires 0.047 kWh = 47 Wh to generate 1000 text responses.[1] If an LLM could convert hamburgers to energy, it could generate over 10,000 prompt completions on a single hamburger.
Based on my own experience, I would struggle to generate that much text without fries and a drink.
[1] https://www.theverge.com/24066646/ai-electricity-energy-watt...
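A quick back-of-the-envelope sketch of that arithmetic (the 0.047 kWh per 1000 responses figure is from [1]; the ~420 kcal hamburger is my own assumption):

    # Rough check of the hamburger-to-completions math above (all values are estimates).
    KCAL_TO_WH = 1.163          # 1 food calorie (kcal) is about 1.163 watt-hours
    HAMBURGER_KCAL = 420        # assumed calorie count for a typical hamburger
    WH_PER_1000_RESPONSES = 47  # 0.047 kWh per 1000 text responses, per [1]

    hamburger_wh = HAMBURGER_KCAL * KCAL_TO_WH                # ~490 Wh
    completions = hamburger_wh / WH_PER_1000_RESPONSES * 1000

    print(f"Hamburger energy: {hamburger_wh:.0f} Wh")         # ~488 Wh
    print(f"Completions per hamburger: {completions:.0f}")    # ~10,400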
During that time, though, your brain would be doing far more than just generating that text -- more than we can even fully characterize scientifically.
But yes, food energy could be useful for AI. A little dystopian potentially too, if you think about it. Like DARPA's EATR robot, able to run on plant biomass (although potentially animal biomass too, including human remains):
https://en.wikipedia.org/wiki/Energetically_Autonomous_Tacti...
My point is that AI is more energy-efficient than a human doing the same language-generation task.
It's even less than that! A lot of those vegetables and rice go into powering your heart, muscles, and other organs; only a fraction (roughly 20%) is used by the brain.
Maybe the future of AI is in organic neurons?
This is more likely to be a hardware issue than an algorithms issue. The brain physically is a neural network, as opposed to a software simulation of one.