Comment by preommr
2 months ago
At the risk of being pedantic, it's not AI itself that requires massive resources; GPT-3 was trained for a few million dollars. The jump to trillions being table stakes happened because everyone started using free services and there was just too much money in the hands of these tech companies. Among other things.
There are so many chickens coming home to roost here; LLMs were just the catalyst.
> it's not AI that requires massive resources
No, it really is. If you took away training costs, OpenAI would be profitable.
When I was at Meta they were putting in something like 300k GPUs in a massive shared-memory cluster just for training. I think they are planning to triple that, if not more.
Yeah, for some reason AI energy use is wildly overreported. A ChatGPT query uses roughly two orders of magnitude less energy than toasting a slice of bread [1]. And you can eat bread untoasted, too, if you care about energy use.
[1]: https://epoch.ai/gradient-updates/how-much-energy-does-chatg...
How many slices of toast are you making a day?
If you fly a plane a millimeter, you're using less energy than making a slice of toast; would you also say that it's accurate that all global plane travel is more efficient than making toast?
1-2 slices a day and 1-50 ChatGPT queries per day. For me the two are within the same order of magnitude, and I don't really care about either, as both are dwarfed by my heater or aircon usage.
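For a rough sanity check of that comparison, here is a minimal sketch. It assumes the ~0.3 Wh-per-query figure from the epoch.ai article [1] and an assumed ~30 Wh per slice of toast (roughly a 1 kW toaster running 2 minutes); both numbers are ballpark assumptions, not measurements:

```python
# Rough daily energy comparison using assumed figures:
#   ~0.3 Wh per ChatGPT query (the epoch.ai estimate in [1])
#   ~30 Wh per slice of toast (assumed: ~1 kW toaster for ~2 minutes)
WH_PER_QUERY = 0.3    # assumption, from [1]
WH_PER_SLICE = 30.0   # assumption: 1000 W * (2/60) h

for queries, slices in [(1, 1), (50, 2)]:
    query_wh = queries * WH_PER_QUERY
    toast_wh = slices * WH_PER_SLICE
    print(f"{queries:>2} queries ~ {query_wh:5.1f} Wh  vs  "
          f"{slices} slices ~ {toast_wh:5.1f} Wh")

#  1 query  ~ 0.3 Wh vs 1 slice  ~ 30 Wh: two orders of magnitude apart.
# 50 queries ~ 15 Wh vs 2 slices ~ 60 Wh: the same order of magnitude.
```

Under those assumptions, a single query is ~100x cheaper than a slice of toast, while 50 queries and 2 slices land in the same order of magnitude, consistent with both comments above.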
From my estimation, each second of a GPT query eats about 0.5-1.5 watt-hours.
That would mean it draws 1800-5400 W continuously (0.5-1.5 Wh per second × 3600 s/h). Not sure where you are estimating that from.
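To see why that figure looks implausible, here is the unit conversion spelled out; this is pure arithmetic on the parent comment's own 0.5-1.5 Wh-per-second claim, with no other assumed data:

```python
# Sanity check: "each second of a GPT query eats about 0.5-1.5 Wh".
# 1 Wh = 3600 J, and power (W) = energy (J) / time (s), so spending
# 1 Wh every second is a continuous draw of 3600 W.
for wh_per_second in (0.5, 1.5):
    watts = wh_per_second * 3600  # Wh/s -> W
    print(f"{wh_per_second} Wh per second -> {watts:.0f} W continuous draw")

# 0.5 Wh/s -> 1800 W; 1.5 Wh/s -> 5400 W: somewhere between a toaster
# and an electric stove running flat out, for every second of a query.
```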