Comment by petterroea
16 days ago
Even a 25% reduction in resource usage will probably not be enough, AI datacenters are still a huge resource sink after all
If you reduce energy consumption of training a new model by 25%, OpenAI will just buy more hardware and try to churn out a new model 25% faster. The total consumption will be exactly the same.
And they're 100% justified to do so, until they hit another bottleneck (when there is literally not that much Nvidia hardware to buy, for example.)
Not only that: every optimization gain makes the technology more attractive and creates even more demand, i.e. effective energy usage won't decrease or stay the same - it will increase.
It's like with electric cars: making them more efficient doesn't mean less electricity gets used, but more, because they become more attractive and more people switch to them.
There's no gain to be had there at all. Any optimizations that reduce resource usage per output will be gobbled up by just making more output.
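The arithmetic behind this rebound argument can be sketched with a toy model (all numbers and the elasticity parameter are purely illustrative assumptions, not real AI-datacenter figures): if demand responds to unit cost with elasticity 1, a 25% efficiency gain leaves total consumption exactly flat, as claimed above; with elasticity above 1, total consumption actually rises.

```python
# Toy rebound-effect (Jevons paradox) sketch. Every number here is a
# hypothetical assumption for illustration only.

def total_energy(energy_per_unit, base_demand, base_cost, elasticity):
    """Total energy when demand grows as unit cost falls,
    with a constant price elasticity of demand."""
    cost = base_cost * energy_per_unit  # assume cost tracks energy use
    demand = base_demand * (base_cost / cost) ** elasticity
    return demand * energy_per_unit

# Baseline: 100 units of demand at 1.0 energy each.
before = total_energy(1.00, base_demand=100, base_cost=1.0, elasticity=1.5)
# After a 25% efficiency gain, with elastic demand (elasticity > 1):
after = total_energy(0.75, base_demand=100, base_cost=1.0, elasticity=1.5)

print(before)  # 100.0
print(after)   # ~115.5 - total energy use went *up* despite the gain

# With elasticity exactly 1, the totals match: the "exactly the same" case.
flat = total_energy(0.75, base_demand=100, base_cost=1.0, elasticity=1.0)
print(flat)    # 100.0
```

The elasticity parameter is the whole story here: whether an efficiency gain reduces, preserves, or increases total consumption depends entirely on how strongly demand responds to the lower cost.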
OpenAI released an open source model only because their growth is currently capped by the amount of hardware they have. Improve resource efficiency and you'd better believe they'll just crank up use of those resources until they're capped again.
It'll basically be the same treadmill as the "just one more lane" fallacy that DOTs keep falling for.
I imagine there's a lot more to be gained than that via algorithmic improvements. But at least in the short term, the more you cut costs (and prices), the more usage will increase.