Comment by raincole
20 days ago
If you reduce the energy consumption of training a new model by 25%, OpenAI will just buy more hardware and try to churn out new models 25% faster. Total consumption will stay exactly the same.
And they're 100% justified in doing so, until they hit another bottleneck (when there simply isn't that much Nvidia hardware left to buy, for example).
Not only that: every optimization gain makes training more attractive and creates even more demand, i.e. effective energy usage will not decrease or even stay the same - it will increase.
It's like with electric cars: making them more efficient doesn't mean less electricity gets used, but more, because they become more attractive and more people switch to them.