Comment by pembrook

5 hours ago

Obviously no. AI is nowhere near as ubiquitous as the microwave, so adoption is still scaling.

But as chips improve and the algorithms improve (e.g., a paper just came out about getting the same results with 90% less inference using a few algorithmic techniques, on top of the fact that we've already had multiple 90% efficiency jumps in AI), the energy use per prompt will drop over time.
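A rough back-of-envelope sketch of how those jumps compound, assuming a purely hypothetical baseline of 3 Wh per prompt and treating each "90% efficiency jump" as cutting energy per prompt to 10% of the previous value (both numbers are illustrative, not from any measurement):

```python
# Illustrative only: assumed baseline and reduction factor, not real measurements.
baseline_wh_per_prompt = 3.0   # hypothetical starting energy per prompt
reduction_per_jump = 0.10      # "90% less" => 10% of the prior energy remains

energy = baseline_wh_per_prompt
for jump in range(1, 4):       # three successive 90% efficiency jumps
    energy *= reduction_per_jump
    print(f"after jump {jump}: {energy:.4f} Wh per prompt")

# after jump 1: 0.3000 Wh per prompt
# after jump 2: 0.0300 Wh per prompt
# after jump 3: 0.0030 Wh per prompt
```

The point is just that the reductions multiply: a few successive 90% jumps take energy per prompt down by orders of magnitude, whatever the starting number actually is.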

Meanwhile, energy use per microwave minute will not meaningfully improve over time, so making the comparison is silly.

And pretending that the efficiency of AI will never improve, given that it runs on compute, which by definition constantly becomes more efficient, is dumb.