Comment by simonw
10 months ago
It's been a solid trend for the last two years: I've not upgraded my laptop in that time, and the quality of results I'm getting from local models on that same machine has continued to rise.
My hunch is that there's still some optimization fruit left to be harvested, but I suspect we may be nearing a plateau. I may have to upgrade from 64GB of RAM this year.
It will be interesting to see diffusion language models mature and improve. They can be much, much faster on less hardware.