Comment by api

10 months ago

I'm really impressed, and very interested, to see models I can run on my MacBook Pro start to generate results close to large hosted "frontier" models, and do so with what I assume are far fewer parameters.

I wonder how far this can go?

It's been a solid trend for the last two years: I've not upgraded my laptop in that time, and the quality of results I'm getting from local models on that same machine has continued to rise.

My hunch is that there's still some optimization fruit left to be harvested, but I expect we may be nearing a plateau. I may have to upgrade from 64GB of RAM this year.

  • Seeing diffusion language models mature and get better will be interesting. They can be much, much faster on less hardware.