Comment by mentalgear

21 days ago

Every technology has a threshold where it's "good enough", and the same goes for LLMs and on-device SLMs. Machine learning already runs on-device for photo classification/search/tagging, and even 1.5B models are getting really good really fast, as long as they're well trained and used for the right task. Tasks like email writing, TTS, and rewriting should be easily doable (a rough sketch of local rewriting below). The "semantic search" aspect of chatbots is basically a new form of Google/web search and will probably stay in the cloud, but that's not their most crucial use.
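
To make the "rewriting on a small on-device model" point concrete, here's a minimal sketch of what that looks like. It assumes the Hugging Face transformers library and a ~1.5B instruct checkpoint (Qwen2.5-1.5B-Instruct is used here purely as an example); any similarly sized instruct model would work the same way, and on a phone you'd use a quantized runtime instead.

```python
# Minimal sketch: using a ~1.5B instruct model locally for a rewrite task.
# Assumes Hugging Face transformers; the model name is just an example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-1.5B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "Rewrite the user's text to be concise and polite."},
    {"role": "user", "content": "hey can u send me the report asap its late"},
]

# Turn the chat messages into the model's prompt format, then generate.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=128)

# Strip the prompt tokens and decode only the newly generated reply.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```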

I'm not a big fan of Apple's monopoly, but I do like their privacy-focused on-device handling. Apple aside, on-device models are definitely the way to go from a consumer point of view.