Comment by icedchai

19 days ago

> Ok. I don’t think hosting a capable open model is seriously a realistic option for the vast majority of consumers.

Full LLM, no. Not yet.

But there are new things like sweep [0] that you can now run locally.

And 2-3 years ago, capable open models weren’t even a thing. We’ve since made real progress on that front, and I believe they’ll keep improving (in both accessibility and competence).

[0]: https://news.ycombinator.com/item?id=46713106