Comment by K0balt

18 days ago

I think the author is using "local-first" to mean "your files stay local, and the framework is compatible with on-prem infra." Aside from not storing your docs and data with a cloud service, though, it's entirely usable with cloud inference providers, so I can see your point.

Maybe the author should have spelled out that capability, even though it seems redundant: "local-first" implies local capability but also cloud compatibility. Otherwise the name would just be "local" or "local-only".

It's called "LocalGPT". It's a bad name.

  • Yeah, it’s not exactly great lol. It could reflect the vision behind the project though, from an aspirational standpoint. But yeah, the name kinda implies it’ll be something more like ollama or vLLM.