Comment by backscratches
18 days ago
It is not local first. Local is not the primary use case. The name is misleading to the point I almost didn't click because I do not run local models.
I think the author is using local-first as in “your files stay local, and the framework is compatible with on-prem infra”. Aside from not storing your docs and data with a cloud service though, it’s very usable with cloud inference providers, so I can see your point.
Maybe the author should have spelled that capability out, even though it seems redundant: "local-first" implies local capability but also cloud compatibility, otherwise it would just be "local" or "local-only".
It's called "LocalGPT". It's a bad name.
Yeah, it’s not exactly a great name lol. It could reflect the vision behind the project, from an aspirational standpoint. But yeah, it kinda implies it will be something more like ollama or vLLM.