Comment by swyx
4 days ago
finally, what took so long lmao
If I'm being honest, I care more about multiple local AI apps on my desktop all hooking into the same Ollama instance, rather than each app downloading its own models so that I end up with tens of GBs of repeated weights all over the place because the apps don't talk to each other.
What does it take for THAT to finally happen?
I haven't used a local model in a while, but Ollama was the only one I've seen convert models into a different format (I think for deduplication). You should be able to, say, download a GGUF file and point a bunch of frontends at that same file.
This is something we are working on. I don't have a specific timeline since it's done when it's done, but it is being worked on.
That's already possible via the Ollama API. It's up to the applications themselves to support it (and plenty do).
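To make that concrete, here's a minimal sketch of two frontends sharing one Ollama daemon through its HTTP API. This assumes the standard Ollama endpoint on the default port 11434; the two "apps" and the model name are made up for illustration, and the requests are only built here, not sent.

```python
# Sketch: two hypothetical frontends targeting the SAME Ollama daemon,
# so the model weights exist once on disk instead of once per app.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama listen address


def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST /api/generate request against the shared daemon."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Both "apps" point at the same endpoint and model; neither downloads
# its own copy of the weights. (Send with urllib.request.urlopen(req).)
req_app_one = build_generate_request("llama3", "Summarize this note.")
req_app_two = build_generate_request("llama3", "Translate this sentence.")
```

Any app that lets you configure the Ollama base URL can be pointed at this one instance; that's the application-side support mentioned above.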
And this move by Ollama is going in exactly the wrong direction.
It's finally the push I need to move away. I predict Ollama will only get worse from here on.
I, too, dream of this.
symlinks
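A rough sketch of the symlink approach: keep one real copy of the GGUF weights and point each app's model directory at it. The paths and app names here are illustrative, not fixed conventions, and this only works for apps that load a plain GGUF file without verifying or re-downloading it.

```shell
# One real copy of the weights (path is an example, not a convention).
WEIGHTS="$HOME/models/llama3-8b.Q4_K_M.gguf"

# Each app's model directory gets a symlink instead of a duplicate file.
mkdir -p "$HOME/.app-one/models" "$HOME/.app-two/models"
ln -sf "$WEIGHTS" "$HOME/.app-one/models/llama3-8b.Q4_K_M.gguf"
ln -sf "$WEIGHTS" "$HOME/.app-two/models/llama3-8b.Q4_K_M.gguf"
```

Caveat: apps that manage their own downloads or checksum their model store may happily overwrite or ignore the links.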