Comment by replete (1 month ago): Run a server with ollama, and use the Continue extension configured for ollama.

3 comments
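For anyone trying replete's setup, a minimal sketch of what the Continue side might look like — the model name and title here are assumptions, not something from the thread:

```json
{
  "models": [
    {
      "title": "Llama 3 via ollama (example name, adjust to your installed model)",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```

With ollama running locally (`ollama serve`), Continue talks to it on its default port; see Continue's own docs for where this config file lives on your platform.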
BoredomIsFun (1 month ago): I'd stay away from ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.

mika6996 (1 month ago): But you can't just switch between installed models like in ollama, can you?

BoredomIsFun (1 month ago): llama-swap? https://www.nijho.lt/post/llama-nixos/
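To make the llama-swap suggestion concrete: it is a small proxy that starts and stops llama.cpp server processes on demand, so requesting a different model name swaps which one is loaded. A rough sketch of its YAML config — the model names and GGUF paths below are placeholders, not from the thread:

```yaml
# llama-swap config sketch; ${PORT} is filled in by llama-swap at launch
models:
  "llama3":
    cmd: llama-server --port ${PORT} -m ./models/llama-3.Q4_K_M.gguf
  "qwen":
    cmd: llama-server --port ${PORT} -m ./models/qwen2.5.Q4_K_M.gguf
```

A client then requests `"model": "llama3"` or `"model": "qwen"` through llama-swap's OpenAI-compatible endpoint, and it handles loading the right process — which is roughly the switch-between-models convenience ollama gives you.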