replete 2 hours ago
Run server with ollama, use the Continue extension configured for ollama.
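Concretely, the Continue extension just talks to ollama's local HTTP API. A minimal sketch of that request from Python, assuming ollama is serving on its default port 11434 and that "llama3" stands in for whatever model you have actually pulled:

    import json
    import urllib.request

    # Assumes a local ollama server on its default port (11434); "llama3"
    # is a placeholder for whatever model you have already pulled.
    url = "http://localhost:11434/api/generate"
    payload = {
        "model": "llama3",
        "prompt": "Say hello in one sentence.",
        "stream": False,  # request one JSON object rather than a stream
    }

    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])  # the generated text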
BoredomIsFun 1 hour ago
I'd stay away from ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.
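For the llama.cpp route, its bundled server speaks an OpenAI-compatible API, so the same kind of local call works. A sketch assuming you started it with something like llama-server -m your-model.gguf --port 8080 (the model path and port here are placeholders for your own setup):

    import json
    import urllib.request

    # Assumes llama.cpp's llama-server is running locally on port 8080,
    # e.g. started with: llama-server -m your-model.gguf --port 8080
    url = "http://localhost:8080/v1/chat/completions"
    payload = {
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    }

    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])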