Comment by nosecreek

8 days ago

Related question: what is everyone using to run a local LLM? I'm using Jan.ai and it's been okay. I also see OpenWebUI mentioned quite often.

LM Studio if you just want an app. Open WebUI is just a front end; you'd need either llama.cpp or vLLM behind it to serve the model.
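For reference, a minimal sketch of that setup: serve a GGUF model with llama.cpp's OpenAI-compatible server, then point Open WebUI at it. The model path and ports here are placeholders, and the exact flags/env vars may differ by version, so check the docs for your release.

```shell
# Start llama.cpp's built-in server (exposes an OpenAI-compatible API on :8080)
# "model.gguf" is a placeholder for whatever model file you downloaded
llama-server -m ./model.gguf --host 0.0.0.0 --port 8080

# Run Open WebUI in Docker and point it at the llama.cpp endpoint
# (host.docker.internal lets the container reach the host's port 8080)
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:8080/v1 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 in a browser. vLLM works the same way as a backend since it also exposes an OpenAI-compatible API.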