
Comment by nosecreek

8 days ago

Related question: what is everyone using to run a local LLM? I'm using Jan.ai and it's been okay. I also see OpenWebUI mentioned quite often.


Havoc  8 days ago

LM Studio if you just want an app. Open WebUI is just a front end - you'd need either llama.cpp or vLLM behind it to serve the model.
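(To make the front end/back end split concrete: llama.cpp's llama-server and vLLM both expose an OpenAI-compatible HTTP API, and a front end like Open WebUI just points at that endpoint. Below is a minimal Python sketch of talking to such a local endpoint directly; the port, base URL, and model name are assumptions and depend on how your backend is actually launched.)

    # Minimal sketch: query a locally served model over the OpenAI-compatible
    # API that llama.cpp's llama-server and vLLM both expose. A front end like
    # Open WebUI would be configured with this same base URL.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8080/v1",  # assumed llama-server default port; vLLM typically uses 8000
        api_key="not-needed-locally",         # local servers usually ignore the key, but the client requires one
    )

    response = client.chat.completions.create(
        model="local-model",  # placeholder name; vLLM expects the name of the model it is serving
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(response.choices[0].message.content)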

op00to  8 days ago

LM Studio, and sometimes AnythingLLM.

fennecfoxy  8 days ago

KoboldCPP + SillyTavern has worked the best for me.
