Comment by AndrewKemendo

17 hours ago

Self host your own LLM

Why do you think this would be less discoverable than hosting your own email server?

  • If you use a stateless client (like just rawdogging CLI llama.cpp), there’s nothing to discover. Configuring a program that has a logging option not to log could conceivably get you in trouble, but using a widely used program that never kept logs in the first place seems like it has to be fine. Maybe they could nail you for googling “which local llm approach generates logs?” Also, don’t get nailed by your bash history!
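
A minimal bash sketch of the history point above. The `llama-cli` invocation is stood in by an `echo` so the snippet runs anywhere; the model path and prompt are illustrative, not from the original comment:

```shell
#!/usr/bin/env bash
# Two standard bash mechanisms for keeping a command out of history.

# 1) Don't write a history file for this session at all.
unset HISTFILE

# 2) With HISTCONTROL=ignorespace, any command typed with a leading
#    space is skipped by history. (Matters in interactive shells.)
HISTCONTROL=ignorespace

# Leading space below: in an interactive shell this line would not be
# recorded. Stand-in for something like:
#   llama-cli -m ./models/model.gguf -p "prompt"
 echo "stand-in for a local llama.cpp run"
```

Both tricks are plain bash behavior, independent of which local LLM binary you run.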