
Comment by ssalka

1 day ago

Personally, I would not run LM Studio anywhere outside my local network, as it still doesn't support adding an SSL cert. I guess you can just layer a proxy server on top of it, but for something that's meant to be easy to set up, this seems like a quick win, and I don't see any reason not to build support for it.

https://github.com/lmstudio-ai/lmstudio-bug-tracker/issues/1...

Adding Caddy as a proxy server is literally one line in a Caddyfile, and I trust Caddy to get TLS right once more than I trust every other random project to add SSL themselves.
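As a sketch of that (assuming LM Studio's API server is on its default port 1234, with lmstudio.example.com as a placeholder domain that resolves to the host), Caddy fetches the Let's Encrypt cert on its own:

    # Caddyfile: terminate TLS and proxy to the local LM Studio server
    lmstudio.example.com {
        reverse_proxy localhost:1234
    }

The ad-hoc CLI form is roughly `caddy reverse-proxy --from lmstudio.example.com --to localhost:1234`.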

Because adding caddy/nginx/apache + letsencrypt is a couple of bash commands between install and setup, and those HTTP servers + TLS termination are going to be 100x better than anything LMS adds itself, since TLS isn't their core competency.
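A rough sketch of those bash commands on Debian/Ubuntu (domain and port are placeholders; assumes DNS already points at the box and an nginx server block proxies to LM Studio):

    # install nginx plus the certbot nginx plugin
    sudo apt install nginx python3-certbot-nginx

    # in the server block for lmstudio.example.com, proxy to LM Studio, e.g.
    #   proxy_pass http://127.0.0.1:1234;
    # then have certbot obtain the cert and enable TLS for that server block
    sudo certbot --nginx -d lmstudio.example.com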

If you're running your apps on Kubernetes, the standard Ingress supports certs. For small applications, Cloudflare TLS on the free tier is dead simple.
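For the Kubernetes case, the cert bit is just the tls section of a standard Ingress. A minimal sketch, assuming cert-manager with a ClusterIssuer named letsencrypt and a Service called lmstudio in front of the model server (all names hypothetical):

    # sketch only: assumes cert-manager is installed and DNS points at the ingress
    apiVersion: networking.k8s.io/v1
    kind: Ingress
    metadata:
      name: lmstudio
      annotations:
        cert-manager.io/cluster-issuer: letsencrypt
    spec:
      tls:
        - hosts: [lmstudio.example.com]
          secretName: lmstudio-tls
      rules:
        - host: lmstudio.example.com
          http:
            paths:
              - path: /
                pathType: Prefix
                backend:
                  service:
                    name: lmstudio
                    port:
                      number: 1234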

  • Cloudflare Tunnels make this about as easy as it gets. They also make it easy to ensure only you have access, either through sign-in or OTPs (rough sketch after this comment).

    You don’t want some random person to find your LMStudio service and then point their Opencode at it.
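    A rough cloudflared sketch (tunnel name and hostname are hypothetical); the sign-in/OTP part is a Cloudflare Access policy attached to the hostname in the Zero Trust dashboard, not a CLI flag:

        # one-time setup: authenticate and create a named tunnel
        cloudflared tunnel login
        cloudflared tunnel create lmstudio
        cloudflared tunnel route dns lmstudio lmstudio.example.com

        # run it, pointing at the local LM Studio server
        cloudflared tunnel run --url http://localhost:1234 lmstudio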

tbh I would prefer that an application not do this, and allow me the choice and control of putting a proxy in front of it.

An analogy could be car infotainment systems: don't give me your half-baked, shitty infotainment, I have CarPlay, let me use it.