Comment by alasr
19 hours ago
> What exactly is the difference between lms and llmsterm?
With `lms`, LM Studio's frontend GUI/desktop application and its backend LLM API server (which serves the OpenAI-compatible API endpoints) are tightly coupled: closing the GUI/desktop application also stops the backend API server.
With llmsterm, the two are decoupled: as the LM Studio announcement puts it, you can "deploy on servers, deploy in CI, deploy anywhere", i.e. in environments where a GUI/desktop application doesn't make sense.
But like, llmsterm still results in using the `lms` command, right? Or am I misreading the docs?
I think you're reading the docs correctly: you still use `lms server [command]` to manage an LM Studio (LMS) server.
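For reference, a rough sketch of what that looks like on a headless box, going by my reading of the `lms` CLI docs (exact flags and output may differ by version):

```sh
# Start the LM Studio API server without the desktop GUI
lms server start

# Check whether the server is up
lms server status

# Stop it when you're done
lms server stop
```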