
Comment by apitman

4 days ago

Is there a way to get LM Studio to talk to remote OpenAI API servers and cloud providers?

LM Studio is for hosting/serving local LLMs. Its chat UI is secondary and is pretty limited.

  • Good to know, thanks. What do people generally use to connect to it for chat?

    • OpenWebUI seems to be the standard. Easy to spin up in a Docker container pointed at 127.0.0.1:1234/v1 and away you go.

    • Msty (msty.app). Currently they're working on Msty Studio, which is only accessible to people with a license, but the desktop app is pretty good; it just doesn't have tool (MCP) support.

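For the OpenWebUI route above, a minimal sketch of the Docker setup might look like the following. This assumes LM Studio's local server is running on the host at port 1234 (its default) and uses Open WebUI's `OPENAI_API_BASE_URL` environment variable; note that from inside the container, `127.0.0.1` refers to the container itself, so the host must be reached via `host.docker.internal`.

```shell
# Run Open WebUI, pointing its OpenAI-compatible backend at LM Studio on the host.
# --add-host maps host.docker.internal to the Docker host gateway (needed on Linux;
# Docker Desktop on macOS/Windows provides it automatically).
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:1234/v1 \
  ghcr.io/open-webui/open-webui:main
# Then browse to http://localhost:3000 and chat against the locally served model.
```

Port 3000 and the image tag are illustrative defaults; adjust to taste.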