apitman, 6 months ago:
Is there a way to get LM Studio to talk to remote OpenAI API servers and cloud providers?

    pmarreck, 6 months ago:
    If you have a Mac, I recommend https://boltai.com/ for that.

    HumanOstrich, 6 months ago:
    LM Studio is for hosting/serving local LLMs. Its chat UI is secondary and pretty limited.

        apitman, 6 months ago:
        Good to know, thanks. What do people generally use to connect to it for chat?

            evgen, 6 months ago:
            OpenWebUI seems to be the standard. It's easy to spin up in a Docker container pointed at 127.0.0.1:1234/v1, and away you go.

            antimius, 6 months ago:
            Msty (msty.app). They're currently working on Msty Studio, which is only accessible to people with a license, but the desktop app is pretty good; it just doesn't have tool (MCP) support.
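evgen's suggestion can be sketched as a single `docker run` invocation — a hedged example, assuming LM Studio's OpenAI-compatible server is running on the host at its default port 1234 and using Open WebUI's `OPENAI_API_BASE_URL` environment variable (port mapping and volume name are the ones from Open WebUI's own quick-start; adjust as needed):

```shell
# Run Open WebUI in a container pointed at LM Studio's local server.
# Assumes LM Studio is serving on the host at its default port, 1234.
# host.docker.internal lets the container reach the host; the --add-host
# flag makes that name resolve on Linux (Docker Desktop adds it automatically).
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:1234/v1 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
# Then open http://localhost:3000 in a browser.
```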