apitman 4 days ago
Is there a way to get LM Studio to talk to remote OpenAI API servers and cloud providers?

pmarreck 4 days ago
If you have a Mac, I recommend https://boltai.com/ for that.

HumanOstrich 4 days ago
LM Studio is for hosting/serving local LLMs. Its chat UI is secondary and pretty limited.

apitman 4 days ago
Good to know, thanks. What do people generally use to connect to it for chat?

evgen 3 days ago
OpenWebUI seems to be the standard. It's easy to spin up in a Docker container pointed at 127.0.0.1:1234/v1, and away you go.

antimius 3 days ago
Msty (msty.app). They're currently working on Msty Studio, which is only accessible to people with a license, but the desktop app is pretty good; it just doesn't have tool (MCP) support.
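evgen's OpenWebUI-in-Docker setup can be sketched as a single `docker run`. This is only a sketch: the image name, port mapping, data volume, and `OPENAI_API_BASE_URL` variable follow Open WebUI's documented defaults and may differ on your version, and `host.docker.internal` is used because `127.0.0.1` inside the container would not reach LM Studio's server running on the host.

```shell
# Assumes LM Studio's local server is already running on the host at port 1234.
# Open WebUI then appears at http://localhost:3000 and proxies chat requests
# to LM Studio's OpenAI-compatible endpoint.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:1234/v1 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

On Linux the simpler alternative is `--network=host`, in which case `127.0.0.1:1234/v1` works directly as the base URL.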