Comment by simonw
10 months ago
LM Studio and Ollama are both very low-complexity ways to get local LLMs running on a Mac.
As a Python person I've found uv + MLX to be pretty painless on a Mac too.
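For anyone curious what the uv + MLX route looks like, the `mlx-lm` package on PyPI ships a `mlx_lm.generate` command, and `uv run --with` can pull it into a throwaway environment. A minimal sketch (requires an Apple-silicon Mac; the model name shown is one of the mlx-community quantized models and is just an illustrative choice):

```
# Run a quantized local model in one command, no manual venv needed.
# uv resolves and caches mlx-lm on first use; the model downloads
# from Hugging Face on first run.
uv run --with mlx-lm mlx_lm.generate \
  --model mlx-community/Llama-3.2-3B-Instruct-4bit \
  --prompt "Write a haiku about local LLMs"
```

The appeal is that nothing is installed globally: uv manages the ephemeral environment and MLX handles the Apple-silicon inference.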