Comment by ajdude
4 days ago
As someone who has been wanting to try out local LLMs on my Mac but didn't want to go through the complicated setup for Ollama, this seems to lower the bar just enough for me to try it out. It looks like it's all self-contained and does everything for me, so I can just get to the part where I'm actually working with LLMs.
This might be the push I needed to finally give them a try.
Reply: If you're interested in a no-config setup for local LLMs, give https://recurse.chat/ a try (I'm the dev). The app is designed to be self-contained and as simple as you can imagine.
Reply: Ollama is literally a signed .app; you don't even need to use GitHub to install it. You may want to try it again if you haven't recently.