Comment by ProllyInfamous
3 days ago
I literally just turned a fifteen-year-old MacPro5,1 into an Ollama terminal, using an ancient AMD Vega 56 GPU running Ubuntu 22... and it actually responds faster than I can type (which surprised me, considering the age of this machine).
No prior Linux experience, beyond basic macOS Terminal commands. Surprisingly simple setup... and I used an online LLM to hold my hand as we walked through the installation / setup. If I wanted to call the CLI, I'd have to ask an online LLM what that command even is (something something ollama3.2).
>ollama is probably the easiest tool ... to experiment with LLMs locally.
Seems quite simple so far. If I can do it (blue-collar electrician with no programming experience), then so can you.
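For readers curious what the setup the commenter describes actually looks like: the usual CLI invocation is `ollama run <model>`, and the same local Ollama server also exposes an HTTP API. Below is a minimal sketch of scripting it from Python, assuming Ollama's default endpoint (`localhost:11434`) and a model name like `llama3.2` already pulled; the function names here are illustrative, not part of Ollama itself.

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumption: default install, no config changes)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    # Send the prompt to the locally running Ollama server and return its reply
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled, e.g. `ollama pull llama3.2`
    print(ask("llama3.2", "Why is the sky blue?"))
```

The shell equivalent of the whole exchange is just `ollama run llama3.2 "Why is the sky blue?"` in a terminal, which matches the "responds faster than I can type" experience the commenter describes.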