hnfong 14 hours ago Yes, but make sure you grab the latest llama.cpp release. New model archs usually involve code changes.
sowbug 1 hour ago If you're running Ollama, you'll have to wait a little longer for its embedded version of llama.cpp to catch up. It can be a couple days or weeks behind.
cpburns2009 14 hours ago Awesome! It looks like the llama.cpp-hip AUR was updated today to b8179, and it works.