
Comment by buyucu

6 months ago

The work involved is tiny compared to the work llama.cpp did to get Vulkan up and running.

This is not rocket science.

This sounds like it should be trivial to reproduce and extend - I look forward to trying out your repo!

  • The owner of that PR has already forked ollama. Try it out. I did, and it works great.

    • I guess git and GitHub are working as intended then.

      This is not a sarcastic comment. I'm genuinely happy that this was the outcome.