Comment by tom_0

3 days ago

Oh, I can't believe I missed that! That makes whisper.cpp and llama.cpp valid options if the user has an Nvidia GPU, thanks.

whisper.cpp and llama.cpp also work with Vulkan, so they aren't limited to Nvidia hardware.
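For anyone who wants to try the Vulkan path, both projects expose it through the same ggml CMake flag. A rough sketch (assumes the Vulkan SDK and CMake are installed; repo URLs reflect the current ggml-org organization):

```shell
# llama.cpp with the Vulkan backend enabled
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# whisper.cpp uses the same ggml flag
cd ..
git clone https://github.com/ggml-org/whisper.cpp
cd whisper.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j
```

The resulting binaries should pick up any Vulkan-capable GPU (AMD, Intel, or Nvidia), which is the main appeal over the CUDA-only build.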

  • Yeah, I researched this and I absolutely missed this whole part. In my defense, I looked into this in 2023, which is ages ago :) Looks like local models have matured a lot since then.