Comment by pshirshov
6 hours ago
Even tried gemma4:31b, and gemma4:31b with a 128k context (I have 72 GiB of VRAM). Nothing. I'm cursed, I guess. That's ollama-rocm, if that matters (I had weird bugs on Vulkan; maybe Gemma misbehaves on Radeons somehow?..).
UPD: tried ollama-vulkan. It works: gemma4:31b-it-q8_0 with a 64k context!
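
For anyone trying to reproduce this: Ollama's default context window is much smaller than 64k, so it has to be raised explicitly. One way is a custom Modelfile with the `num_ctx` parameter. This is a minimal sketch using the model tag and context size from the comment above; the tag on your machine may differ (check `ollama list`), and the derived model name `gemma4-64k` is just an illustrative choice.

```
# Modelfile — sketch only; assumes the gemma4:31b-it-q8_0 tag from the comment
FROM gemma4:31b-it-q8_0
# 64k context, as reported working on Vulkan above
PARAMETER num_ctx 65536
```

Then build and run the derived model with `ollama create gemma4-64k -f Modelfile` followed by `ollama run gemma4-64k`.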