Comment by justinclift
2 months ago
Interesting. It sounds like using local LLMs (via vLLM, Ollama, etc.) with decent agentic capability might be starting to become a reality.
Next step, just need a shitload of vram. ;)
Maybe those Intel Battlematrix 48GB cards might be useful after all... :)
https://www.storagereview.com/review/intel-arc-pro-b60-battl...