Comment by gruez
4 hours ago
Yeah, Kimi K2 apparently requires 2TB of VRAM to run[1], and it trails the proprietary models in terms of intelligence. There's no world where people are going to replace ChatGPT or Claude Code with a local model.