Skidaddle 4 hours ago
But local models will never compete quality-wise with frontier models in a data center.
gruez 4 hours ago
Yeah, Kimi K2 apparently requires 2TB of VRAM to run [1], and trails the proprietary models in terms of intelligence. There's no world where people are going to replace ChatGPT or Claude Code with a local model.

[1] https://www.canirun.ai/model/kimi-k2
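For what it's worth, the 2TB figure lines up with a back-of-the-envelope weights-only estimate. Assuming K2 is on the order of a trillion parameters stored in fp16/bf16 (an assumption on my part, not something the linked page spells out), a quick sketch in Python:

    # Weights-only VRAM estimate for a large model; ignores KV cache
    # and activation overhead, which come on top of this.
    def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
        """GB needed just to hold the weights."""
        return params_billion * 1e9 * bytes_per_param / 1e9

    # ~1T parameters is an assumed figure for Kimi K2, not taken from [1].
    for precision, nbytes in [("fp16/bf16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
        print(f"{precision}: ~{weight_vram_gb(1000, nbytes):,.0f} GB")

    # fp16/bf16: ~2,000 GB  (roughly the 2TB cited)
    # int8:      ~1,000 GB
    # int4:      ~500 GB    (still far beyond a single consumer GPU)

So even aggressive quantization only gets you down to hundreds of gigabytes for a model of that size.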
baal80spam 4 hours ago
Never say never. Besides, they just need to be "good enough".