Comment by kragen 10 days ago

That's really exciting! Can you run DeepSeek locally?

CuriouslyC 10 days ago

Maybe if you quantize the everliving hell out of it, but I wouldn't. Depending on how much RAM you have, I'd run Qwen Coder or Gemma 3, or if you can go bigger (a huge workstation with a research-class GPU) I'd go with gpt-oss-120b or GLM Air.
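If you do go local, the usual setup is llama.cpp (or a wrapper like Ollama) with a quantized GGUF. A minimal sketch with llama-cpp-python, assuming you've already downloaded a quant; the model filename below is just a placeholder, not a specific recommendation:

    # Minimal sketch: run a quantized local model via llama-cpp-python.
    # Assumes `pip install llama-cpp-python` and a downloaded GGUF file;
    # the model path here is a placeholder.
    from llama_cpp import Llama

    llm = Llama(
        model_path="qwen2.5-coder-32b-instruct-Q4_K_M.gguf",  # placeholder path
        n_ctx=8192,       # context window; shrink if you're short on RAM
        n_gpu_layers=-1,  # offload all layers to the GPU if you have one
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Write a function that reverses a linked list."}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])

The main knobs are the quant level (Q4/Q5 variants are the usual RAM-vs-quality tradeoff) and n_gpu_layers; everything else works the same whichever of those models you pick.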