Comment by hadlock
9 months ago
With 16 GB you can comfortably run a 12B model that's been quantized. Quantizing is (an imperfect analogy, but) effectively lossy compression: you store each weight in fewer bits, trading a little accuracy for a much smaller memory footprint.
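A rough back-of-the-envelope sketch of why this works: weight memory scales linearly with bits per weight, so halving the precision halves the footprint. The numbers below are illustrative weight-only estimates (real inference adds KV-cache and runtime overhead).

```python
def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB: params * bits / 8 bytes each."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 12B model at common precisions:
for bits in (16, 8, 4):
    print(f"12B at {bits}-bit: ~{model_size_gb(12, bits):.0f} GB")
# 16-bit: ~24 GB (won't fit in 16 GB)
# 8-bit:  ~12 GB (fits)
# 4-bit:  ~6 GB  (fits comfortably)
```

So an 8-bit or 4-bit quantized 12B model fits in 16 GB, while the full 16-bit weights would not.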