Comment by rhdunn
8 days ago
It is feasible to run 7B and 8B models with q6_0 in 8GB of VRAM, or with q5_k_m/q4_k_m if you need or want to free up some VRAM for other things. With q4_k_m you can run 10B and even 12B models.
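For a rough sense of why those combinations fit, here is a back-of-envelope sketch (my own, not from the comment): it assumes approximate bits-per-weight figures for each quant type and a flat overhead term for the KV cache and runtime buffers, so treat the numbers as ballpark estimates only.

```python
# Back-of-envelope VRAM estimate for a quantized model (illustrative only).
# The bits-per-weight figures below are approximations for these quant types,
# and the flat overhead term (KV cache, runtime buffers) is an assumption.
BITS_PER_WEIGHT = {"q6_0": 6.5, "q5_k_m": 5.5, "q4_k_m": 4.85}

def estimate_vram_gb(params_billions: float, quant: str, overhead_gb: float = 0.7) -> float:
    """Approximate VRAM in GB: weights at the quant's bits/weight plus a flat overhead."""
    # params_billions * 1e9 params * bits / 8 bits-per-byte / 1e9 bytes-per-GB;
    # the 1e9 factors cancel, leaving params_billions * bpw / 8.
    weight_gb = params_billions * BITS_PER_WEIGHT[quant] / 8
    return weight_gb + overhead_gb

if __name__ == "__main__":
    for params, quant in [(8, "q6_0"), (12, "q4_k_m")]:
        print(f"{params}B @ {quant}: ~{estimate_vram_gb(params, quant):.1f} GB")
```

Under those assumptions an 8B model at q6_0 comes out around 7.2 GB, and a 12B model at q4_k_m lands right at the edge of 8 GB, which matches the "even 12B" framing above.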