
Comment by jasonjmcghee

3 months ago

Depends on quantization. 109B parameters at 4-bit quantization would be ~55GB of RAM for the weights in theory, plus the overhead of the KV cache, which for even modest context windows could push the total to 90GB or so.

Curious to hear other input here. A bit out of touch with recent advancements in context window / KV cache RAM usage.
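
For reference, a rough back-of-envelope sketch of that estimate in Python. The layer count, KV head count, and head dimension below are illustrative assumptions rather than the architecture of any particular 109B model, and the KV cache is assumed to be kept in 16-bit precision.

    # Back-of-envelope memory estimate for a 109B-parameter model.
    # Architecture numbers (n_layers, n_kv_heads, head_dim) are assumptions
    # for illustration only, not taken from any specific model card.

    def param_memory_gb(n_params: float, bits_per_weight: float) -> float:
        """Weight memory: parameters * bits per weight, converted to GB."""
        return n_params * bits_per_weight / 8 / 1e9

    def kv_cache_gb(context_len: int,
                    n_layers: int = 48,       # assumed
                    n_kv_heads: int = 8,      # assumed (grouped-query attention)
                    head_dim: int = 128,      # assumed
                    bytes_per_elem: int = 2   # fp16/bf16 cache
                    ) -> float:
        """KV cache: 2 (K and V) * layers * kv_heads * head_dim * tokens."""
        return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem / 1e9

    weights = param_memory_gb(109e9, 4)        # ~54.5 GB at 4-bit
    cache = kv_cache_gb(context_len=128_000)   # grows linearly with context length
    print(f"weights ~{weights:.1f} GB, KV cache ~{cache:.1f} GB, "
          f"total ~{weights + cache:.1f} GB")

Under these assumptions a 128K-token context adds roughly 25GB of cache on top of the ~55GB of weights, which is how the total can climb toward the 80 to 90GB range.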

