Comment by Workaccount2
1 year ago
People never talk about Gemini, and frankly its output is often the weakest of the SOTA models, but its 2M-token context window is insane.
You can drop a few textbooks into the context window before you start asking questions. This dramatically improves output quality, though inference does take much, much longer at large context lengths.
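For what it's worth, here is a minimal sketch of that workflow using Google's google-generativeai Python SDK: prepend the reference documents to the prompt, then ask the question. The file names and the exact model name ("gemini-1.5-pro") are assumptions for illustration, not a definitive recipe.

    # Sketch: long-context prompting by prepending documents to the question.
    # Assumes the google-generativeai SDK and an API key from Google AI Studio.
    import google.generativeai as genai

    genai.configure(api_key="YOUR_API_KEY")

    # A long-context Gemini model (1.5 Pro was advertised with up to 2M tokens).
    model = genai.GenerativeModel("gemini-1.5-pro")

    # Hypothetical local textbooks, loaded as plain text.
    textbook_paths = ["thermodynamics.txt", "statistical_mechanics.txt"]
    textbooks = [open(p, encoding="utf-8").read() for p in textbook_paths]

    # Put the reference material before the question so the answer can be
    # grounded in it. Expect noticeably slower responses at this length.
    question = "Using the material above, derive the Carnot efficiency."
    response = model.generate_content(textbooks + [question])
    print(response.text)

The design choice here is simply "stuff everything into context" rather than retrieval: no chunking or vector search, at the cost of slower and pricier inference per query.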