Comment by AureliusMA
3 hours ago
I remember when a large context was 8k! Nowadays that would seem extremely small, because we have new use-cases that require much larger context sizes. Maybe in the future we will invent ways to run inference on very large contexts that we cannot even imagine today.