Comment by bonsai_spool
3 months ago
You miss that we already have "context" when we begin reading something, and that probably enables our fast reading. Maybe there's a way to give that background-setting information to an LLM, but then we could also just have it read the entire input stream.