Comment by alcasa
20 hours ago
Really cool, especially once the 256k context size becomes available.
I think higher performance will be a key differentiator in AI tool quality from a user perspective, especially in use cases where model quality is already good enough for human-in-the-loop usage.
Contribute on Hacker News ↗