Comment by api
3 days ago
Hmm... in that case the analogy with AI is even better. This sounds like neural networks before deep learning and the transformer architecture, before we figured out how to scale them. It turns out that did require some innovations; it wasn't just a matter of making a bigger model.