Comment by linuxftw
21 hours ago
I think the story is less about the GPUs themselves, and more about the interconnects for building massive GPU clusters. Nvidia just announced a massive switch for linking GPUs inside a rack. So the next couple of generations of GPU clusters will be capable of things that were previously impossible or impractical.
This doesn't mean much for inference, but for training, it is going to be huge.
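To make the bandwidth point concrete, here's a minimal data-parallel sketch (assuming PyTorch with the NCCL backend; the function and variable names are illustrative, not from any particular codebase). Every training step ends with an all-reduce of the full gradient across GPUs, so the bytes moved per step scale with model size, and faster GPU-to-GPU links like a rack-scale NVLink switch shorten exactly that sync. Inference has no equivalent per-step gradient exchange, which is the gap the comment is pointing at.

    import os
    import torch
    import torch.distributed as dist

    def train_step(model, batch, loss_fn, optimizer):
        # Each rank computes its own forward/backward on its local shard of data.
        loss = loss_fn(model(batch["x"]), batch["y"])
        loss.backward()
        # Gradient sync: one all-reduce per parameter tensor. The traffic here,
        # every single step, is what rack-scale interconnects are built to carry.
        world_size = dist.get_world_size()
        for p in model.parameters():
            if p.grad is not None:
                dist.all_reduce(p.grad, op=dist.ReduceOp.SUM)
                p.grad.div_(world_size)  # average the summed gradients
        optimizer.step()
        optimizer.zero_grad()
        return loss

    if __name__ == "__main__":
        # Launched with torchrun, e.g.: torchrun --nproc_per_node=8 train.py
        dist.init_process_group(backend="nccl")
        torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))

In practice you'd let torch.nn.parallel.DistributedDataParallel overlap these all-reduces with the backward pass, but the communication volume is the same either way, so interconnect bandwidth still sets the ceiling on how large a cluster scales efficiently.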