Comment by gettingoverit
2 days ago
Okay, this will be a bit on the conspiracy theory side, but there was a paper recently describing how to do matrix computations on a RAM stick connected to an FPGA, and it showed this can be cheaper per FLOP than GPUs. And, of course, there are plenty of RAM producers.
So either this capability is being artificially held back to keep GPU prices where they are, or someone has already started building their RAM-based AI datacenter.