Comment by teleforce
5 days ago
If LLMs can leverage the new efficient FFT-based attention mechanism discovered by Google, then FPGAs could be the new hot stuff [1]:
[1] The FFT Strikes Back: An Efficient Alternative to Self-Attention
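The core idea behind such FFT-based mixing can be sketched roughly as follows (a toy FNet-style illustration, not the linked paper's exact method; the function name and shapes are my own assumptions): token mixing via a Fourier transform over the sequence costs O(n log n) instead of self-attention's O(n^2).

```python
import numpy as np

def fft_mix(x: np.ndarray) -> np.ndarray:
    # Hypothetical sketch: mix tokens with a 2D FFT over the sequence
    # and feature axes, keeping only the real part (as in FNet-style
    # mixing); no learned attention weights are involved.
    return np.fft.fft2(x).real

rng = np.random.default_rng(0)
x = rng.standard_normal((128, 64))  # (seq_len, d_model)
y = fft_mix(x)
print(y.shape)  # (128, 64)
```

The FFT's regular butterfly dataflow is exactly the kind of structure FPGAs pipeline well, which is presumably the commenter's point.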