Comment by teleforce
10 months ago
If LLMs can leverage the new efficient attention mechanism based on the FFT architecture discovered by Google, then FPGAs could be the new hot stuff [1]:
[1] The FFT Strikes Back: An Efficient Alternative to Self-Attention (168 comments):
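For context, the FFT-based token mixing this line of work builds on (e.g. Google's FNet, 2021) replaces the O(n²) self-attention matrix with an O(n log n) Fourier transform across the sequence. A minimal NumPy sketch of that idea (not the exact method of the linked paper):

```python
import numpy as np

def fft_token_mixing(x):
    # FNet-style mixing: apply a 2D FFT over the sequence and
    # hidden dimensions, then keep the real part.
    # Cost is O(n log n) in sequence length, vs O(n^2) for
    # standard self-attention.
    return np.fft.fft2(x).real

# Toy input: a sequence of 8 tokens with 16-dim embeddings.
seq_len, d_model = 8, 16
x = np.random.randn(seq_len, d_model)
y = fft_token_mixing(x)
# Output shape matches the input, so it can drop in where an
# attention sublayer would sit.
assert y.shape == (seq_len, d_model)
```

The FFT's regular butterfly dataflow is also what makes this kind of layer attractive for FPGA pipelines, which is the comment's point.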