Comment by xmcqdpt2
1 month ago
You can understand how transformers work from just reading the Attention is All You Need paper, which is 15 pages of pretty accessible DL. That's not the part that is impressive about LLMs.