Comment by sdeframond
2 days ago
Such results are inherently limited because the same word can have different meanings depending on context.
The role of the attention layer in LLMs is to give each token a better embedding by accounting for that context.
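Roughly, something like this (a minimal single-head self-attention sketch in NumPy with toy sizes and random weights, just to illustrate the idea, not any particular model's implementation):

```python
# Minimal single-head self-attention sketch (NumPy).
# Shows how each token's embedding gets re-weighted by the other tokens,
# so the same word can end up with different vectors in different contexts.
# All sizes and weights here are toy/illustrative, not real model parameters.
import numpy as np

rng = np.random.default_rng(0)

seq_len, d_model = 4, 8                    # 4 tokens, 8-dim embeddings (toy)
x = rng.normal(size=(seq_len, d_model))    # context-free token embeddings

# Learned projections (random here) map embeddings to queries, keys, values.
W_q = rng.normal(size=(d_model, d_model))
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))

Q, K, V = x @ W_q, x @ W_k, x @ W_v

# Scaled dot-product attention: each token scores every token in the context...
scores = Q @ K.T / np.sqrt(d_model)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the context

# ...and its new embedding is a context-weighted mix of the value vectors.
contextual = weights @ V
print(contextual.shape)   # (4, 8): one context-aware vector per token
```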