Comment by sdeframond
1 day ago
Such results are inherently limited because the same word can have different meanings depending on context.
The role of the attention layers in LLMs is to give each token a better embedding by accounting for that context. A rough sketch of this is below.
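As a minimal sketch (not from the comment above, and simplified to a single head with no positional encodings or masking), standard scaled dot-product self-attention makes each token's output a context-weighted mix of all tokens' value vectors, so the same word vector comes out differently in different sentences. All names and shapes here are illustrative assumptions:

    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        """X: (seq_len, d_model) token embeddings; W*: (d_model, d_head) projections."""
        Q, K, V = X @ Wq, X @ Wk, X @ Wv             # project each token to query/key/value
        scores = Q @ K.T / np.sqrt(K.shape[-1])      # pairwise token-to-token affinities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # softmax over context positions
        return weights @ V                           # each row is now a context-aware embedding

    # Toy demo: the same word embedding yields different outputs in different contexts,
    # because the attention weights depend on the other tokens in the sequence.
    rng = np.random.default_rng(0)
    d_model = d_head = 8
    Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
    word = rng.normal(size=d_model)                  # stand-in embedding for one word
    ctx_a = np.stack([word, rng.normal(size=d_model)])
    ctx_b = np.stack([word, rng.normal(size=d_model)])
    print(np.allclose(self_attention(ctx_a, Wq, Wk, Wv)[0],
                      self_attention(ctx_b, Wq, Wk, Wv)[0]))   # False: context changed it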