Comment by julien040
2 years ago
It's pure speculation, but article embeddings are computed using 512 tokens, which is roughly equivalent to 400 words. I think that using only one word doesn't give the model enough context to work with.
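
To illustrate the point, here is a minimal sketch using the sentence-transformers library; the model name, passage, and query are stand-ins, since the site's actual embedding setup isn't known. It compares how a full passage embeds against a single-word query in the same vector space:

```python
from sentence_transformers import SentenceTransformer, util

# Example model; the actual model used by the site is unknown.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical inputs: a multi-sentence passage vs. a one-word query.
passage = (
    "Database indexing speeds up queries by maintaining sorted "
    "structures over one or more columns, at the cost of extra "
    "writes and storage."
)
single_word = "indexing"

# Both inputs map into the same vector space, but the one-word query
# carries almost no surrounding context for the model to draw on.
emb_passage = model.encode(passage)
emb_word = model.encode(single_word)

print(util.cos_sim(emb_passage, emb_word))
```

The similarity score for the one-word query would likely be noisier than for a context-rich query, which is consistent with the speculation above.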