Comment by Solvency
2 years ago
Out of curiosity related to the word vectorization algorithm... why does one word not perform as well? What's the cause/rationale?
It's pure speculation, but the article embeddings are computed using 512 tokens, which is roughly equivalent to 400 words. I suspect a single word simply doesn't give the model enough context to produce a meaningful embedding.
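To make that intuition concrete, here's a minimal sketch using the sentence-transformers library; the model choice (all-MiniLM-L6-v2) and the example sentences are arbitrary assumptions, not necessarily what the article used. The idea is that a bare word is ambiguous, while a short passage lets the encoder attend over surrounding context:

```python
# Hypothetical illustration with sentence-transformers; any transformer
# encoder would show the same effect.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# A single word gives the model almost nothing to attend over...
single_word = "bank"
# ...while a short passage disambiguates the intended sense.
passage = "I deposited my paycheck at the bank on Main Street."

emb_word = model.encode(single_word)
emb_passage = model.encode(passage)

# Compare both against a finance-related query: the passage embedding
# should score noticeably closer than the ambiguous bare word.
query = model.encode("Where can I open a savings account?")
print("word vs query:   ", util.cos_sim(emb_word, query).item())
print("passage vs query:", util.cos_sim(emb_passage, query).item())
```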