Comment by leroman
1 year ago
I thought they were using some kind of vector space search, like embeddings... no idea if that's the case
1 year ago
Yeah, I'm pretty sure you could do this just with classic word embeddings (king = queen + man - woman). Maybe it doesn't work as well as with a full LLM.
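For what it's worth, this is easy to try yourself. A minimal sketch, assuming gensim is installed and can fetch one of its small pretrained models (the model name here is just an example, and the result isn't guaranteed):

    # Classic word-embedding analogy arithmetic: add/subtract word vectors,
    # then look up the nearest vocabulary words by cosine similarity.
    import gensim.downloader as api

    vectors = api.load("glove-wiki-gigaword-50")  # example model from gensim's downloader

    # king ≈ queen + man - woman
    print(vectors.most_similar(positive=["queen", "man"], negative=["woman"], topn=3))
    # "king" usually shows up near the top, though that's not guaranteed for every model.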
Addition won't work for things that depend on the order of operations. If salt + water is ocean and water + fire is steam, what's salt + water + fire? Is it salt + steam or ocean + fire?
Associativity and commutativity in vector addition don't translate well to semantic meaning. Extrapolating your example, it'd also mean:

queen = king - man + woman
man = king - queen + woman
woman = queen + man - king
I don't see why those should all be true. Intuitively, trying to satisfy O(N^2) semantic pairings with vectors that are optimised for a very specific and different numerical operation (cosine similarity) feels like something that won't work. I'd imagine errors get amplified with 3+ operands.
Isn't the reason for the lack of associativity/commutativity that you're doing operations (addition/subtraction) that have those properties, and then snapping the result to the closest of a fixed number of points in your output dictionary? The addition is fine; the loss of information is in the final conversion.
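Roughly what I mean, as a toy sketch with made-up random vectors rather than real embeddings: the addition itself is order-independent, but snapping each intermediate result to the nearest dictionary entry can make the final answer depend on the grouping.

    import numpy as np

    rng = np.random.default_rng(0)
    names = ["salt", "water", "fire", "ocean", "steam"]
    dictionary = {name: rng.normal(size=8) for name in names}  # toy stand-ins, not real embeddings

    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    def nearest(v):
        # Snap a vector to the closest dictionary entry by cosine similarity.
        return max(dictionary, key=lambda name: cos(dictionary[name], v))

    salt, water, fire = (dictionary[n] for n in ("salt", "water", "fire"))

    # Pure addition commutes and associates (up to float error): same vector either way.
    print(np.allclose((salt + water) + fire, salt + (water + fire)))  # True

    # With snapping after each combination, the grouping can change the outcome.
    left = nearest(dictionary[nearest(salt + water)] + fire)   # (salt + water) first
    right = nearest(salt + dictionary[nearest(water + fire)])  # (water + fire) first
    print(left, right)  # these can differ; the information loss is in the snapping step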