Comment by godelski
11 hours ago
> they’re mapped to vectors of real numbers
Yes, I'm in agreement here. But you need to tell me how
a - a + a = b
Use whatever the fuck you want for a. A vector (e.g. [1,2,3]), a number (e.g. 1), an embedding (e.g. [[1,2,3],[4,5,6]]), words (e.g. "man"), I really don't give a damn. You have to tell me why b is a reasonable answer to that equation. You have to tell me how a==b while also a!=b.
Because I expect the usual addition to be
a - a + a = a
This is the last time I'm going to say this to you.
You're telling me I'm lost in abstraction and I'm telling you it's not usual addition because a != b. That's it! That's the whole fucking argument. You literally cannot see the contradiction right in front of you. The only way it is usual addition is if you tell me "man == woman", because that is literally the example from several comments ago. Stop being so smart and just read the damn comment.
a - a + a = b when a and b map to the same vector (or, in practice, to vectors extremely close together). Your assumptions about invertibility etc. don't hold in this world; embeddings are just a bunch of empirically learned coordinates in a dense space.
So an example: a maps to [1,2,3] and b also maps to [1,2,3]. Again, in practice b could map to [1,2,3.0001] or something.
To summarize: king, man, etc. aren't symbols, they get mapped to vectors. + is element-wise addition. = is "equal to or very close in multi-dimensional space".
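That loose notion of "=" can be sketched in a few lines of Python. The vectors here are made up for illustration (not real learned embeddings), but they show how a - a + a equals a exactly under element-wise arithmetic while also being "equal" to a different word b whose vector happens to sit right next to a:

```python
# Toy embeddings with made-up coordinates; real embeddings are learned,
# but the arithmetic works the same way.
emb = {
    "a": [1.0, 2.0, 3.0],
    "b": [1.0, 2.0, 3.0001],  # a different word, nearly the same vector
}

def add(u, v):
    # "+" means plain element-wise addition
    return [x + y for x, y in zip(u, v)]

def sub(u, v):
    return [x - y for x, y in zip(u, v)]

def close(u, v, tol=1e-2):
    # "=" in the embedding sense: Euclidean distance below some tolerance
    dist = sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5
    return dist < tol

result = add(sub(emb["a"], emb["a"]), emb["a"])  # a - a + a

print(close(result, emb["a"]))  # True: result is exactly a, element-wise
print(close(result, emb["b"]))  # True: b is also "equal" under this notion
```

So both sides of the argument are consistent: the addition really is ordinary element-wise addition, and the only unusual part is that "=" is fuzzy nearest-neighbor equality, not symbolic identity.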
Maybe tone down the attitude. You clearly aren't in this field. The properties you've assumed to be true simply don't hold here. People in AI/ML use terms and conventions differently than you assume: when someone says "vector addition" they really do mean plain element-wise addition in practically every case. You are the fool here.