Comment by storus
1 day ago
QKV attention is just a probabilistic lookup table, where the Q, K, and V projections let you adjust the input/output dimensions to fit your NN block. If your Q perfectly matches some known K (from training), you get back essentially that K's V; otherwise you get a linear combination of all the Vs, weighted by the attention scores.
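As a rough illustration of the "weighted combination of Vs" point, here is a minimal numpy sketch of standard scaled dot-product attention, softmax(QK^T / sqrt(d)) V. The variable names and toy data are my own, not from the thread, and the point is only to show the mechanism: a query similar to one key pulls the output toward that key's value, while an in-between query yields a blend.

```python
import numpy as np

def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # similarity of each query to each key
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)               # softmax over keys
    return w @ V                                     # weighted combination of values

# Toy "stored" keys/values (hypothetical example data).
K = np.array([[1.0, 0.0], [0.0, 1.0]])
V = np.array([[10.0, 0.0], [0.0, 10.0]])
# First query matches the first key, so its output leans toward V[0];
# the second query sits between the keys, so its output is a blend of both Vs.
Q = np.array([[1.0, 0.0], [0.5, 0.5]])
print(attention(Q, K, V))
```

Note that even an exact Q/K match gives a softmax-weighted mix rather than exactly V[0], unless the scores are very peaked, so the "exact V" part of the claim is at best approximate.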
It's not; please read the thread above.