Comment by redcobra762

2 years ago

There's got to be a probability cut-off, though. LLMs don't infinitely connect every token with every other token; some aren't connected at all, even if some association is learned, right?

The weights have finite precision, which means they effectively represent ranges of values; in other words, they carry error bars. So even if a weight is exactly 0, that does not represent complete confidence that the association never occurs.
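To make that concrete, here is a minimal NumPy sketch of my own (the `softmax` helper, the four-token vocabulary, and the logit values are all assumed for illustration, not taken from any actual model): a softmax assigns a strictly positive probability to every token with a finite logit, and floating-point precision only sets the floor at which those probabilities underflow to an exact 0.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    # Numerically stable softmax: subtract the max before exponentiating.
    # The output is strictly positive for any *finite* logits.
    shifted = logits - logits.max()
    exps = np.exp(shifted)
    return exps / exps.sum()

# Hypothetical logits for a tiny 4-token vocabulary: one logit is
# exactly 0, one is strongly negative.
logits = np.array([5.0, 0.0, -20.0, -100.0])
probs = softmax(logits)
print(probs)  # ~[9.93e-01, 6.69e-03, 1.38e-11, 2.46e-46]

# Every entry is > 0: even the -100 logit gets a tiny but nonzero
# probability, so no token is ever ruled out with complete confidence.
# Finite precision sets the floor: in float32, anything below the
# smallest subnormal (~1.4e-45) underflows to an exact 0, which is a
# representation limit, not a learned "this can never occur".
```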