Comment by thaumasiotes

8 days ago

> When I talk about the chain "seeing" a sequence, I mean that the sequence existed in the material that was used to generate the probability table.

> My instinct is to believe that you know this, but are being needlessly pedantic.

> My point is that if you're using a context length of two, if you prompt a Markov chain with "my cat", but the sequence "my cat was" never appeared in the training material, then the chain will never choose "was" as the next word.

What you claim here is false. You're making extremely strong, and unwarranted, assumptions about how the probability table "must" be generated. Nothing in the definition of a Markov chain requires that the table contain only transitions observed in the training material; smoothed models, for example, routinely assign nonzero probability to continuations that never occurred in the data.
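To make the point concrete, here is a hypothetical sketch (not anyone's actual implementation) of a context-length-2 Markov chain whose probability table uses add-one (Laplace) smoothing. Even though "was" never follows "my cat" in the training text, the chain assigns it a nonzero probability:

```python
import random
from collections import Counter, defaultdict

# Toy training text; note "my cat was" never appears in it.
training = "my cat sat on the mat and my cat ran".split()
# Vocabulary can include words beyond the observed continuations.
vocab = sorted(set(training) | {"was"})

# Count observed (context of 2 words -> next word) transitions.
counts = defaultdict(Counter)
for a, b, c in zip(training, training[1:], training[2:]):
    counts[(a, b)][c] += 1

def prob(context, word, alpha=1.0):
    """Add-one smoothed transition probability: every vocabulary
    word gets pseudo-count alpha, so unseen continuations are
    possible (just less likely than observed ones)."""
    c = counts[context]
    return (c[word] + alpha) / (sum(c.values()) + alpha * len(vocab))

def sample_next(context):
    """Sample the next word from the smoothed distribution."""
    weights = [prob(context, w) for w in vocab]
    return random.choices(vocab, weights=weights)[0]

p = prob(("my", "cat"), "was")  # nonzero despite never being observed
```

Whether smoothing is a good idea is a separate question, but it shows the table is not forced to be zero on unseen sequences.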