
Comment by yen223

16 hours ago

"It is fundamental to language modeling that every sequence of tokens is possible."

This isn't true, is it? LLMs have a finite number of parameters and a finite context length, so surely the pigeonhole principle means you can't map that onto the infinite set of possible output strings out there.

No, it's not literally true; it's a mental model. I've added some clarification at the bottom of the comment.
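There is also a narrower sense in which the quoted claim does hold: "every sequence is possible" usually means every finite token sequence gets nonzero probability, not that the model can represent arbitrarily many distinct distributions. Assuming a softmax output layer (standard for autoregressive LLMs), a minimal sketch (all names here are mine, not from the thread):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    # exp() of a finite float is strictly positive, so every probability
    # in the output is > 0, no matter how small the logit.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for a tiny 5-token vocabulary; even a very
# negative logit still maps to a tiny but positive probability.
probs = softmax([10.0, 2.0, -3.0, -50.0, -200.0])
assert all(p > 0 for p in probs)

# By the chain rule, P(sequence) is a product of per-step token
# probabilities, so any finite sequence has probability > 0.
seq_logprob = sum(math.log(p) for p in probs)
```

So the pigeonhole objection is right that a finite model can't realize every possible distribution over strings, but "possible" in the quote is about support (nonzero probability), which softmax guarantees for every finite sequence.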