
Comment by rand0mwalk

3 months ago

Tokens start out as special [MASK] tokens. Then, as the diffusion process runs, they are "unmasked," i.e. sampled.

So yes, you define a sequence of [MASK] tokens with some length ahead of time.

In practice, if the model wants to write a shorter sequence, it'll just fill the remaining positions with empty content. If it wants to write a longer one, you have to detect this and extend the sequence with more [MASK] tokens. Detecting it is typically easy: if the model wants to keep generating, no "end of sequence" token will be present in the output.
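To make the loop concrete, here's a minimal sketch of that decoding procedure. Everything here is a stand-in I made up for illustration: `toy_model` just proposes random tokens with random confidences (a real diffusion LM would run a transformer over the whole sequence), and the names `unmask_step`, `generate`, `extend_by`, etc. are hypothetical, not from any particular library. The structure is the point: start from an all-[MASK] canvas, unmask a few positions per step, trim at [EOS] if one appears, and append more [MASK] tokens when none does.

```python
import random

MASK, EOS = "[MASK]", "[EOS]"
VOCAB = ["the", "cat", "sat", EOS]  # toy vocabulary for illustration

def toy_model(seq):
    # Hypothetical stand-in for the diffusion model: for each masked
    # position, propose a token and a confidence score.
    return {i: (random.choice(VOCAB), random.random())
            for i, t in enumerate(seq) if t == MASK}

def unmask_step(seq, k=2):
    # Unmask the k highest-confidence masked positions this step.
    proposals = toy_model(seq)
    ranked = sorted(proposals.items(), key=lambda kv: -kv[1][1])
    for i, (tok, _conf) in ranked[:k]:
        seq[i] = tok
    return seq

def generate(length=8, extend_by=4, max_extensions=10):
    # Fixed-length canvas of [MASK] tokens, defined ahead of time.
    seq = [MASK] * length
    extensions = 0
    while True:
        while MASK in seq:          # run diffusion until fully unmasked
            unmask_step(seq)
        if EOS in seq or extensions >= max_extensions:
            break
        # No [EOS] anywhere: the model "wants" to generate more,
        # so grow the canvas with fresh [MASK] tokens and continue.
        seq += [MASK] * extend_by
        extensions += 1
    # Shorter-than-canvas outputs: drop everything from [EOS] onward.
    return seq[:seq.index(EOS)] if EOS in seq else seq
```

The `max_extensions` cap is just a safety valve for the toy example; in practice you'd bound growth by your context length.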