
Comment by repelsteeltje

4 hours ago

Interesting take. Hadn't thought of it in terms of entropy, but it's true. Almost by definition, since the training process doesn't introduce anything novel beyond the scraped inputs and a randomly initialized network. From there, the stochastic generation only adds randomness (plus the prompt, of course).

Generally I think this is a legitimate issue, although:

> the training process doesn't introduce anything novel

This is not always the case. A compiler, linter, proof checker, tests, etc. can all lower entropy by filtering out invalid candidates.
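
To make the filtering point concrete, here's a minimal sketch (all candidate snippets and the `passes_test` gate are illustrative, not from any real pipeline): if you treat generation as a uniform distribution over candidate outputs, gating them through a compiler-like check shrinks the support, and the Shannon entropy of the surviving distribution drops.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical pool of model-generated candidates (contents are made up).
candidates = [
    "def f(): return 1",
    "def f(): return 2",
    "def f(): retrun 1",   # syntax error: the gate should reject this
    "def f(): return None",
]

def passes_test(snippet):
    """Stand-in for a compiler/linter gate: 'valid' here means it parses."""
    try:
        compile(snippet, "<candidate>", "exec")
        return True
    except SyntaxError:
        return False

# Uniform distribution over all candidates vs. over the survivors.
before = [1 / len(candidates)] * len(candidates)
survivors = [c for c in candidates if passes_test(c)]
after = [1 / len(survivors)] * len(survivors)

print(entropy(before))  # 2.0 bits (4 equally likely candidates)
print(entropy(after))   # ~1.585 bits (3 survivors): the gate lowered entropy
```

The same argument applies to a test suite or proof checker: any deterministic accept/reject filter can only shrink (or preserve) the set of possible outputs, so it never adds entropy and usually removes some.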