Comment by inferiorhuman

5 days ago

  Necessarily, LLM output that works isn't gibberish.

Hardly. Poorly conjured-up code can still work.

"Gibberish" code is necessary code which doesn't work. Even in the broader use of the term: https://en.wikipedia.org/wiki/Gibberish

Especially in this context: if a mystery box solves a problem for me, I can look at the solution and learn something from it, cf. how paper was inspired by watching wasps at work.

Even the abject failures can be interesting, though I find them more helpful for forcing me to make my writing easier to understand.