
Comment by cortesoft

9 hours ago

LLMs are just translating text into output, too, and are running on deterministic computers like every other bit of code we run. They aren't magic.

It is just the scale that makes it appear non-deterministic to a human observer: the chain of computation is far too large for anyone to follow end to end. But that doesn't mean it isn't, in the end, a function that deterministically maps input data to output data.
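The determinism point can be made concrete with a toy sketch (hypothetical names, not a real model): if the "weights" are fixed and the sampler is seeded purely from the input, the whole decoding loop is an ordinary pure function, however large the real thing is.

```python
import hashlib
import random

def toy_generate(prompt: str, n_tokens: int = 5) -> list[str]:
    # Stand-in for an LLM decoding loop: fixed "vocabulary" (the weights)
    # plus a sampler seeded only from the prompt. Nothing else feeds in,
    # so the output is a pure function of the input.
    vocab = ["the", "cat", "sat", "on", "mat"]
    seed = int.from_bytes(hashlib.sha256(prompt.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    return [rng.choice(vocab) for _ in range(n_tokens)]

# Same input, same output, every time. Apparent non-determinism in real
# deployments comes from unseeded samplers, batching, or floating-point
# reduction order -- not from anything non-deterministic in the math.
assert toy_generate("hello") == toy_generate("hello")
```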

just text !== syntactically correct code that solves a defined problem

There is a world of difference between translation and generation. It's even in the name: generative AI. I didn't say anything about magic.