Comment by thomasfromcdnjs

7 months ago

> But having the same "seed" doesn't guarantee the same response from an LLM, hence the question above.

I fail to understand how an LLM could produce two different responses from the same seed. Same seed implies all random numbers generated will be the same. So where is the source of nondeterminism?

  • I believe people are confused because ChatGPT's API exposes a seed parameter that does not guarantee deterministic output.

    But that's due to the possibility of model configuration changes on the service side, which isn't relevant here (a sketch of how that parameter is used follows below).
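
A minimal sketch of that seed parameter, assuming the official `openai` Python client and a model name ("gpt-4o-mini") chosen purely for illustration: the seed makes sampling best-effort reproducible, and the returned system_fingerprint is how you detect the backend configuration changes mentioned above.

```python
# Hedged sketch, not an authoritative recipe. Assumes the `openai` Python
# client (v1+) and OPENAI_API_KEY in the environment; "gpt-4o-mini" is just
# an example model name.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str, seed: int = 1234):
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        seed=seed,          # best-effort reproducibility, not a guarantee
        temperature=0,
    )
    return resp.choices[0].message.content, resp.system_fingerprint

text_a, fp_a = ask("Name three prime numbers.")
text_b, fp_b = ask("Name three prime numbers.")

# If the fingerprints match, both calls hit the same backend configuration
# and the outputs should normally match too; if they differ, the service-side
# configuration changed and determinism is off the table.
print(text_a == text_b, fp_a, fp_b)
```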

Barring subtle differences in the underlying implementations across environments, it does, assuming all other generation settings (temperature, etc.) are held constant.
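
For the local case, a minimal sketch of that claim, assuming PyTorch, Hugging Face transformers, and the publicly available "gpt2" checkpoint (any causal LM would do): re-seeding the RNG before each sampling call should reproduce the same output on the same hardware and software stack.

```python
# Hedged sketch under the assumptions above; model choice is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("The seed controls", return_tensors="pt")

def sample_once(seed: int) -> str:
    torch.manual_seed(seed)           # fixes every random draw used while sampling
    with torch.no_grad():
        out = model.generate(
            **inputs,
            do_sample=True,           # sampling, not greedy decoding
            temperature=1.0,
            max_new_tokens=30,
        )
    return tokenizer.decode(out[0], skip_special_tokens=True)

a = sample_once(42)
b = sample_once(42)
print(a == b)  # expected: True on the same hardware/software stack
```

If the two runs ever differ on the same machine, the usual culprits are nondeterministic GPU kernels or library-version differences, i.e. the "subtle differences in underlying implementations" above, not the seed itself.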