Comment by wavemode

7 months ago

I fail to understand how an LLM could produce two different responses from the same seed. The same seed implies that all random numbers generated will be the same, so where is the source of nondeterminism?
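To illustrate the point about seeds: a seeded PRNG is deterministic, so two generators initialized with the same seed produce identical streams. A minimal sketch using Python's `random` module (as a stand-in for an LLM's sampler, not the actual implementation):

```python
import random

# Two independent PRNG instances initialized with the same seed.
gen_a = random.Random(42)
gen_b = random.Random(42)

# Each produces the same sequence of "sampling" draws.
draws_a = [gen_a.random() for _ in range(5)]
draws_b = [gen_b.random() for _ in range(5)]

assert draws_a == draws_b  # identical streams from identical seeds
```

If token sampling were the only source of randomness, fixing the seed this way would fix the output; any remaining variation must come from somewhere other than the random draws themselves.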

I believe people are confused because ChatGPT's API exposes a seed parameter which is not guaranteed to be deterministic.

But that's due to the possibility of model configuration changes on the service end, and it isn't relevant here.