Comment by layer8

2 days ago

Temperature > 0 isn’t a problem as long as you can specify/save the random seed and everything else is deterministic. Of course, “as long as” is still a tall order here.
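To make the first sentence concrete: here is a toy sketch (pure Python, not any vendor's implementation) of temperature sampling from fixed logits. With a saved seed and no other sources of nondeterminism, temperature > 0 reproduces the exact same token sequence on every run.

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Sample one token index from a temperature-scaled softmax."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    weights = [math.exp(s - m) for s in scaled]
    return rng.choices(range(len(logits)), weights=weights, k=1)[0]

def generate(seed, n=10):
    rng = random.Random(seed)       # fixed, saved seed
    logits = [2.0, 1.0, 0.5, -1.0]  # toy logits, held constant here
    return [sample_token(logits, temperature=0.8, rng=rng) for _ in range(n)]

# Same seed, same sampled sequence, despite temperature > 0.
assert generate(seed=42) == generate(seed=42)
```

The catch, as the next paragraph notes, is the "everything else is deterministic" part: in a real serving stack the logits themselves are not guaranteed to be bit-identical between runs.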

My understanding is that the implementations of modern hosted LLMs are nondeterministic even with a known seed, because the generated results are sensitive to a number of other factors including, but not limited to, the other prompts running in the same batch.
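One concrete mechanism behind that sensitivity (my illustration, not a description of any specific provider's stack): floating-point addition is not associative, and the reduction order inside matmul/attention kernels can depend on batch size and composition. A different summation order can perturb logits by an ulp, which occasionally flips which token gets sampled.

```python
# The same three values summed in two different orders give
# different IEEE-754 double results — associativity does not hold.
a, b, c = 1e16, -1e16, 1.0
print((a + b) + c)  # 1.0  (a and b cancel exactly, then c is added)
print(a + (b + c))  # 0.0  (c is absorbed into b, which then cancels a)
```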

Have any of the major hosted LLM providers ever shared the temperature settings that responses were generated with?