Comment by afiori
6 months ago
Do LLM inference engines have a way to seed their randomness, so as to get reproducible outputs while still allowing some variance if desired?
Reply, 6 months ago:
Yes, although it's not always exposed to the end user by LLM providers.
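For a local model you can set the seed yourself. Here is a minimal sketch using Hugging Face transformers (the model name and sampling settings are just placeholders): fixing the seed makes sampled output reproducible on the same software/hardware stack, and changing the seed gives a different but still reproducible sample.

```python
# Sketch: seeded, reproducible sampling with Hugging Face transformers.
# Model name and generation settings are illustrative placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tokenizer("Once upon a time", return_tensors="pt")

# Same seed -> same sampled output; change the seed for controlled variance.
set_seed(1234)
out = model.generate(**inputs, do_sample=True, temperature=0.8, max_new_tokens=30)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Hosted APIs vary: OpenAI's chat completions endpoint, for example, accepts a `seed` parameter, but reproducibility there is only best-effort.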