Comment by afiori
2 days ago
Do LLM inference engines have a way to seed their randomness, so as to have reproducible outputs with still some variance if desired?
2 days ago
> Do LLM inference engines have a way to seed their randomness, so as to have reproducible outputs with still some variance if desired?
Yes, although it's not always exposed to the end user of LLM providers.
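As a concrete illustration with an open-source stack (a sketch of one way to do it, not how any particular provider implements it): Hugging Face transformers exposes set_seed(), which seeds Python, NumPy, and PyTorch RNGs, so reusing a seed reproduces a sampled output while a new seed gives variance. Hosted APIs sometimes expose a similar knob (e.g. OpenAI's optional seed parameter), though determinism there is typically best-effort. Model name and prompt below are placeholders.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
inputs = tokenizer("The capital of France is", return_tensors="pt")

# Same seed twice -> identical sampled text; a different seed -> variance.
for seed in (0, 0, 42):
    set_seed(seed)  # seeds python, numpy and torch RNGs used during sampling
    out = model.generate(
        **inputs,
        do_sample=True,          # stochastic sampling, driven by the seeded RNG
        max_new_tokens=20,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(seed, tokenizer.decode(out[0], skip_special_tokens=True))
```

Note that even with a fixed seed, exact reproducibility generally holds only on the same software versions and hardware, since floating-point kernels can differ across setups.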