Comment by 0xbadcafebee

3 days ago

My dude, when people say LLMs are non-deterministic, this is what they mean. You cannot expect an LLM to always follow your prompts.

When this happens, end your session and try again. If it keeps happening, make the model's sampling settings more conservative: lower the temperature, and tighten top_k and top_p. (https://www.geeksforgeeks.org/artificial-intelligence/graph-...)
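To see why those three knobs reduce randomness, here is a minimal pure-Python sketch of how temperature, top_k, and top_p sampling typically work over a model's output logits. This is an illustration of the general technique, not the API of any particular LLM provider; real implementations operate on full vocabulary-sized tensors.

```python
import math
import random

def sample(logits, temperature=1.0, top_k=0, top_p=1.0, rng=random.random):
    """Sample a token index from logits with temperature / top_k / top_p filtering."""
    # Temperature scaling: lower temperature sharpens the distribution,
    # so the highest-logit token dominates (more deterministic output).
    scaled = [l / max(temperature, 1e-6) for l in logits]

    # Softmax (shifted by the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = sorted(
        ((i, e / total) for i, e in enumerate(exps)),
        key=lambda p: p[1],
        reverse=True,
    )

    # top_k: keep only the k most probable tokens.
    if top_k > 0:
        probs = probs[:top_k]

    # top_p (nucleus): keep the smallest prefix whose cumulative mass >= top_p.
    if top_p < 1.0:
        kept, mass = [], 0.0
        for i, p in probs:
            kept.append((i, p))
            mass += p
            if mass >= top_p:
                break
        probs = kept

    # Renormalize over the surviving tokens and draw one.
    total = sum(p for _, p in probs)
    r = rng() * total
    for i, p in probs:
        r -= p
        if r <= 0:
            return i
    return probs[-1][0]
```

With `top_k=1` (or a very low temperature), only the single most likely token survives, so the output becomes effectively deterministic; larger `top_k`/`top_p` values and higher temperatures admit more candidate tokens and hence more run-to-run variation.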