Comment by ghc
18 hours ago
You're right, I should have checked the model settings. For some reason the default model profile in Ollama had the temperature set to 0. Adjusting the temperature and repeat penalty worked much better here than it did when I tried to correct similar behavior in the smallest phi4 reasoning model.
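For anyone else hitting this, one way to override those defaults is a custom Modelfile; the parameter values below are illustrative, not recommendations, and the model tag may differ on your setup:

```
# Modelfile — base model tag is an assumption; use whatever `ollama list` shows
FROM phi4
# Raise temperature from 0 so sampling isn't fully greedy
PARAMETER temperature 0.7
# Mild repeat penalty to discourage loops
PARAMETER repeat_penalty 1.1
```

Then build and run the tuned variant with `ollama create phi4-tuned -f Modelfile` followed by `ollama run phi4-tuned`. Alternatively, inside an interactive `ollama run` session you can change a value for just that session with `/set parameter temperature 0.7`.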
Thank you, this was affecting me too.