Comment by wslh

2 months ago

Seems like they just fixed it: [1]. A "thinking longer for a better answer" message appeared before giving the answer.

[1] https://chatgpt.com/share/6897c38b-12b8-800d-9cc2-571adb13bc...

Having to activate the more complex "thinking" model every time it needs to count letters is pretty silly, but I suppose it does hide the symptoms.
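
For contrast, counting letters is a trivial character-level operation outside the model; the usual explanation for why LLMs struggle with it is tokenization, since the model sees token IDs rather than individual characters. A minimal Python sketch (the "how many b's in blueberry" prompt is my assumption, since the shared chat link above is truncated):

    # Trivial outside an LLM: operate directly on characters.
    # Models see tokens, not letters, which is why this trips them up.
    word = "blueberry"
    print(word.count("b"))  # prints 2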

It's still easy to trip up. The model's tendency to respond agreeably to whatever the user asks will have it do things like this: https://chatgpt.com/share/6897cc42-ba34-8009-afc6-41986f5803...

Because the model apparently doesn't know about the actual verb (https://en.wiktionary.org/wiki/blueberry#English), it treats the request as a kind of fantasy-linguistics exercise, making up its own definition on the fly. It then provides grammatically incorrect examples that are inconsistent with the equally incorrect conjugation table it generates next.