
Comment by jeroenhd

2 months ago

Having to activate the more complex "thinking" model every time you need it to count letters is pretty silly, but I suppose it does hide the symptoms.

It's still easy to trip up. The model's tendency to go along with whatever the user asks will have it do stuff like this: https://chatgpt.com/share/6897cc42-ba34-8009-afc6-41986f5803...

Because the model apparently doesn't know about the actual verb (https://en.wiktionary.org/wiki/blueberry#English), it decides to treat the request as some kind of fantasy linguistics, making up its own definition on the fly. It then provides grammatically incorrect examples that don't even match the equally incorrect conjugation table it generates next.