Comment by qingcharles

2 days ago

A lot of the time, in its "Thinking", you'll see it say things like "The user asked me to create X, but that isn't possible due to Y, or would be less than ideal, so I will present the user with a more fitting solution."

Most of the time, in my experience, the latest thinking models (o3, C4, Grok4, etc.) pick up on what I am doing wrong and push me in the right direction. The older non-thinking ones did not do this.

In my case, there is no wrong or impossible direction, just a technical detail that you realise you must overcome once you start to code, and one that I doubt the model will be able to solve on its own. What it should do is start coding, realise the difficulty, and then ask me how to solve it. Do those agents do that kind of thing yet? Mind you, I'm not interested in the code, only in the question that writing the code would allow a programmer to ask.