Comment by sosodev
2 days ago
He asked the models to fix the problem without commentary and then… praised the models that returned commentary. GPT-5 did exactly what he asked. It doesn’t matter if it’s right or not. It’s the essence of garbage in and garbage out.
If they are supposed to replace actual devs, we would expect them to behave like actual devs and push back against impossible requests.
Except it's not an impossible request. If my manager told me "fix this code, no questions asked," I would produce a similar result. If you want the model to push back, you can simply ask it to, or at least not forbid it from doing so. Unless you really want a model that doesn't follow instructions?