Comment by towledev
7 days ago
>Instruct them in how you want them to code
They don't always listen.
Writing SQL, I'll give ChatGPT the schema for 5 different tables. It habitually generates solutions with columns that don't exist. So, naturally, I append, "By the way, TableA has no column FieldB." Then it just imagines a different one. Or, I'll say, "Do not generate a solution with any table-col pair not provided above." It doesn't listen to that at all.
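One practical workaround is to stop begging the model and validate its output instead: check every column the generated SQL references against the schema you actually provided. A minimal sketch, assuming the third-party sqlglot parser and a toy stand-in schema (TableA and the nonexistent FieldB are just the names from the example above):

```python
# Guard against hallucinated columns: parse the generated SQL and check
# every table-qualified column against the schema that was provided.
# Sketch only -- assumes the sqlglot library; SCHEMA is a toy stand-in.
import sqlglot
from sqlglot import exp

SCHEMA = {
    "TableA": {"id", "created_at"},  # note: no FieldB
    "TableB": {"id", "amount"},
}

def hallucinated_columns(sql: str) -> list[str]:
    """Return table.column pairs referenced in sql but absent from SCHEMA."""
    bad = []
    for col in sqlglot.parse_one(sql).find_all(exp.Column):
        # col.table is the qualifier ("" if the column is unqualified);
        # resolving aliases would need sqlglot's qualify pass, skipped here.
        if col.table in SCHEMA and col.name not in SCHEMA[col.table]:
            bad.append(f"{col.table}.{col.name}")
    return bad

print(hallucinated_columns("SELECT TableA.FieldB FROM TableA"))
# -> ['TableA.FieldB'], caught before the query ever runs
```

If the check fails, you can feed the offending pairs back to the model as a correction, or simply regenerate, rather than trusting it to obey a "do not invent columns" instruction.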
I haven't had that problem with Gemini 2.5 Pro or o3. Are you on the free tier of ChatGPT?
You do understand that these models are not sentient, right? They're subject to hundreds of internal prompts, weights, and a training set.
They can't generate knowledge that isn't in their corpus, and the act of prompting (yes, even with agents, ffs) is more akin to playing pachinko than pool.
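The pachinko comparison is close to literal: each next token is a weighted random draw over candidates, not a deterministic shot. A toy sketch of temperature sampling, with made-up tokens and logit values, nothing from any real model:

```python
# "Pachinko, not pool": next-token generation is a weighted random draw.
# Toy sketch -- the tokens and logits below are invented for illustration.
import math
import random

def sample_token(logits: dict[str, float], temperature: float = 1.0) -> str:
    """Draw one token from a temperature-scaled softmax over logits."""
    m = max(logits.values())  # subtract the max for numerical stability
    weights = {t: math.exp((l - m) / temperature) for t, l in logits.items()}
    r = random.uniform(0, sum(weights.values()))
    for token, w in weights.items():
        r -= w
        if r <= 0:
            return token
    return token  # floating-point edge case: fall back to the last token

# Plausible-looking column names compete; a nonexistent one can still win.
logits = {"FieldB": 2.0, "field_b": 1.8, "b_field": 1.2}
print([sample_token(logits, temperature=0.8) for _ in range(5)])
# e.g. ['FieldB', 'field_b', 'FieldB', 'FieldB', 'b_field'] -- varies per run
```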
This is something that people working on extremely simple apps don't understand, because for their purposes it looks like magic.
If you know what you're doing and you're trying to build something other than the same tutorials that have been pasted all over the internet, the non-deterministic pattern machine is going to generate plausible BS.
They'll tell you any number of things you're supposedly doing wrong, without understanding what the machine is actually doing under the hood.