
Comment by panarchy

7 months ago

Interesting, those instructions sound like the exact opposite of what I want from an AI. Far too often I find them rushing in headfirst to code something they don't understand because they didn't have a good enough grasp of the requirements; a few clarifying questions would have solved that. Maybe it just tries to do the opposite of what the user wants.

I don't have any particular insider knowledge, and I'm on the record as being pretty cynical about AI so far.

That said, I would hazard a guess that they don't want the AI asking clarifying questions for a number of possible reasons.

Maybe when it is allowed to ask questions, it consistently asks poor questions that illustrate it is bad at "thinking".

Maybe when it is allowed to ask questions, they discovered that it annoys many users who would prefer it to just read their minds.

Or maybe the people who built it have massive egos and hate being questioned so they tuned it so it doesn't

I'm sure there are other potential reasons; these just came to mind off the top of my head.

  • I bet it has to do with efficient UX. Most users, most of the time, want the best possible answer straight away from the prompt they have provided. If they need to clarify, they respond with an additional prompt, but at any point they can just use what was provided and stop the conversation. Even for simple tasks there's a lot of room for clarification, which would just slow you down most of the time and waste server resources.