Comment by spicymaki
7 hours ago
I think what many people do not understand is that software development is communication: communication from the customers/stakeholders to the developer, and communication from the developer to the machine. At some fundamental level there needs to be precision about what you want, and someone or something needs to translate that into a system that provides the solution. Software can help check for errors, enforce constraints, and execute instructions precisely, but it cannot replace the fact that someone needs to tell the machine what to do (precise intent).
What AI (LLMs) does is raise the level of abstraction to human language via translation. The problem is that human language is imprecise in general. You can see this in legal or scientific writing. Legalese is almost illegible to laypeople because there are precise things you need to specify, and you need to be precise in how you specify them. Unfortunately, the tech community is misleading the public, telling laypeople they can just sit back, casually tell the AI what they want, and get exactly what they asked for. Those users are lying to themselves, because most likely they did not take the time to think through what they wanted and are rationalizing (after the fact) that the AI gave them exactly what they wanted.