Comment by janderland
10 hours ago
This is a ridiculous analogy. Test the app. Read its source code. Developers could always write toxic instructions into your tools. AI may write inefficient or messy code, but it's far from nefarious. "Asbestos" code is written intentionally by humans, not unintentionally by AI.
That's a good way to guarantee nobody will use it. Who is going to test the app in a sandbox, with God knows what kind of tooling needed to find malicious behavior, and read the code? For a tool that's convenient once per decade?
At no point ever in history could you guarantee that third party code downloaded from the internet was not malicious without some sort of security review.
Software security assessments exist for this very purpose. You may personally lack the rigor to do this at home, but those with rigorous security processes absolutely do implement security reviews.
There is a whole industry of professionals who do this work.
Nobody, and that's my point. 99% of people are going to install the tool and never bother with the source. This was true before AI and is still true now.