Comment by yodon
6 months ago
Anyone who thinks their reading skills are a reliable detector of AI-generated content is either lying to themselves about the validity of their detector or missing the opportunity to print money by selling it.
I strongly suspect more people are in the first category than the second.
1) If someone had the reading skills to detect AI-generated content, wouldn't that technically be something very hard to monetize? It's not like said person could clone themselves or mass-produce said skill.
Also, for a large amount of AI-generated imagery and text (especially low-effort), even basic reading/perception skills can detect AI content. I would agree, though, that people can't reliably discern high-effort AI-generated works, especially if a human was involved to polish them up.
2) True—human "detectors" are mostly just gut feelings dressed up as certainty. And as AI improves, those feelings get less reliable. The real issue isn’t that people can detect AI, but that they’re overconfident when they think they can.
One of the above was generated by ChatGPT to reply to your comment. The other was written by me.
It's so obvious that I almost wonder if you wrote it as a parody of AI writing on purpose.