Comment by OJFord
4 years ago
You're not missing something, but you're not likely to get real examples: as I understand it, the algorithm and database are private, and the posters above are just guardedly commenting with (claimed) insider knowledge. They're not likely to want to leak examples (and not just because it's private, but given the supposed contents... would you really want to be the one saying 'but it isn't, look'? Would you trust someone who did, and follow such a link to see for yourself?)
To be clear, I definitely didn't want examples in the sense of links to the actual content. Just a general description. Like, was a beach ball misclassified as a heinous crime, or was it perfectly legal consensual adult porn that was misclassified, or was it something that even a human could potentially mistake for CSAM? Or something else entirely?
I understand that they may not want to give examples, perhaps for professional or legal reasons, and I can respect that. But I also think that information is very important if they're trying to argue a side of the debate.
> Just a general description.
I gave that above in a sibling thread.
> I understand it seems like they don't want to give examples, perhaps due to professional or legal reasons, and I can respect that.
In my case, it’s been 7 years, so I’m not confident enough in my memory to give a detailed description of each false positive. All I can say is that the false-positive photos that included people showed subjects who were very obviously fully clothed and doing something normal, or else the photo was of something completely innocuous altogether (I seem to remember an example of the latter being the Windows XP green-field stock desktop wallpaper, but I’m not positive about that).