Comment by acka
1 month ago
Easy: consider a parent taking pictures of their kid's genitals to send to their doctor to investigate a medical condition. The pictures get flagged by an automated enforcement algorithm and reported to the authorities as child pornography, leading to a 10-month criminal investigation of the parent. This exact thing happened with Google's AI-based system for hunting CP[1], so it isn't hard to imagine it happening with Apple's software, too.
[1] https://www.koffellaw.com/blog/google-ai-technology-flags-da...
Good example. I think this is worth the tradeoff.