Comment by mystraline
4 hours ago
This matters greatly if you want to self-host something like Matrix and you permit federation.
You WILL get a CSAM spam problem. The material lands in your server's media cache, you won't notice until after the fact, and the shit admin tools won't properly remove either the spammer or the content.
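If you're on Synapse specifically, the admin API does at least let you flush the remote media cache after the fact. A rough sketch, assuming you have an admin access token (endpoint per the Synapse admin docs; adjust the host for your deployment):

    curl -X POST \
      -H "Authorization: Bearer <ADMIN_ACCESS_TOKEN>" \
      "https://matrix.example.org/_synapse/admin/v1/purge_media_cache?before_ts=$(($(date +%s) * 1000))"

That purges cached remote media older than the given millisecond timestamp (here, "everything up to now"). It's reactive, not preventive, which is why the next point matters.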
Better yet, if you run Matrix, disable image caching and preloading.
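Assuming Synapse again (other homeservers have their own knobs, and option names can shift between versions, so check the docs for yours), the relevant homeserver.yaml settings look roughly like this:

    # homeserver.yaml (Synapse) -- verify against your version's docs

    # Don't fetch and cache remote images for URL previews
    url_preview_enabled: false

    # Expire cached remote media quickly so spammed content doesn't linger
    media_retention:
      remote_media_lifetime: 1d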
Additionally, if you run any service that offers image diffusion, you WILL get users attempting to generate CSAM*. Make sure you set up multiple layers of filtering to catch this (a rough sketch of what that means follows below). I built out Figma's safety pipeline and procedures for generated content; you'd be amazed what people try to make.
* Not going to debate whether or not AI imagery is CSAM here; the point is that you'll get users trying to generate AI images with subjects under 18.
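On the "multiple layers" point: this is not the actual Figma pipeline, just a minimal sketch of the shape of one. Every check below is a stub, and all the names are mine, not from any real system; the production versions would be a trained prompt classifier, an image-safety model run on the output, and a perceptual-hash match against vetted known-CSAM hash lists (which require going through the relevant clearinghouses to access).

    from dataclasses import dataclass

    @dataclass
    class Verdict:
        allowed: bool
        reason: str = ""

    def check_prompt(prompt: str) -> Verdict:
        """Layer 1: reject the prompt before spending GPU time on it."""
        # Stub: a real system uses a trained text classifier, not a keyword
        # list, because users deliberately obfuscate their wording.
        blocked_terms = {"example-blocked-term"}
        if any(term in prompt.lower() for term in blocked_terms):
            return Verdict(False, "prompt rejected by text filter")
        return Verdict(True)

    def check_image(image_bytes: bytes) -> Verdict:
        """Layer 2: classify the generated output itself."""
        # Stub: call an image-safety / age-estimation model here. Prompt
        # filtering alone is not enough; innocuous-looking prompts can
        # still produce violating output.
        return Verdict(True)

    def check_known_hashes(image_bytes: bytes) -> Verdict:
        """Layer 3: perceptual-hash match against known-CSAM hash lists."""
        # Stub: integrate with a vetted hash-matching service.
        return Verdict(True)

    def log_and_escalate(prompt: str, reason: str) -> None:
        # Stub: persist evidence, alert trust & safety, and file the
        # reports the law requires in your jurisdiction.
        print(f"BLOCKED ({reason}): {prompt!r}")

    def generate_safely(prompt: str, generate) -> bytes | None:
        verdict = check_prompt(prompt)
        if not verdict.allowed:
            log_and_escalate(prompt, verdict.reason)
            return None
        image = generate(prompt)
        for verdict in (check_image(image), check_known_hashes(image)):
            if not verdict.allowed:
                log_and_escalate(prompt, verdict.reason)
                return None
        return image

The layering matters because each check fails differently: prompt filters miss obfuscated wording, output classifiers miss edge cases, and hash lists only catch already-known material. No single layer is sufficient on its own.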