Comment by jjcm

6 hours ago

Additionally, if you provide any service that offers image diffusion, you WILL get CSAM* being generated. Make sure you set up multiple layers to catch this. I built out Figma's safety pipeline and procedures for generated content. You'd be amazed what people try to make.

* Not going to debate whether or not AI imagery is CSAM here, but the point is you'll get users trying to generate AI images with subjects under 18.
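
A minimal sketch of what "multiple layers" might look like, assuming a fail-closed pipeline where a prompt filter runs before generation and an image classifier runs after. Every check function here is a placeholder (the denylist term, the `check_image` stub, and the `moderate` helper are all hypothetical); in a real deployment each layer would call an actual text/image safety model or vendor moderation API.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class ModerationResult:
    allowed: bool
    blocked_by: Optional[str] = None  # name of the layer that rejected, if any


def check_prompt(prompt: str) -> bool:
    """Layer 1 (placeholder): reject prompts matching a denylist.

    A real system would use a trained text classifier, not substring matching.
    """
    denylist = {"example-banned-term"}  # stand-in for a real classifier
    return not any(term in prompt.lower() for term in denylist)


def check_image(image_bytes: bytes) -> bool:
    """Layer 2 (placeholder): scan the generated image before delivery.

    Stand-in for a real image-safety model or hash-matching service.
    """
    return len(image_bytes) > 0


def moderate(prompt: str, generate: Callable[[str], bytes]) -> ModerationResult:
    """Run the layers in order; any rejection stops the pipeline (fail closed)."""
    if not check_prompt(prompt):
        return ModerationResult(False, "prompt-filter")
    image = generate(prompt)
    if not check_image(image):
        return ModerationResult(False, "image-classifier")
    return ModerationResult(True)
```

The point of layering is that no single check is reliable: prompt filters miss obfuscated requests, so you scan outputs too, and both feed into human review and reporting procedures downstream.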