Comment by haritha-j
16 days ago
> and will see content filters for any content Discord detects as graphic or sensitive.
I didn't even realise Discord scans all the images that I send and receive.
Really, I've come to the conclusion that anything I send out of my LAN is probably kept on a server forever, ingested by LLMs, and indexed to be used against me in perpetuity, regardless of what the terms and conditions of the site I'm using actually say.
Speaking of hosting, Discord used to be one of the biggest (inadvertent) image hosts, so they might have set up the system to reduce legal exposure rather than to monitor conversations per se.[1]
A lot of the internet broke the day they flipped that switch off.
Weren't external Tumblr hotlinks also a thing back in the day?
[1]: https://www.reddit.com/r/discordapp/comments/16uy0an/not_sur...
To be fair, the terms and conditions probably say that they can do whatever they want with that data :-).
Don’t forget all the government creeps snooping on the wires.
Until the current administration, I was much more bothered by private misuse/abuse of data than by the government. Now I worry about both.
5 replies →
Pretty much every non-E2EE platform is scanning every uploaded image for CSAM at least, that's a baseline ass-covering measure.
And E2EE platforms like Mega are now being censored on some platforms specifically because they're E2EE, and so the name itself must be treated as CSAM.
As people who want to talk about words like "megabytes" or "megapixels" or "megaphones" or "Megaman" or "Megan" on Facebook are finding out.
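For what it's worth, the scanning in question is usually fingerprint matching against a blocklist of known images, not content "understanding". Real systems use proprietary perceptual hashes (Microsoft's PhotoDNA is the best-known); the sketch below is a toy difference hash ("dHash") over an already-downscaled 9x8 grayscale grid, just to illustrate the shape of the technique. All function names here are made up for the example.

```python
# Toy sketch of blocklist-style image screening via a perceptual hash.
# NOT PhotoDNA -- a simplified dHash: each bit records whether a pixel
# is brighter than its right-hand neighbour, giving a 64-bit fingerprint
# that survives re-encoding and small edits.

def dhash(pixels):
    """pixels: 8 rows of 9 grayscale values (0-255), pre-downscaled."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits  # 64-bit integer fingerprint

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_flagged(upload_hash, blocklist, threshold=5):
    """Near-match tolerance catches re-compressed or lightly edited copies."""
    return any(hamming(upload_hash, h) <= threshold for h in blocklist)
```

The key property is that a small pixel-level change flips only a few bits, so a Hamming-distance threshold catches altered copies of a blocklisted image while (mostly) leaving everything else alone.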
Well, it's not E2EE, so what did you expect? Nothing you do on Discord is private; everything is screened, categorized, and readable by third parties.
They have to, at least for CSAM.
Everything that is not end-to-end encrypted understandably has to do it.