
Comment by u12

13 days ago

> Meta required users to be caught 17 times attempting to traffic people for sex before it would remove them from its platform, which a document described as “a very, very, very high strike threshold." I don’t get it. Is sex-trafficking-driven user growth really so significant for Meta that they would have such a policy?

The "catching" is probably some kind of automated detection scanner with an algo they don't fully trust to be accurate, so they have some number of "strikes" that will lead to a takedown.

There is always complexity to this (and don't think I'm defending Meta, who are absolutely toxic).

It's like Apple's CSAM scanning, where people said "Oh, there's a threshold so it won't false report, you have to have 25+ images (or whatever) before it will report anything"... Okay, that avoids false reporting, but that policy is one messy story away from "Apple says it doesn't care about the first 24 CSAM images on your phone".

Of course it's not. We could speculate about how to square this with reason and Meta's denial; perhaps some flag associated with sex trafficking had to be hit 17 times, and some people thought the flag also fired on too many other things to justify lowering the threshold. But the bottom line is that hostile characterizations of undisclosed documents aren't presumptively true.

We don’t know. But as you can read in the article, Meta’s own employees were concerned about it (and many other things). For Zuck it was not a priority, as he said himself.

We can speculate. I think they just did not give a fuck. Usually, limiting grooming and abuse of minors requires limiting those minors' access to various activities on the platform, which means those kids go somewhere else. Meta specifically wanted to promote its use among children below 13 to stimulate growth; the fact that this often made the platform dangerous for minors was not seen as their problem.

If your company is driven by growth über alles à la venture capitalism, growth comes before everything else. Including child safety.

  • Reading Careless People by Sarah Wynn-Williams is eye-opening here, and it's pretty close to exactly that.

    > I think they just did not give a fuck.

    It's that people like Zuck and Sandberg were so comfortably ensconced in their little worlds of private jets and Davos and so on that they really could not care less about anything that didn't affect them personally (and really, the vast majority of issues facing Meta don't affect them, only their bonuses and compensation).

    Your actions will lead to active harm? "But not to me, so what, if it helps our numbers."