Comment by wizzwizz4 · 7 days ago

I've changed my mind: this isn't very good feedback, because you had to misrepresent what I wrote in order to criticise it.

I said "if porn sites had the kind of stuff": your paraphrase adds an implication I vehemently disagree with. The impersonal nature of a website (or magazine, or whatever) is important. Children shouldn't be looking at porn on social media sites, because they should have neither social nor parasocial relationships with sex workers qua sex workers (lumping amateurs in with professionals, here): this is a (non-central) special case of "adults should not have sexual relationships with children". We can't ignore the power dynamics.

That's one of the things I think the OSA got right: if you read between the lines, each measure does seem to be motivated by an actual problem, and some of those problems aren't obvious to non-experts like me. I'd love to get access to the NSPCC's recommendations for the OSA, before they were translated into this awful implementation: that'd make it much easier to try to design alternative, more effective implementations.

Note also, the queer/disabled techies I mentioned? They take pains to ensure that minors do not interact with them in a sexual context: some of them explain why, and others make blanket prohibitions without explanation. It is generally understood that consent and boundaries are respected. And, from what I can tell looking at public social graphs, this works: nobody I know to be a child is interacting with nudes, risqué posts, erotica, or accounts dedicated to that purpose, even if they're otherwise quite close in the social graph. (Maybe I should do a study? But analysing people's social graphs without their consent doesn't feel ethical. Perhaps interviews would be a better approach.)

There is the occasional post from a child (the youngest I've observed was 16) complaining about these policies, because they think they don't need protection. That they're complaining, rather than just bypassing the technical barriers (as everybody in my school knew how to do), is perhaps another indication that this approach works.

(I'm one degree of separation from the communities that post sexy stuff online, so my observations may not be representative of what actually happens. I'm also seeing the situation after moderation, a few minutes delayed due to federation latency: I know that "remove the consequences of a child's foolishness from the public sphere as quickly as possible" is a priority in online moderation, so this selection bias might be quite heavy. Further research is needed.)