Comment by cooper_ganglia
7 days ago
“Queer/Disabled techies post porn that I think is good for kids, which is great because otherwise children would have to just use PornHub” is a GREAT ideology to viscerally radicalize the majority of people against you AND the people you’re speaking about.
There's a difference between "good" and "not harmful". I would not encourage children to watch porn (if it came up in conversation, I'd dissuade them or change the subject); however, it's a fact that they do, to the point that my peers did not believe me when I told them I didn't. There is such a thing as harm reduction, and there's a point past which "teaching children that their feelings are neither harmful nor wrong" matters more than the veneer of propriety.
But, noted. That's excellent feedback.
To steal your wording from https://news.ycombinator.com/item?id=44728577: if some children are going to seek out porn no matter what we do, better for the first thing they find not to be "content that demeans women and distorts their worldview on sex and relationships". If the Online Safety Act effectively prevented children from being exposed to that, then I would be ambivalent about it; but the law clearly won't achieve any of its stated goals. (I suspect bad porn is clearly defined enough to prohibit directly, with cigarette-style prohibitions on making it attractive to children sufficient for the respectful stuff, but I expect many people to call any ban "draconian".)
I've changed my mind: this isn't very good feedback, because you had to misrepresent what I wrote in order to criticise it.
I said "if porn sites had the kind of stuff": your paraphrase adds an implication I vehemently disagree with. The impersonal nature of a website (or magazine, or whatever) is important. Children shouldn't be looking at porn on social media sites, because they should have neither social nor parasocial relationships with sex workers qua sex workers (lumping amateurs in with professionals, here): this is a (non-central) special case of "adults should not have sexual relationships with children". We can't ignore the power dynamics.
That's one of the things I think the OSA got right: if you read between the lines, the measures each seem to be motivated by actual problems, some of which aren't obvious to non-experts like me. I'd love access to the NSPCC's recommendations for the OSA, before they were translated into this awful implementation: that would make it much easier to design alternative, more effectual implementations.
Note also, the queer/disabled techies I mentioned? They take pains to ensure that minors do not interact with them in a sexual context: some of them explain why, and others make blanket prohibitions without explanation. It is generally understood that consent and boundaries are respected. And, from what I can tell looking at public social graphs, this works: nobody I know to be a child is interacting with nudes, risqué posts, erotica, or accounts dedicated to that purpose, even if they're otherwise quite close in the social graph. (Maybe I should do a study? But analysing people's social graphs without their consent doesn't feel ethical. Perhaps interviews would be a better approach.)
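For what it's worth, the measurement itself would be trivial; the ethics are the hard part. A minimal sketch, assuming consented public follow/interaction data (the account names and flags below are invented, and networkx is just one convenient library for this):

    import networkx as nx

    # Toy data: all names and attributes are made up for illustration.
    G = nx.DiGraph()
    G.add_node("artist_a", nsfw=True)   # account dedicated to adult content
    G.add_node("mutual_b")              # ordinary adult account
    G.add_node("teen_c", minor=True)    # self-identified minor
    G.add_edges_from([("teen_c", "mutual_b"), ("mutual_b", "artist_a")])

    nsfw = {n for n, d in G.nodes(data=True) if d.get("nsfw")}
    minors = {n for n, d in G.nodes(data=True) if d.get("minor")}

    # Direct interactions between the two sets, in either direction.
    direct = [(u, v) for u, v in G.edges()
              if {u, v} & minors and {u, v} & nsfw]
    print("direct interactions:", direct)  # expect: []

    # "Close in the graph" despite no direct edge: undirected hop count.
    U = G.to_undirected()
    for m in minors:
        for n in nsfw:
            if nx.has_path(U, m, n):
                print(m, "is", nx.shortest_path_length(U, m, n), "hops from", n)

On the toy data this prints no direct interactions but a two-hop path, which is exactly the pattern I described above: children close in the graph to these accounts, but never touching them.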
There is the occasional post from a child (the youngest I've observed was 16) complaining about these policies, because they think they don't need protection. That they're complaining, rather than just bypassing the technical barriers (as everybody in my school knew how to do), is perhaps another indication that this approach works.
(I'm one degree removed from the communities that post sexy stuff online, so my observations may not be representative of what actually happens. I'm also seeing the situation after moderation, delayed a few minutes by federation latency: I know that "remove the consequences of a child's foolishness from the public sphere as quickly as possible" is a priority in online moderation, so this selection bias might be quite heavy. Further research is needed.)