
Comment by kspacewalk2

4 hours ago

The fundamental question that needs answering is: should we actually prevent minors below the age of X from accessing social media site Y? Is the harm significant enough to warrant giving parents a technical solution for controlling which sites their X-aged child signs up for, and one that actually works? Obviously pinky-swear "over 13?" checkboxes don't work, so this currently does not exist.

You can work through robustness issues like the one you bring up (photo uploading may not be a good method), and we can discuss privacy trade-offs like adults without pretending this is the first time we've legitimately had to weigh privacy against functionality or a societal need. Heck, you can come up with various methods where not much privacy needs trading off at all: something pseudonymous and/or cryptographic, or legislated OS-level device flags checked on signup and login (a rough sketch of that last idea is below).
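
A minimal sketch of that device-flag idea, under heavy assumptions: the header name and its semantics are invented for illustration, and no platform actually exposes such a signal today. The site would simply refuse signup when the flag is present.

    # Hypothetical OS-level "minor" device flag, checked at signup.
    # The header name and accepted values are assumptions for this sketch only.
    MINOR_FLAG_HEADER = "X-Device-Minor-Flag"

    def allow_signup(request_headers: dict[str, str]) -> bool:
        """Return True if account creation may proceed for this device."""
        flag = request_headers.get(MINOR_FLAG_HEADER, "").strip().lower()
        # No flag present: treat as an adult's device; the flag would be
        # opt-in, configured by a parent on the child's device.
        return flag not in {"1", "true", "yes"}

    print(allow_signup({"X-Device-Minor-Flag": "1"}))  # False: block signup
    print(allow_signup({}))                            # True: no flag set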

But it makes no sense to jump to the minutiae without addressing the fundamental question.

> The fundamental question that needs answering is: should we actually prevent minors below the age of X from accessing social media site Y?

I suspect that if you ask Hacker News commenters whether we should put up any obstacles to accessing social media sites for anyone, a lot of people will tell you yes. The details don't matter. Bashing "social media" is popular here, and anything that makes it harder for other people to use is viewed as a good thing.

What I've found to be more enlightening is to ask people if they'd be willing to accept the same limitations on Hacker News: Would they submit to ID review to prove they aren't a minor just to comment here? Or upvote? Or even access the algorithmic feed of user-generated content and comments? There's a lot of insistence that Hacker News would get an exception or doesn't count as social media under their ideal law, but in practice a site this large with user-generated content would likely need to adhere to the same laws.

So a better question might be: Would you be willing to submit to ID verification for the sites you participate in, as a fundamentally good thing for protecting minors from bad content on the internet?

  • > Would you be willing to submit to ID verification for the sites you participate in, as a fundamentally good thing for protecting minors from bad content on the internet?

    The friction would be enough to make me give up. Arguably no loss to me, and certainly none to the internet.

    This has already happened: I am not giving my ID to some shitty online provider. If I lose more sites, so be it.

  • > The details don't matter.

    The details very much DO matter.

    You can look at all manner of posts here on HN that explain exactly how you should do age verification without uploading IDs or giving central authority to some untrustworthy entity.

    The fact that neither the governments proposing these laws nor the social media sites want to implement them in those ways tells you that what these entities want isn't "verification" but "control".

    And, yes, most of us object to that.

    • > You can look at all manner of posts here on HN that explain exactly how you should do age verification without uploading IDs or giving central authority to some untrustworthy entity.

      That's not how ID verification works. The ID verification requirements are about associating the person logging in with the specific ID.

      So kids borrow their parents' ID while they're not looking, complete a registration process that reveals nothing, and then they're good forever.

      Or, in the scenario where nothing at all is revealed about the ID and there is no central authority doing rate limiting, all it takes is for a single ID to be compromised and then everyone can use it to authenticate everywhere, forever (the sketch below illustrates this).

      That's why all of the age verification proposals are basically ID verification proposals. None of these anonymous crypto suggestions would satisfy those requirements.
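
      A minimal sketch of that failure mode, under stated assumptions: HMAC stands in for a real signature or anonymous-credential scheme (so in this toy the verifier holds the issuer's secret, which a real deployment would avoid), and the token carries no identity and no rate limit. The point is only that a verifier that learns nothing about the presenter cannot tell a shared or leaked token from a legitimate one.

          import hashlib, hmac, secrets

          # Hypothetical issuer key; HMAC is a stand-in for a real signature scheme.
          ISSUER_KEY = secrets.token_bytes(32)

          def issue_over18_token() -> bytes:
              """Issuer attests 'over 18' without tying the token to any identity."""
              nonce = secrets.token_bytes(16)
              tag = hmac.new(ISSUER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
              return nonce + tag

          def verify_over18_token(token: bytes) -> bool:
              """Any site accepts any valid token; nothing links it to a person."""
              nonce, tag = token[:16], token[16:]
              expected = hmac.new(ISSUER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
              return hmac.compare_digest(tag, expected)

          # One adult's token, copied once, now "verifies" anyone, anywhere, forever.
          shared_token = issue_over18_token()
          print(verify_over18_token(shared_token))  # True, regardless of who presents it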

> The fundamental question that needs answering is: should we actually prevent minors below the age of X from accessing social media site Y?

This is only an interesting question if we can actually prevent it. We couldn't prevent minors from smoking, and that was in a world where you had to physically walk into a store to buy cigarettes. The internet is even more anonymous, remote-controlled, and wild-west. What makes us think we can effectively age-gate the Internet, where, as we've known since 1993, nobody knows you're a dog[1]?

1: https://en.wikipedia.org/wiki/On_the_Internet,_nobody_knows_...

  • > We couldn't prevent minors from smoking,

    Smoking rates among minors have plummeted and continue to decline.

    That's not really a good example because the war on underage smoking has been a resounding success.

    Yeah, we didn't stop every single minor everywhere from ever smoking, but the decline was dramatic.

    • I'd argue that the reduction in underage smoking has much more to do with things like social acceptability and education about the dangers of smoking than with physical controls on the distribution of and access to cigarettes. There also appears to be a recent trend of younger people not drinking alcohol to the extent that my generation and Boomers did, which is wonderful, but probably has nothing to do with physical access to beer.

      This is the right way to reduce childhood social media use: Make it socially disgusting, and make it widely known to be dangerous.

The real solution, IMO, is a second internet. Domain names would be whitelisted, not blacklisted, and you would have to submit an application to some governing body or other.

  • I agree. There were attempts to do something like this with porn sites via the .xxx TLD, I believe, but that inverts the problem. Don't force the public to go to a dark alley for their guilty pleasures. Instead, the sites that want to target kids need to be allowlisted. That is much more practical and palatable.

    • Yeah... the opposition was just a bad take, IMO. "But it will create a virtual red-light district" is EXACTLY what you want online: unlike in a physical city, you aren't going to accidentally take a wrong turn, and if you're blocking *.xxx it's even easier to avoid.

      Then require all nudity to be on .edu, .art, or .xxx, and the problem is mostly solved.


  • I don't see why phones can't come with a browser that does this. Parents could curate a whitelist the way people curate playlists, share it, and the browser would honor it (a rough sketch of the check is below).

    Combined with some blacklisted apps (e.g., all other browsers), this would be a passable opt-in solution. I'm sure there's either a subscription model or some other small incentive for someone to build this, one that hopefully isn't "scam children".

    It's not like kids are using PCs, and if they use someone else's phone, that's at least a severely limiting factor.
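
    A minimal sketch of that allowlist check, with an assumed format (one domain per line, '#' for comments) so parents could share lists as plain text files. This is an illustration, not any existing browser feature.

        from urllib.parse import urlparse

        def load_allowlist(text: str) -> set[str]:
            """Parse a shareable allowlist: one domain per line, '#' starts a comment."""
            return {
                line.strip().lower()
                for line in text.splitlines()
                if line.strip() and not line.strip().startswith("#")
            }

        def is_allowed(url: str, allowlist: set[str]) -> bool:
            """Allow a URL if its host matches a listed domain or a subdomain of one."""
            host = (urlparse(url).hostname or "").lower()
            return any(host == d or host.endswith("." + d) for d in allowlist)

        # Example: a list shared by a parent.
        allowlist = load_allowlist("# shared list\nkhanacademy.org\nwikipedia.org")
        print(is_allowed("https://en.wikipedia.org/wiki/Dog", allowlist))  # True
        print(is_allowed("https://example-social.test/feed", allowlist))   # False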

    • They do, don’t they? Apple devices have had a robust whitelisting/blacklisting feature for at least a couple of years; I use it to block websites and apps to lessen my phone addiction. I’m sure Android offers similar features.


Nice job of sidestepping the "fundamental question" of whether that can be done and what damage it would do. You do not get to answer the question as you posed it in a vacuum.

It's not a "robustness issue". Nobody has proposed anything that works at all.

But to answer your "fundamental question", no. Age gating is dumb. Giving parents total control is also dumb.

Can we actually prevent children under 16 from buying beer?

  • If they are persistent enough, no. But everyone knows it's not going to stop every child in every situation. It sets a precedent for what society thinks is a sensible limit, though, and society raises children, not just individual families or parents.

    Do we want kids becoming alcoholics? Do we want them turning up drunk to school and disrupting classes? Do we want to give parents trying to do the right thing some backup, so they know that when their kid is out alone they can expect other adults to set a similar example?

    Sure, you can't stop a kid determined to consume alcohol. But I think the societal norm is an overall good thing.

    The same should apply to the online space, where kids spend more and more time. Porn, social media, gambling, etc. should be just as much of a concern as alcohol.

  • Is there actually a difference between transactions between humans in meatspace (getting a government ID, then using it at a store) and age estimation algorithms?

It's never been about porn. By marking certain parts of the internet "adult-only" you imply that the rest is "family-friendly", and parents can feel less bad about leaving their children with iPads rather than actually parenting them, which is exactly what Big Tech wants, for obvious reasons. If I had a child I'd rather have it watch porn than Cocomelon, which has been scientifically engineered to turn your child's brain into seedless raspberry jam. Yet nobody's talking about the dangers of that, because everyone's occupied with <gasp> titties.

  • > If I had a child I'd rather have it watch porn than Cocomelon

    As a parent that regularly fears who my children will encounter in the world, I’m glad there’s an “if” at the beginning of this sentence.

    • Don't worry, most likely your children will come across the normal sorts of bad people - cheating partners, bullying peers, abusive bosses, rude customers, lying beggars, maybe robbers and thieves. It's fortunately unlikely they'll meet a guy who is outspoken about his opinion that scientifically capturing people's attention to get them addicted to screens is morally much worse than showing them "penis into vagina episode 74786". We don't want their innocent minds to be poisoned with ideas that question the status quo.