Comment by cosmos0072
1 day ago
I am from the EU, and I am opposed to age verification laws in general.
My stance is that if somebody is a minor, his/her/their parents/tutors/legal guardians are responsible for what they can and cannot do online, and that the mechanism to enforce that is parental control on devices.
Having said that, open-source zero-knowledge proofs are infinitely less evil (I refuse to say "better") than commercial cloud-based age monitoring baked into every OS.
> Having said that, open-source zero-knowledge proofs are infinitely less evil (I refuse to say "better") than commercial cloud-based age monitoring baked into every OS
To be honest, I worry that the framing of this legislation and ZKP generally presents a false dichotomy, where second-option bias[1] prevails because of the draconian first option.
There's always another option: don't implement age verification laws at all.
App and website developers shouldn't be burdened with extra costly liability to make sure someone's kids don't read a curse word, parents can use the plethora of parental controls on the market if they're that worried.
[1] https://rationalwiki.org/wiki/Appeal_to_the_minority#Second-...
> App and website developers shouldn't be burdened with extra costly liability
Why not? Physical businesses have liability if they provide age restricted items to children. As far as I know, strip clubs are liable for who enters. Selling alcohol to a child carries personal criminal liability for store clerks. Assuming society decides to restrict something from children, why should online businesses be exempt?
On who should be responsible, parents or businesses, historically the answer has been both. Parents have decision making authority. Businesses must not undermine that by providing service to minors.
> Why not?
This implies the creation of an infrastructure for the total surveillance of citizens, unlike age verification by physical businesses.
> Physical businesses have liability if they provide age restricted items to children.
Ok, suppose the strip club is the website, and the club's door is the OS.
Would you fine the door's manufacturer for teens getting into the strip club?
> Physical businesses have liability if they provide age restricted items to children.
These are often clear cut. They're physical, controlled items: tobacco, alcohol, guns, physical porn, and sometimes things like spray paint.
The internet is not. There are people who believe discussions about human sexuality (ie "how do I know if I'm gay?") should be age restricted. There are people who believe any discussion about the human form should be age restricted. What about discussions of other forms of government? Plenty would prefer their children not be able to learn about communism from anywhere other than the Victims of Communism Memorial Foundation.
The landscape of age restricting information is infinitely more complex than age restricting physical items. This complexity enables certain actors to censor wide swaths of information due to a provider's fear of liability.
This is closer to a law that says "if a store sells an item that is used to damage property whatsoever, they are liable", so now the store owner must fear that a full can of soda could be used to break a window.
> Physical businesses
Physical businesses nominally aren't selling their items to people across state or country borders.
Of course, we threw that out when we decided people could buy things online. How'd that tax loophole turn out?
For one thing, it's fairly uncommon for children to purchase operating systems. As long as there is one major operating system with age verification, parents (or teachers) who want software restrictions on their children can simply provide that one. The existence of operating systems without age verification does not actually create a problem as long as the parents are at least somewhat aware of what is installed at device level on their child's computer, which is an awful lot easier than policing every single webpage the kid visits.
> App and website developers shouldn't be burdened with extra costly liability to make sure someone's kids don't read a curse word, parents can use the plethora of parental controls on the market if they're that worried.
App and website operators should add one static header. [1] That's it, nothing more. Site operators could do this in their sleep.
User-agents must look for said header [1] and activate parental controls if they were enabled on the device by a parent. That's it, nothing more. No signalling to a website, no leaking data, no tracking, no identifying. A junior developer could do this in their sleep.
None of this will happen of course as bribery (lobbying) is involved.
[1] - https://news.ycombinator.com/item?id=46152074
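The split described above (site declares its content, device decides) can be sketched in a few lines. This is a hypothetical illustration only: the header name `Content-Rating` and its values are stand-ins, not the actual scheme from the linked comment. The point it shows is that the whole decision happens on the device, so nothing about the user is ever sent to the site.

```python
# Hypothetical sketch of a user-agent-side check for a site-declared
# content-rating header. "Content-Rating" and its values are stand-ins.

def should_block(response_headers: dict, parental_controls_on: bool) -> bool:
    """Return True if the page should be blocked on this device.

    The site declares its rating; the decision happens entirely on the
    device, so no signal about the user leaks back to the website.
    """
    if not parental_controls_on:
        return False  # unmanaged (adult) devices see everything
    rating = response_headers.get("Content-Rating", "").lower()
    return rating == "adult"

# An adult-rated site is blocked only when a parent enabled controls:
print(should_block({"Content-Rating": "adult"}, parental_controls_on=True))   # True
print(should_block({"Content-Rating": "adult"}, parental_controls_on=False))  # False
print(should_block({}, parental_controls_on=True))                            # False (untagged)
```

Note the deliberate asymmetry: an untagged site passes through, so only sites that actually serve adult content need to do anything at all.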
Practically, instead of requiring that sites verify age, require that they serve adult content with standardized headers. Devices can then be marketed as "child-safe" which refuse to display content with such headers.
ZKP methods are just as draconian as they rely on locking down end user devices with remote attestation, which is why they're being pushed by Google ("Safety" net, WEI, etc).
The real answer to the problem is for websites/appstores to publish tags that are legally binding assertions of age appropriateness, and then browsers/systems can be configured to use those tags to only show appropriate content to their intended user.
This also gives parents the ability to additionally decide other types of websites are not suitable for their children, rather than trusting websites themselves to make that decision within the context of their regulatory capture. For example imagine a Facebook4Kidz website that vets posts as being age appropriate, but does nothing to alleviate the dopamine drip mechanics.
There has been a market failure here, so it wouldn't be unreasonable for legislation to dictate that large websites must implement these tags (over a certain number of users), and that popular mobile operating systems / browsers implement the parental controls functionality. But there would be no need to cover all websites and operating systems - untagged websites fail as unavailable in the kid-appropriate browsers, and parents would only give devices with parental controls enabled to their kids.
> The real answer to the problem is for websites/appstores to publish tags that are legally binding assertions of age appropriateness, and then browsers/systems can be configured to use those tags to only show appropriate content to their intended user.
Agreed. Recycling a comment on the reasons for it to be that way:
___________
1. Most of the dollar costs of making it all happen will be paid by the people who actually need/use the feature.
2. No toxic Orwellian panopticon.
3. Key enforcement falls into a realm non-technical parents can actually observe and act upon: What device is little Timmy holding?
4. Every site in the world will not need a monthly update to handle Elbonia's rite of manhood on the 17th lunar year to make it permitted to see bare ankles. Instead, parents of that region/religion can download their own damn plugin.
> There's always another option: don't implement age verification laws at all.
Where do you go to vote for this option?
The concern is ubiquitous, all-pervasive surveillance, control, and manipulation by algorithmic social media, and its objective consequences for child development and well-being. Not "kids reading a bad word". Disagree all you want, but don't twist the premise.
Surely you can find a rationalwiki article for your fallacy too.
If you want to avoid all pervasive surveillance, it might be wise to not mandate all pervasive surveillance in the OS by law.
In fact, I suspect adults, and not just children, would also appreciate it if the pervasive surveillance was simply banned, instead of trying to age gate it. Why should bad actors be allowed to prey on adults?
>Disagree all you want, but don't twist the premise.
The two billion dollars are what's twisting it.
You mean the same social media companies that want this legislation and wrote it themselves? The same legislation that introduces more surveillance and tracking for everyone, including kids?
Also, I heard the same thing about video games, TV shows, D&D, texting and even youth novels. It's yet another moral panic.
From the Guardian[1]:
> Social media time does not increase teenagers’ mental health problems – study
> Research finds no evidence heavier social media use or more gaming increases symptoms of anxiety or depression
> Screen time spent gaming or on social media does not cause mental health problems in teenagers, according to a large-scale study.
> With ministers in the UK considering whether to follow Australia’s example by banning social media use for under-16s, the findings challenge concerns that long periods spent gaming or scrolling TikTok or Instagram are driving an increase in teenagers’ depression, anxiety and other mental health conditions.
> Researchers at the University of Manchester followed 25,000 11- to 14-year-olds over three school years, tracking their self-reported social media habits, gaming frequency and emotional difficulties to find out whether technology use genuinely predicted later mental health difficulties.
From Nature[2]:
> Time spent on social media among the least influential factors in adolescent mental health
From the Atlantic[3] with citations in the article:
> The Panic Over Smartphones Doesn’t Help Teens, It may only make things worse.
> I am a developmental psychologist[4], and for the past 20 years, I have worked to identify how children develop mental illnesses. Since 2008, I have studied 10-to-15-year-olds using their mobile phones, with the goal of testing how a wide range of their daily experiences, including their digital-technology use, influences their mental health. My colleagues and I have repeatedly failed to find[5] compelling support for the claim that digital-technology use is a major contributor to adolescent depression and other mental-health symptoms.
> Many other researchers have found the same[6]. In fact, a recent[6] study and a review of research[7] on social media and depression concluded that social media is one of the least influential factors in predicting adolescents’ mental health. The most influential factors include a family history of mental disorder; early exposure to adversity, such as violence and discrimination; and school- and family-related stressors, among others. At the end of last year, the National Academies of Sciences, Engineering, and Medicine released a report[8] concluding, “Available research that links social media to health shows small effects and weak associations, which may be influenced by a combination of good and bad experiences. Contrary to the current cultural narrative that social media is universally harmful to adolescents, the reality is more complicated.”
[1] https://www.theguardian.com/media/2026/jan/14/social-media-t...
[2] https://www.nature.com/articles/s44220-023-00063-7
[3] https://www.theatlantic.com/technology/archive/2024/05/candi...
[4] https://adaptlab.org/
[5] https://pubmed.ncbi.nlm.nih.gov/31929951/
[6] https://www.nature.com/articles/s44220-023-00063-7#:~:text=G...
[7] https://pubmed.ncbi.nlm.nih.gov/32734903/
[8] https://nap.nationalacademies.org/resource/27396/Highlights_...
Yes! This is the way: give parents the ABILITY to advertise the user's age to browsers, apps, and everything in between. Only target corporations; do not target open source projects. Fine websites for not using this API (e.g. porn sites). Assume an adult if not present.
> Fine websites for not using this API (ex: porn sites).
Recent posters here are clear that porn sites are setting every available signal that they are serving adult-only content.
According to them, you are targeting the wrong audience.
Facebook/Instagram studying how to get young users addicted should be of greater concern. I have my doubts about the effectiveness of age-based blocking there, though.
> Facebook/Instagram studying how to get young users addicted should be of greater concern. I have my doubts about the effectiveness of age-based blocking there, though.
Yeah quite the opposite. Once they have that formalized attestation they will move in like sharks.
Both are problems; porn sites have also targeted children, and any non-enforced age "verification" on these sites is simply plausible deniability that isn't plausible at all.
No. This is not the way.
> give parents the ABILITY to advertise the users age to browsers, apps and everything in between.
Accounts and applications for services that provide content are set to country-specific age-rating restrictions (PG, 12+, 18+, whatever). That's it.
None of the things you mentioned have any reason to concern themselves with the age or age bracket of the user in front of the device. This can and will be abused. This is very obvious. Think about it.
Why should the applications get to decide if they are appropriate for a particular age? Shouldn't that be up to the parent? I shouldn't need to tell my kid: "Well, to use this compiler software, you need to set your age to 18 temporarily, because some product manager 3,000 miles away decided to rate it 18+. But, set it back to age 13 afterwards because you shouldn't be on adult sites." It's stupid.
That is what I meant by age (rating); you are correct. However, drop the country specifics: too complicated. Age brackets are enough: child, preteen, teen, adult. At around 16-17 these should be dropped anyway, since at that point people are smart enough to get around these measures and usually have non-parent-controlled devices.
This is a great solution to the stated problem. The issue is that nobody is actually trying to solve the stated problem. This is a terrible solution to the real 'problem' which is the lack of surveillance power and information control.
>This is a great solution to the stated problem. The issue is that nobody is actually trying to solve the stated problem. This is a terrible solution to the real 'problem' which is the lack of surveillance power and information control.
On the Sony consoles I created an account for my child, and they have implemented mechanisms to block children from adult content.
So if Big Tech actually wanted to prevent these laws from being created, they could make it easy for a parent to set up an account for a child (most children these days have mobile devices and consoles, so they could start with those). We just need the browsers to read the age flag from the OS and put it in a header; then website owners can respect that flag.
I know someone will say that some clever teen could crack their locked-down Windows/Linux to change the flag, but that is a super rare case; we should start with the 99% of cases. Mobile phones and consoles are already locked down, so an OS API that tells the browser whether this is a child account, plus a browser header, would solve the issue. Most porn or similar adult sites would have no reason not to respect this header; it would make their job easier than, say, Steam always popping up a birth-date prompt for a mature game.
Three states now implement this solution that you just called a great solution, and most of HN still hates it. Are they seeing something that you're not? https://news.ycombinator.com/item?id=47357294
This is what I think. I saw someone else on HN suggest providing an `X-User-Age` header to these sites, and giving parents a password-protected page to set it in the browser/OS.
Responsibility should be on the website not to provide the content if the header is sent with an inappropriate age, and on the parent to set it up on the device, or not to provide a child a device without child-safe restrictions.
It seems very obviously simple to me, and I don't see why any of these other systems have gained steam everywhere all of a sudden (apart from a desire to enhance tracking).
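A minimal sketch of the server-side half of this idea, using the `X-User-Age` header mentioned above. The age threshold and the assume-adult-when-absent default are assumptions borrowed from other comments in this thread, not a specification.

```python
# Sketch of a website's side of the X-User-Age proposal: withhold
# content when the device-sent age is below this site's threshold,
# and assume an adult when the header is absent (one proposal in this
# thread uses exactly that default).

MIN_AGE = 18  # this site's own threshold (an assumption for the example)

def serve_content(request_headers: dict) -> str:
    raw_age = request_headers.get("X-User-Age")
    if raw_age is None:
        return "content"  # no header: assume adult
    try:
        age = int(raw_age)
    except ValueError:
        return "content"  # malformed header: fall back to the default
    return "content" if age >= MIN_AGE else "blocked"

print(serve_content({"X-User-Age": "12"}))  # blocked
print(serve_content({"X-User-Age": "30"}))  # content
print(serve_content({}))                    # content
```

The enforcement burden this leaves on the site is a single header check, which is the comment's point: the complexity (and liability) stays with the parent-administered device.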
Seems simple until you try to figure out what's allowed for what age, which surely will differ by country at a minimum.
That is correct: the parents are meant to pass on morals and parent the child. If the parents fall through, there is the community: church, neighbors, schools, etc. The absolute last resort is government or law-enforcement intervention, and it should be considered an extreme situation. But as John Adams noted, "Our Constitution was made only for a moral and religious people"; in other words, all these laws start to rip at the seams when the fabric of society, the people who make up the society no longer have morals. But I appreciate this article in general; we need to fight against mass surveillance at all costs.
>all these laws start to rip at the seams when the fabric of society, the people who make up the society no longer have morals
Morals like owning slaves, right?
A moral system that requires everyone to be white Christian males isn't a moral system, it's a theocracy.
"mechanism to enforce that is parental control on devices."
Meh, I use it, but it's super annoying and I think that with my Daughter I'll take a different approach (but it will be some years before that is relevant).
On Android: the kid can easily get on Snapchat (after approval of the install, of course, and then you can just see their "friends") before Pokemon Go (just a pain to get working; it kept presenting some borked version, which led to a lot of confusion at first). I just lied about his age in a bunch of places at some point. From our experiences in the first week, Snapchat is horrible and sick.
On Windows: it's a curated set of websites (and no Firefox) or access to everything. It's not even workable for just school. Granting kids access to our own Minecraft servers: my god, I felt dirty about what the other parents had to go through to enable that.
> Granting kids access to our own Minecraft servers: My god, I felt dirty about what the other parents had to go through to enable that.
This is a hobby horse of mine to the point that coworkers probably wish I'd just stfu about Minecraft - but holy shit is it crazy how many different things you need to get right to get kids playing together.
I genuinely have no idea how parents without years of "navigating technical bullshit" experience ever manage to make it happen. Juggling Microsoft accounts, Nintendo accounts, menu-diving through one of 37 different account details pages, Xbox accounts, GamePass subscriptions - it's just fucking crazy!
I always wonder about this. I read most dialogs, but man, the sanity of most people must require that they just next-next-next through this stuff, right? Perhaps they even let their kids do it instead.
> My stance is that if somebody is a minor, his/her/their parents/tutors/legal guardian are responsible for what they can/cannot do online
As a parent, sure, that is my stance as well. What... what other stances are there even? How would they work?
The steelman argument is that parents are not necessarily up to date on the technology, and cannot reasonably be expected to supervise teenagers 24/7 up to the age of 18. Compare movie ratings or alcohol laws, for example: there's a non-parental obligation on third parties not to provide alcohol to children or let them in to R18 showings.
But the implementation matters, and almost all of these bills internationally are being done in bad faith by coordinated big-money groups against technologically illiterate and reactionary populist governments.
(if we really want to get into an argument, there's what the UK calls "Gillick competence": the ability of children to seek medical treatment without the knowledge and against the will of their parents)
In the UK, parents can give children alcohol below the age of 18. Parents get to make the final decision at home, so I do not think it's really comparable.
I would personally favour allowing parents to buy drinks for children below the current limits (18 without a meal, 16 for wine, beer and cider with a meal).
The alternative to this is empowering parents by regulating SIM cards (child safe cards already exist) and allowing parents to control internet connectivity either through the ISP or at the router - far better than regulating general purpose devices. The devices come with sensible defaults that parents can change.
That steelman still stands on a core assumption that it's both the state's responsibility and its right to step in and parent on everyone's behalf.
Maybe a majority of people today agree with that, but I know I don't and I never hear that assumption debated directly.
The other stance is that most parents are not capable of winning a battle against tech giants for the mind of their children, just as parents were not capable of winning this fight with tobacco and alcohol companies.
The tech giants want this. They drafted the bill. They paid tens of millions of dollars to promote it. Think about that for a minute.
If this had anything to do with reining in tech giants, it would be done for adults as well, without restricting anyone's rights (well, aside from the people-corporations', of course). The issues are the manipulative algorithmic datafeeds, advertising, and datamining. Age verification does nothing for any of this and only provides the tech giants and governments the means to secure even more control over people.
ignore parent, outsource parenting to gov verification authority
TBH many parents have done exactly that by giving phones/tablets to kids who are still in strollers.
The latter is true, but we cannot regulate the vast majority of parents on the basis of the worst.
You could make the same case that parental control is evil.
"You‘re reading about evolution! Not in my house"
Parents already have a lot of control over children's education.
Examples: most children believe in the same religion as their parents, and can visit friends and places only if/when allowed by their parents.
This is simply extending the same level of control to the internet.
Government-mandated restrictions are completely another level.
I have personally worked with parents trying to prevent their children from using social media and it’s nearly impossible. Kids are almost always more tech savvy than their parents and unlike smoking it’s nearly impossible to tell a child is doing so without watching them 100% of the time.
Who checks your age when you try to buy alcohol?
Who checks your age when you want to see an R-rated movie?
> This is simply extending the same level of control to the internet.
More control for parents is a completely different level.
> My stance is that if somebody is a minor, his/her/their parents/tutors/legal guardian are responsible for what they can/cannot do online, and that the mechanism to enforce that is parental control on devices.
Imho there is a place for regulation in that, actually. Devices that parents are managing as child devices could include an OS API and browser HTTP header for "hey is this a child?" These devices are functionally adminned by the parent so the owner of the device is still in control, just not the user.
Just like the cookie thing - these things should all be HTTP headers.
"This site is requesting your something, do you want to send it?
Y/N [X] remember my choice."
Do that for GPS, browser fingerprint, off-domain tracking cookies (not the stupid cookie banner), adulthood information, etc.
It would be perfectly reasonable for the EU to legislate that. "OS and browsers are required to offer an API to expose age verification status of the client, and the device is required to let an administrative user set it, and provide instructions to parents on how to lock down a device such that their child user's device will be marked as a child without the ability for the child to change it".
Either way, though, I'm far more worried about children being radicalized online by political extremists than I am about them occasionally seeing a penis. And a lot of radicalizing content is not considered "adult".
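The device-side arrangement described above (an admin-only child flag that the browser translates into a header) can be sketched as follows. All names here (`DeviceSettings`, `X-Child-Device`) are illustrative inventions, not any real OS API.

```python
# Sketch of the device side: an admin-only "child device" flag,
# exposed to the browser so it can attach a header. The owner of the
# device (the parent) stays in control; the user (the child) cannot
# change it.

class DeviceSettings:
    def __init__(self):
        self._child_device = False

    def set_child_device(self, value: bool, caller_is_admin: bool) -> None:
        # Only the administrative user may change the flag.
        if not caller_is_admin:
            raise PermissionError("only the device admin can change this")
        self._child_device = value

    def browser_headers(self) -> dict:
        # The browser attaches only this one flag; nothing else about
        # the user leaks to websites.
        return {"X-Child-Device": "1"} if self._child_device else {}

settings = DeviceSettings()
settings.set_child_device(True, caller_is_admin=True)
print(settings.browser_headers())  # {'X-Child-Device': '1'}
```

This matches the enforcement model the comment proposes: the EU would mandate the API and the admin lock, and websites would only ever see a single boolean.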
Same here: an EU citizen who thinks parents should do some parenting, after all. However, try confronting "modern" parents with your position. Many of them will fight you immediately, because they think the state is supposed to do their work... It's a very concerning development.
I'll go further. As a human being, I am responsible for myself. I grew up in an extremely abusive, impoverished, cult-like religious home where anything not approved by White Jesus was disallowed.
I owe everything about who I am today to learning how to circumvent firewalls and other forms of restriction. I would almost certainly be dead if I hadn't learned to socialize and program on the web despite it being strictly forbidden at home. Most of my interests, politics and personality were forged at 2am, as quiet as possible, browsing the web on live discs. I now support myself through those interests.
We're so quick to forget that kids are people, too. And today, they often know how to safely navigate the internet better than their aging caretakers who have allowed editorial "news" and social media to warp their minds.
Even for people who think they're really doing a good thing by supporting these kinds of insane laws that are designed to restrict our 1A rights: the road to hell is paved with good intentions.
This is obviously where it's going to go, at least in the US. Things that are non-religious, non-Christian especially, pro-LGBT, and similar will be disproportionately pulled under "adult content" to ensure that children are not able to be exposed to unapproved ideas during formative years.
That has already been going on for decades, with satanic panic and banning of library books.
Exactly. Having lived through it already, I know what it did to me and I would never wish that upon another child. The internet saved me from being a religious, colonial, racist piece of shit like the rest of my family.