Comment by timmg

1 month ago

> Keep in mind that this case is about a minor, not an adult.

This obviously means that tech is going to have no choice but to do "age verification". And I don't think there's much of a way to do that that wouldn't be uncomfortable for a lot of us.

I would rather Meta make their products less addictive for children, even if that makes them somewhat less stimulating for adults, than keep the products as they are, gatekept behind a system that grants Meta access to even more of my personal data.

I understand why they would want the opposite. They can f*ck right off.

Oh, corporations pushed for age verification, so of course they will "have no choice" now. But before that, they could simply have stopped being addictive, regardless of age.

There are ways, like double-blind age verification, in which the website learns nothing other than "yes, >18", and the verifier learns nothing other than "I was asked whether user X is >18; I checked; yes". The website doesn't learn the actual age, and the verifier doesn't learn which website made the request or what action it was for.

In fact it's even in the EU Commission's official guidance on how it should be done: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ:C... (point 46).
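One way to get that unlinkability property is a Chaum-style blind signature: the user blinds a token before the verifier signs it, so the verifier attests "over 18" without ever seeing which token (and hence which website) it is signing. This is a toy sketch with textbook RSA and tiny primes, purely to illustrate the math; a real deployment would use a proper cryptographic library and standardized protocols, and the specific numbers here are made up:

```python
# Toy Chaum-style blind signature illustrating "double-blind" age verification.
# WARNING: textbook RSA with tiny hard-coded primes -- illustration only,
# not remotely secure.

p, q = 61, 53
n = p * q                      # verifier's RSA modulus
phi = (p - 1) * (q - 1)
e = 17                         # verifier's public exponent
d = pow(e, -1, phi)           # verifier's private exponent

# Token the website will later check; in practice a hash binding
# "over18" to a per-site nonce, here just a placeholder number.
m = 1234 % n

# --- User: blind the token so the verifier can't link it to a site ---
r = 7                                 # blinding factor, coprime to n
blinded = (m * pow(r, e, n)) % n      # verifier sees only this

# --- Verifier: checks the user's age offline, signs the *blinded* token ---
blind_sig = pow(blinded, d, n)        # (m * r^e)^d = m^d * r  (mod n)

# --- User: strip the blinding factor to recover a valid signature on m ---
sig = (blind_sig * pow(r, -1, n)) % n  # = m^d  (mod n)

# --- Website: signature proves "over 18" without learning who the user is ---
assert pow(sig, e, n) == m
```

The verifier only ever sees `blinded`, which is statistically unlinkable to `m`, so it cannot tell which website the token ends up at; the website only sees `sig` and the verifier's public key, so it learns nothing beyond the single attested bit.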

Or assign responsibility to…parents and legal guardians…who are not children.

  • Meta is not blameless here. Responsibility can be shared when Meta (and others) are essentially preying on children. It’s an uphill battle for parents by Meta’s design.

  • Sure, parents do bear some responsibility here too. But we are talking about a platform that is engineered to be addictive to adults too. So it’s not as if the platform isn’t still predatory even if we find a way to parent every child on the internet.

  • It would work if parents had legal recourse to seek justice against corporations that stalk, groom, and manipulate their children against their wishes.