Comment by fc417fc802
3 days ago
> ineffective non-sequiturs like requiring them to ID everyone.
Is that really a non-sequitur though? Cigarettes are harmful and addictive so their sale is age gated. So too for alcohol. Gambling? Also yes. So wouldn't age gating social media be entirely consistent in that case?
Not that I'm necessarily in favor of it. I agree that various other regulations, particularly interoperability, would likely address at least some of the underlying concerns. But then I think it might not be such a bad idea to have all of the above rather than one or the other.
If I went to the store and asked for a pack of cigarettes, I'd show my ID (well, I would if I were carded, but I'm no longer carded :)), the clerk would look at it, maybe scan it, then take my money.
If I try to go to an adult website, or even just a Discord server with adult content, I need to upload my ID. Now there are numerous third parties looking at my ID, and I have no idea if I can trust them with my info. Indeed, I probably can't, given how many of them have already been breached.
Of all people, PornHub actually has a pretty good write-up on this (1) (2). They refer to "device-based" age verification, where you verify your identity once to, say, Google or whoever. Then your device proves your age. Fewer middlemen. One source of truth.
I am not against age verification. I am against the surveillance state.
(1) https://www.pornhub.com/blog/age-verification-in-the-news
(2) https://www.xbiz.com/news/281228/opinion-why-device-based-ag...
> they refer to "device-based" age verification, where you verify your identity once to, say, Google or whoever. Then your device proves your age. Fewer middlemen. One source of truth.
This is still an absurdity. You don't need the device to prove the age of the user to the service, you need the service to provide the age restriction of the content to the device. Then the device knows if the user is an adult or a kid and thereby knows whether to display the content, and you don't need Google to know that.
Major porn sites already send an RTA header. Social media could be required to do something similar. However, I think part of the concern here is that many parents don't bother to restrict things. So the question is whether we want filtering similar to alcohol, where minors aren't permitted to possess it, or similar to porn, where the decision is left up to parents.
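For illustration, here's a minimal sketch of how a client-side filter might detect the RTA ("Restricted To Adults") label, which sites can expose either as an HTTP `Rating` response header or as an HTML meta tag. The function name is mine; the label string is the standard RTA label:

```python
# Sketch: detect the RTA label in a response's headers or body.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

def is_rta_labeled(headers: dict, html: str = "") -> bool:
    """Return True if the response headers or page body carry the RTA label."""
    # HTTP header names are case-insensitive, so normalize them first.
    normalized = {k.lower(): v for k, v in headers.items()}
    if RTA_LABEL in normalized.get("rating", ""):
        return True
    # Fallback: look for the label in a <meta name="rating"> tag in the body.
    return RTA_LABEL in html

print(is_rta_labeled({"Rating": "RTA-5042-1996-1400-1577-RTA"}))  # True
print(is_rta_labeled({}, '<meta name="rating" content="RTA-5042-1996-1400-1577-RTA">'))  # True
print(is_rta_labeled({"Content-Type": "text/html"}))  # False
```

The point being that the check runs entirely on the client, with no ID or identity ever sent anywhere.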
IRL
> If I went to the store and asked for a pack of cigarettes
online
> and I have no idea if I can trust them with my info
Why did you trust how your ID was scanned (if carded)?
With security cameras present, where did that scanned data end up?
nit: the Discord ID verification hasn't rolled out yet, has it?
No, I believe it's next month though.
> Is that really a non-sequitur though?
You have something (human communication) which is not intrinsically harmful -- indeed it is intrinsically necessary -- but has been made harmful on purpose. That is very much unlike those other things, where the harm is in their very nature and isn't prevented by the provider just not being a schmuck on purpose.
That makes age gating a farce, because kids need to be able to communicate with other people, but you would end up in one of these scenarios, each of which is inane:

1) Providers all put up age restrictions and meaningfully enforce them, and then teenagers are totally prohibited from communicating over the internet.

2) Providers all put up "age restrictions" which teenagers bypass in ten seconds, and the whole thing is a pointless fraud.

3) You try to separate places for kids from places for adults, but then either a) adults prefer adult spaces where they're not censored, so they congregate there and those spaces get the network effect, and then teens have to sneak in even if they're not looking for adult content because that's where the bulk of all content is, or b) nobody likes to show ID even if they're an adult, so adults congregate in the least restrictively moderated space where they don't have to show ID, and that space gets the network effect. Then to the extent that they censor, they're censoring the adults, which is the thing that wasn't supposed to happen, and to the extent that they don't censor, you have a "kid space" that contains adult content.
It's a trash fire specifically because there's a network effect, which is an aggregating force causing adults and kids to be in the same space so they can communicate with each other. Then the space with the network effect would either have to censor the adults even though they can't leave because of the network effect, or not censor the adults and then have adult content in the space the kids have to be because of the network effect.
The way you fix this is not by trying to separate the kids and adults into separate networks, it's by tagging specific content so the client device can choose not to display adult content if the user is a kid. Which also solves the privacy issue, because you don't have to provide any ID to the service when the choice of what content to display happens on the client and the service is only tasked with identifying the content.
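As a sketch of that model (all names here are illustrative, not any real API): the service only tags each item with an age restriction, the device holds the user profile locally, and the display decision never leaves the client:

```python
from dataclasses import dataclass

@dataclass
class Content:
    body: str
    min_age: int  # age restriction supplied by the service, e.g. 0 or 18

@dataclass
class DeviceProfile:
    user_age: int  # known only to the device, set by a parent or the owner

def should_display(item: Content, profile: DeviceProfile) -> bool:
    """The filtering decision is made entirely on the client device."""
    return profile.user_age >= item.min_age

# A kid's device filters a mixed feed without telling the service anything.
feed = [Content("cat pictures", 0), Content("adult content", 18)]
kid = DeviceProfile(user_age=13)
visible = [c.body for c in feed if should_display(c, kid)]
print(visible)  # ['cat pictures']
```

The service never learns the user's age; the device never uploads an ID.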