Comment by a2128

21 hours ago

Roblox's introduction of mandatory face verification for chat is one of the biggest examples of how people in tech can get so deep into building a solid technical solution that they completely miss the human problems it creates.

You could create the best possible face verification system that processes everything completely locally, uses CPU security features to make sure the photos stay exactly where they're supposed to, and so on. You could design the best possible chat age-segregation system that ensures nobody can ever be groomed over chat again. You can get so deep that you forget you're forcing children to take pictures of themselves, and fail to consider the wider effects this will have on those kids' safety in general.

How's Jimmy supposed to know that taking a picture of himself for roblox.com is okay, but taking a picture for somescamwebsite that he found in a Roblox game is absolutely not okay? This solution creates a much worse problem. Sane parenting would tell kids never to take pictures of themselves or put them on any website, but now we're clearly shifting the role of parenting to tech companies, and we're going to see the bad consequences of this.

Ideally Roblox would be able to rely on the platform to tell them whether the device is child-locked or not. It would be up to parents to make sure their kids only have access to devices with appropriate locks turned on. Parents could rely on vendors to make devices where it’s easy to set appropriate locks, and rely on stores not to sell unlocked devices to kids.

But we don’t live in that world.

Also, they are trying to prevent adults from pretending to be kids, which is much harder than preventing kids from accessing adult sites.

  • This is an interesting comment because there’s a parallel effort to shift age verification to devices, which draws a lot of hate here.

    • Because almost universally it's not privacy-preserving age verification, but permanently deanonymizing identification.

      Please, let's keep it accurate.

This feels like a bit of a reach. It's not really clear that adding face scanning as a blocker for chat makes anyone more likely to fall for scams. Couldn't these hypothetical god-tier engineers just build scam-prevention software in addition to face scanning anyway?

If 11-year-old Jimmy is anything like I was a lifetime ago (in terms of understanding tech), he knows how to ask an LLM to take his picture and make him look like he's 18... and none of it matters anyway.