
Comment by stuffn

8 hours ago

> If law enforcement or spy agency asked for a dragnet warrant like "find me all of the people that might be guilty of XYZ" or "find me something this individual might be guilty of"; tech companies have a moral obligation to resist, in the best interest of humanity.

There is more evidence that they will do this than that they won't. ChatGPT is a giant dragnet, and 15 years ago I would've argued it was probably operated and funded entirely by the NSA. The police can already obtain a "geofence warrant" today. We're no more than one senator up for re-election away from having a new law forced down our throats "for the children" that lets them mine OpenAI data. That is, if they don't already have a Room 641A located in their HQ.

People pour their lives out into these fuzzy word predictors. OpenAI is sitting on a treasure trove of personal data, personality data, and other data that could be used for all kinds of intelligence work.

This is objectively bad regardless of how bad the criminal is. The last nearly 40 years of history, and especially the post-9/11 world, show that if we don't stand up for these people, the government will tread all over our most fundamental rights in the name of children/security/etc.

Basic rights aren't determined by how "good people" use them. They are entirely determined by how we treat "bad people" under them.

Just wait until AI is advanced enough that you can buy an AI best friend who will be with you all your life. I'm reminded of K's holographic companion, Joi, in Blade Runner 2049. The only thing they got wrong was that she wasn't collecting data for the megacorp.

Then again, the AI will certainly be "free".