Oh, but it will get worse. Legislation to force companies to install surveillance tech in their devices/apps is already being pushed left and right. We are still screaming a little about it, but I think it's a matter of time before it gets normalized and the state goes for the next level, which will be to prosecute individuals who try to evade the surveillance net. The recent case with GrapheneOS[^1], while still far from being an example of this, is sufficient to inspire some legislators...
That's why we need to get as many people on surveillance-free devices as quickly as possible. 400K users [1] may be easy to ignore or make suspect, 4M is a little harder, 40M is a serious blip on the radar, 400M is a major force (one can dream).
If you do not like surveillance capitalism (which enables government surveillance), get a compatible phone and install GrapheneOS now. Help family and friends get set up tomorrow. Make it a force too large to ignore before the legislation is there (legislation is somewhat slow, so there is a window of opportunity).
People have been bending over for policies and changes with way more impact on their everyday lives and livelihoods, and they'll rise up for this? That's daydreaming.
I couldn't care less about surveillance on my phone or the internet. I can just stop using the internet for anything besides the necessities. It’s physical, IRL surveillance that is a nightmare. You can’t escape it, there’s no way to opt out and consent doesn’t matter.
Phone surveillance is IRL surveillance. That's because your phone connects to cell towers that exist in the real world, and they can snitch on your precise location in real time.
"I don't care for surveillance on devices/internet because I can always cut myself off from the thing 8 billion people use, which has become absolutely essential, and is often mandated or strongly pushed, for work, banking, and even government interactions"
The time to resist against these policies and technologies was 2-5 years ago.
Every single person in the US's future, safety, rights and freedom is currently at stake. There is no more time left to wait and see how things play out.
More like 13 years ago, when the Snowden revelations made the reach of this public. Nothing was done, and it has kept expanding into today's state of things. No one should be surprised.
And beyond the domestic surveillance, which drew some complaints back then, there is foreign surveillance and intervention, which saw no slowdown at the time, so you can figure out where that stands today. At least Americans have some say in their government and policies; for the rest of the world this is just the new normal.
> More like 13 years ago, when Snowden revelations made the reach of this public. Nothing was done, and this kept expanding till today state of things. No one should be surprised.
Yeah, Obama was president at the time.
A lot of fanfare and then nothing happened.
People were also being deported by ICE, in larger quantities, but that didn’t even make the news.
It’s always “weird” when the same action gets a different connotation depending on who’s president…
When it comes to Flock in particular I’ve been seeing a lot more in terms of resistance and pushback in local Reddit communities. At least in my city’s sub I see posts regarding anti-Flock messaging or related activities at least once a week now.
Well yeah, YC is a tech incubator plugged pretty deep into the SV hivemind, and its leading figures seem to have decided that fascism is a better alternative to any kind of regulation on their activities.
> time to resist against these policies and technologies was 2-5 years ago
The time to resist the next crop of policies and technologies is today.
And I disagree that the ground was more fertile for action during Covid. The silver lining to the AI companies’ PR and political ineptitude is that there is widespread, bipartisan pushback against tech of all stripes.
It's been a lot longer than that. You may have forgotten to include the encroachments that you supported; you seem only to have started with the disaster under Biden.
For example, I never hear about how hard librarians* fought against "National Security Letters" after 9/11. How quaint it now seems that people once thought there should be a fundamental right to read freely, without disclosing what you read to anyone, especially governments.
Technology has only made this cheap to do at scale.
For people who may not be familiar, the government insisted on the right to go into libraries and get a list of the books you've read. Hell, it's basically just a "pen register**," and the culture not only gave up on resisting that this data be considered private, but forgot why anyone would have ever thought that way.
Now we're arguing about forced digital attestation, but we're barely arguing about digital ID anymore ("of course" we need that), or even remember that most people were against federal identification in the US. Federal identification failed at every point to gain any support; it was pushed hard and failed during the Clinton admin, finally passed with everything else of this nature after 9/11, and then it was resisted and ignored enough to force deadlines to be pushed farther and farther back - it's been 30 years of RealID at this point.
There's no evidence that the population ever supported federal ID. The idea was forced upon them, and they just waited a generation for people to forget that the government once didn't even know or care that many people existed. 30 years from now, it will probably be weird trivia that the census was done anonymously: "You mean you didn't have to sign it under penalty of perjury? What would be the point of the data if you didn't know who it belonged to?"
In 5 days, on May 27, 2026, you'll have to pay a $45 fee to get on a plane if you don't have a Real ID.
It's so obvious that these claims of necessity are always just excuses for a power grab. British Labour, who spent decades supporting huge amounts of immigration and then calling everyone who thought it was too much racist, now like Trump use the prevention of illegal immigration as a reason to impose digital ID on everyone. They're xenophobes when it comes to tracking everyone's movements, but were xenophiles when they needed to lower wages. Vote Tory, then! Nope, they supported and oversaw every element of all of this. None of this stuff ever sees a ballot.
A surveillance state was always inevitable once wireless networking, GPS, and cameras were ubiquitous. If you say this isn't true, show me anywhere in the world with these technologies that is not headed down this path.
This makes for nice political slogans, but Hank Asher had the entire Florida DMV database in the early 90s and we did nothing.
Then we did nothing after 9/11 and the patriot act. We did nothing between 9/11 and Snowden.
We did nothing after Snowden. We have literally done nothing in 35 years but now is the time to start?
Snowden would probably be in political office in a society that had the will to do something about this.
At a deeper level, I think people would need to care more about the outcomes and higher-order effects of political decisions and not just the emotional weight of political slogans. The fundamental problem is that that is not the society we live in.
It was really tiny, inexpensive cameras and wireless networks. Cameras are everywhere now. They're so cheap they're almost free, and it doesn't require an expert to install them.
Uh France? It annoys me when people say this stuff is "inevitable." No, many countries have forcibly "reshaped" their government (French revolution, American revolution, etc etc) and nobody has any basis for saying it won't happen again, perhaps many more times.
The French Revolution is largely regarded as a tragedy. It led first to the Terror, and after that to a series of new monarchies over the following century.
Revolutions in most countries have generally replaced one faction of the ruling class with a competing faction of the ruling class, with little actual change for the people.
Europe is, compared to the US, doing a lot more for protection of private data. That includes strict guardrails on what data can be collected and how it is used.
Secret courts still exist but the phenomenon of random Flock employees spying on children in locker rooms at gyms is so much harder to get away with in a system with a modicum of decency.
Chat control was actually shot down, and that was the UK not Europe (anymore).
Laws are different in different places. The world is not composed of America and other-Americas.
Saying something was shot down isn't that strong an argument. The US government has proposed and shot down surveillance laws hundreds of times, until one finally passes.
A scene from "Like a Flowing River 2", a Chinese period drama set in the 1980s:
Lei Dongbao, party secretary of a small village, is courting the owner of a restaurant in a nearby city. He persuades her to let him care for her young son over the weekend.
As he's heading back to his village on his motorcycle with the boy seated behind him, he drives by some women resting in the shade by the side of the road. One of them remarks to another, "Why does the secretary have a child?"
By the time he arrives at his office, all of his subordinates - and one of their wives - have turned out to meet him and say hello to the child.
> Citizens, on the other hand, don’t like red light cameras because they don’t want to be fined. They complain that the cameras are an invasion of their privacy. I don’t buy that because I grew up in a small town, and as such I understand that privacy is a myth.
What’s the fix? What’s a simple rule change that would, at the very least, take these data out of law enforcement’s hands outside the most-necessary situations?
You may not realize it, but this isn't even about law enforcement. It's also about tech companies having the data: what they will do with it, and who they will sell or leak it to.
It's about the amount of data, and what it can be used for by military-adjacent organizations under a fascist regime. Whether you think the US is headed toward fascism or not, what if it did go there? That's the point.
> this isn't even about law enforcement. It's also about tech companies having the data
One is a clear and present danger. The other is a hypothetical danger. Both deserve being addressed. But if only one is going to get political capital, it should be the first.
(I've worked on technology privacy issues. My takeaway is the public is broadly fine with the tradeoff. Folks in tech are not. But folks in tech with strong views on privacy are politically useless due to a combination of self-defeating laziness and nihilism.)
> You may not realize it but this isn't even about law enforcement. It's also about tech companies having the data.
This. The lesson of the past decades is: if some organization has the data, eventually it becomes too attractive not to (ab)use it. Even Apple, which sold itself as a privacy-first company, is slowly adding more and more ads. Squeezing out more profits is just too attractive with the pile of data they are sitting on. Similarly, bad governments will require access to the data if they can.
Employees inside companies should push back against collection of data as much as possible (the GDPR helps a lot in Europe). If you do not have the data, you cannot use it in a user-hostile way in the future, and governments cannot request data that you do not have. If you have to store data, go for end-to-end encryption.
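To make the point about not having the data concrete, here is a minimal sketch of data minimization in Python. The field names and the salted-hash approach are illustrative assumptions, not any particular product's design, and a salted hash of a low-entropy identifier like an email only raises the cost of recovery rather than making it impossible:

```python
import hashlib
import os

# Sketch of data minimization: persist a salted hash of an identifier
# instead of the identifier itself. The service can still recognize a
# returning user, but holds no plaintext to leak, sell, or hand over.

def minimized_record(email: str, salt: bytes) -> dict:
    """Return the only thing we persist: a salted hash, never the email."""
    digest = hashlib.sha256(salt + email.strip().lower().encode()).hexdigest()
    return {"user_hash": digest}

salt = os.urandom(16)  # per-deployment secret salt
record = minimized_record("alice@example.com", salt)
assert "alice@example.com" not in str(record)  # raw email never stored
```

The same idea extends to logs and analytics: if records are keyed by hashes, a demand for "all data on this person" has little to return.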
Citizens should try to escape the Apple/Google duopoly (e.g. by installing GrapheneOS), block trackers, and only install the necessary apps (no app = no easy tracking). For apps that you do need, revoke as many sandbox privileges as possible.
An open-source, community-driven surveillance network that alerts the community when it is accessed by a select list of “trusted” governing officials. Clearly outlined access rules that are policy-driven, technically controlled, and auditable.
Sure Flock, we buy your safety pitch. We just don’t trust you.
> surveillance network that alerts the community when it is accessed by a select list of “trusted” governing officials
This is the worst of all worlds. Actual criminal investigations get thwarted or the reporting requirement gets diluted to the point of being useless (“someone looked for something today!”). And a burden of vigilance shifted onto the public.
None, because they are above the rules. You need actual enforcement.
Or the other guy's community network idea but it would have to also publish the realtime activities and whereabouts of all politicians who voted against making this illegal.
Much like the Video Privacy Protection Act, which stopped video rental companies from disclosing what their customers were renting; it passed after a Supreme Court nominee's video rental history was leaked.
They’re above the rules for a political cycle because we’re shifting to a system of spoils. That doesn’t change that everything they’re doing right now is legal. (Outside ICE. They’re a warren of criminality right now.)
No "simple rule", I'm afraid. Push money out of politics and aggressively redistribute wealth to curb inequalities, that's the only way to weaken the reactionary and authoritarian ideals currently flourishing. Until then, surveillance is a given.
The straightforward broad-brush fix is a US port of the GDPR. Make mass surveillance commercially unprofitable, and most of the data currently available to the government won't be collected in the first place. Furthermore, it's a basic line in the sand that gives individuals the idea that privacy is an actionable right, not just something to powerlessly complain about.
That this culture shift would need time to trickle down into outright bans on surveillance performed by the government (e.g. Flock), or into requiring audit trails for government use of commercial data that still gets collected, shows how far behind we are.
(I use the word "port" to indicate that we need to avoid letting lobbyists stuff it full of loopholes and regulatory capture the way everything else is. Heck I think we could do worse than copying the text verbatim and letting the courts sort it out)
Yeah. I really like the main idea behind GDPR, which is that data containing PII is the property of the person it describes, not of the companies that process the data to provide services.
This means that I, as the owner of my data, can refuse to provide it for some use cases, request its deletion, etc. It’s my data after all.
The older and more jaded I get, the more I think that the only way to fix this mess before we all die of climate change is to dump the entire US government off a cliff and write a new constitution.
> the only way to fix this mess before we all die of climate change is to dump the entire US government off a cliff and write a new constitution
In my opinion, we don't have public consensus on enough of the major questions to make this a fruitful endeavour.
One thing we need is a political movement to push for Constitutional amendments. My five are, in decreasing order of priority, (1) multi-member Congressional districts, (2) striking the pardon power, (3) abolishing the electoral college and creating a referendum requirement for major legislation, (4) changing the first sentence of Article II to "the President shall execute the laws of the United States," and (5) permitting the Congress to charter independent agencies for up to 20 years.
One idea I haven't seen much discussion on is "provably beneficial surveillance" [1], which builds off of Nick Bostrom's vulnerable world hypothesis. It seems like the best path forward.
>We can turn that conventional wisdom on its head, by reframing it as a question: is it possible to do surveillance and consequent policing in a way that is (a) compatible with or enhances liberal values, i.e., improving the welfare of all, except those undermining the common good; and also (b) sufficient to prevent catastrophic threats to society? I call this possibility Provably Beneficial Surveillance. It's a concept expanding on an old tradition of ideas, including search warrants, due process, habeas corpus, and Madisonian separation of powers, all of which help improve the balance of power between institutions and individuals. In particular, all those ideas help enable surveillance in service of safety, while also taking steps to prevent abuses of that power.
Salt Typhoon is the refutation to this. Building and enforcing a "lawful intercept" system formally codifies an exploit chain for your adversaries to use. If you don't want your politicians and dignitaries being blackmailed by foreign opposition, don't even consider this type of system for widespread development.
Let America be the canary in this particularly toxic coal mine, and refuse similar systems wherever you are locally.
Nope. That's not how any of this is trending at all. Being optimistic is good for getting through tough times, at least sometimes. It might help people sleep at night, but sleeping our way into technofascism won't make it any better for us or our children.
I point to Michael Nielsen's commentary on Vulnerable World Hypothesis [1] again:
>do you think inexpensive, easy-to-follow recipes for building catastrophic technologies will one day be found, given sufficient understanding of science and technology?
With every increase in technology and science, the probability increases, and as a result, society will necessitate ever more surveillance. The reason provably beneficial surveillance is important to discuss is that we need a careful middle path between totalitarianism and outright catastrophe. It is the opposite of "sleeping our way" into technofascism.
"Provably beneficial surveillance" is the wrong framing.
What you're trying to say is that the harms of surveillance are diminished when the underlying power is distributed enough that cops have to justify themselves in order to access the surveillance powers. That's why we have a 4th Amendment that demands cops get warrants before doing searches and seizures. Think of the difference between a store with a security camera that records to a local network DVR, and the same store but they bought some Ring cameras and send it to Amazon's servers. The former is the necessary amount of surveillance to prove a crime happened, the latter is just enabling abuse.
I think it is a new framing that merits discussion.
Example case is the school shooter in Canada that OpenAI knew about but chose not to warn authorities of (presumably because OpenAI wants to balance safety and privacy).
OpenAI (or any other big tech) has extreme concentration of power and knows more about its users than any government authority.
At what point should OpenAI alert authorities?
I would much rather have "provably beneficial surveillance" than OpenAI having an arbitrary black box policy or for government authority to have direct backdoor to all OpenAI data.
All the known history of humans is evidence against the possibility of existence of "beneficial surveillance".
This is a utopian idea of the same kind as the idea of theoretical communism.
The communist theory argued that, because the owners of assets can use their power in nefarious ways against others, this could easily be solved by dispossessing them of their assets and transforming all such private assets into common property owned by all people. Then all assets would be used for the welfare of the entire society.
The fallacy of this theory was that when something belongs to all people it is impossible for all people to manage it directly. So there must be a layer of relatively few middlemen who manage the assets directly.
In all the communist societies, instead of managing the assets for the common good, those middlemen succeeded in becoming the de facto owners of the assets, despite not being their owners de jure. And then they managed the assets according to their personal interests, like any capitalist billionaire.
The only difference was that the communist elite was much less secure in their positions than rich capitalists, because not being the legal owners of a company or of other such valuable assets meant that they could lose their privileges at any time if their boss in the communist party hierarchy no longer liked them and sent them to an inferior position.
This hierarchical dependence ensured that the communist elite had to obey more or less whatever the supreme leader ordered. Except for this obedience, there was no real difference between a communist economy and the extreme stage of monopolistic capitalism, despite what the naive theory of communism hoped to achieve by nationalizing everything of value.
Similarly, I see no hope for a theory of "beneficial surveillance". Such beneficial surveillance could exist only if it were controlled by well-meaning people. But that will never happen; as with communism in practice, some of the worst people will be the ones who succeed in controlling it.
I'm intrigued by Michael Nielsen's thoughts on cryptography applied to synthetic biology risk.
I'll quote his notes on using cryptography to maintain a balance of privacy and safety:
>To help address such concerns, it's been proposed that synthesis screening should use cryptographic ideas to help preserve customer privacy, while still ensuring safety. Let me mention three such ideas, some of which have already been implemented in a prototype system built by the SecureDNA collaboration. The first idea is that the screening itself should be done with an encrypted version of the sequence data, to help preserve customer privacy. The synthesis step would still require the raw sequence data, but such encryption would at least prevent centralized screening services from learning the sequence being synthesized. Second, as mentioned above, screening for exact matches and homologous sequences won't catch everything, especially as de novo design becomes possible. So it's also been proposed that an encrypted form of the sequence data should be logged and kept after synthesis. That data could not routinely be read by the synthesis company or screening service. However, suppose some later event occurs – say, some new pandemic agent is found in the wild. Then it should be possible to check whether that agent matches anything in the encrypted synthesis records. In the event such a check was needed, a third party authority could provide a kind of "search warrant" (a private key of some sort) to decrypt the data, and identify the responsible party. The third idea is to use cryptography to ensure the screening list remains private, and can even be updated privately by trusted third parties, without anyone else learning the contents of the update. Taken together, these three ideas would help preserve the balance of power between customers and the synthesis companies, while contributing to public safety and enabling imaginative new synthesis work to be done.
>Indeed, cryptographers are so clever that they've devised many techniques you might a priori deem impossible, or not even consider at all. Ideas like zero knowledge proofs, homomorphic encryption, and secret sharing are remarkable. As software (and AI) eats the world, cryptography will increasingly define the boundaries of law.
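As a toy sketch of the first idea in the quote (screening against blinded windows rather than raw sequences), one can compare keyed hashes. The window size, the shared key, and all function names below are simplifying assumptions; real systems such as SecureDNA use far more careful cryptography, not least because a screener that holds the HMAC key could still brute-force short sequences:

```python
import hashlib
import hmac

WINDOW = 8  # window length in bases; real screeners use much longer k-mers

def windows(seq: str, k: int = WINDOW):
    """Every overlapping k-length window of a sequence."""
    return (seq[i:i + k] for i in range(len(seq) - k + 1))

def blind(window: str, key: bytes) -> str:
    """Keyed hash of a window; only these digests are stored or compared."""
    return hmac.new(key, window.encode(), hashlib.sha256).hexdigest()

def build_hazard_index(hazard_seqs, key: bytes) -> set:
    """The list maintainer publishes blinded windows, not raw hazard sequences."""
    return {blind(w, key) for seq in hazard_seqs for w in windows(seq)}

def screen(order: str, hazard_index: set, key: bytes) -> bool:
    """True if any window of a customer's order matches the blinded list."""
    return any(blind(w, key) in hazard_index for w in windows(order))
```

A customer order containing a hazard window matches the index without the index ever storing a readable sequence; the escrowed logging and warrant-key ideas from the quote would sit on top of a scheme like this.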
You mentioned communism, and I'll add to that since I've lived under communism. It's a great idea in theory that doesn't work in practice because of human limitations.
It doesn't work because of A) the reason you said, that government officials favor themselves, and B) the knowledge problem: the economy is far too complex for a small group of officials to plan what everyone else should be doing.
An interesting idea that emerges now is an AI-moderated socialism. If A) AI can be trusted to not favor itself, and B) AI has perfect knowledge of each human (our needs, what we're good at, etc.), I can imagine an AI-moderated socialism to work.
An ideal future I can imagine is a world with many AI-moderated polities, and humans have freedom to move between them. AI-moderated polities share some global standards on safety, trade, and conflict resolution but otherwise have differing policies so humans have the freedom to find the one that they most prefer.
This is a global phenomenon, not just the US.
It's also accelerating, because precedent in one Commonwealth or EU country spreads very quickly. It feels like lawmaking in general, both legislation and regulation, is accelerating globally.
Fear sells. Everyone is afraid of getting sued and being denied insurance. The answer is cameras!
Dividing people to vilify each other over race, religion, gender, ethnicity and even politics is incredibly profitable. Once they're afraid of their neighbors, they'll happily pay someone to protect them at every turn.
And soon from space? Here's a radio-engineering breakdown of Starlink's radar capabilities; it's a pretty impressive bird if you were designing it only for that: https://youtu.be/jbp3kdJZ1_A
Psychopathy is a serious disease. Without control it's proven to be the most destructive force human beings have ever faced. We have to keep it in check or we risk everything. These personalities by definition are never satisfied by any level of suffering. They can't feel anything.
I know someone who claims they were experimented on by an agency. Who do you reach out to for help if that was you? Most people don't take you seriously or think you need mental help.
If you want less petty crime, bring back social safety nets. Pay people better.
I'm dead serious.
-
Addendum: People generally don't resort to petty crime for no good reason. They do it because some need is not being met, or they have become socially outcast due to some systemic failure. When people feel they have little autonomy to exist in a meaningful way, and even being poor is expensive and criminalised, of course you'll see petty crime everywhere. Cracking down on the "undesirables" won't make them go away, it'll just make the issue more pronounced.
Well duh. Anyone with half a brain knows that the root causes of crime lie in poverty, systemic exclusion, and the erosion of social safety nets, rather than in inherent criminality or a lack of "moral character."
For individuals left without access to stable, decently paid jobs or a viable social safety net, black markets—eg the drug trade or sex work—become the "employers of last resort".
To truly reduce youth offending and petty crime, the formula is simple: jobs, jobs, and jobs. Most people would gladly choose a stable, decent-paying job over participation in the illicit economy if given the opportunity.
A better economy would help more than surveilling every single person's every move and all of their communications.
I would literally buy you a bicycle to change your mind. Or sit down and review countries where theft is minimal so we could brainstorm real solutions.
Maybe the U.S. could stop normalizing and modelling blatant criminality as a first step, in lieu of mass warrantless surveillance. Just yesterday, the U.S. president was giving what could be generously construed as a speech, in which he said of U.S. naval activities around the Strait of Hormuz: “We’re taking the cargo. We’re taking the oil. We’re like pirates.”
Your goals are petty and short-sighted. One nice thing about the current state of economics, technology, labor, and inflation is that we'll have fewer people so sheltered that the worst suffering they can imagine is having a bicycle stolen, people who would give the worst people in the world an infinite amount of power in order to prevent it from happening to them again.
The 20% of the country that thinks that shoplifting is the real problem are a problem. They will always vote for the biggest liar.
I'm right now imagining a counterfactual world where there is no property crime or physical assault, and petty reactionaries are demanding surveillance in order to keep people from swearing.
You don't have control over whether petty reactionaries exist. Model them as non-sentient beings if it helps you analyze it dispassionately. They're going to react to public disorder by voting in public-safety authoritarians like Bukele or Duterte with or without your permission. Thus everyone should care about shoplifting; the only disagreement is whether you care about the first- or second-order effects of it.
Also, all property crime is a drop in the bucket compared to white collar crime. The people who are super concerned about petty theft are often the ones stealing massive amounts of money from everyone else and creating the situations that lead to petty theft.
Wage theft (minimum-wage violations, forced off-the-clock work, withheld pay, etc.) alone dwarfs robbery, burglary, and auto theft in dollar value. And that's just one kind of white-collar crime.
We also have market manipulators, embezzlers, con artists selling "wellness" bullshit, companies like Flock and Palantir conspiring to violate constitutional amendments, Polymarket grifters, what have you.
I'd be happy with unlimited bike theft if those fucks all ended up in prison, though realistically it would lower bike theft too.
You're free to move to Singapore/South Korea/Japan whenever you want. Your USD (assuming you are one) will go far there, and if you are lucky enough to be white you will get treated like a king/queen there.
As it turns out, society is a lot more fun when there is just a bit of risk of crime. I'll 1000000000% take the additional freedom to do "stupid shit" in the USA over living in one of these boring dystopias.
[^1] https://www.androidauthority.com/google-pixel-organized-crim...
[1] https://x.com/GrapheneOS/status/2047321144601071673
> there’s no way to opt out and consent doesn’t matter

Consent never mattered btw.

> I can just stop using the internet for anything besides the necessities

So you do care somewhat about surveillance on your devices then!
I agree.
Even if you just walk by a Flock camera, you are catalogued.
Sickening bravado from these privacy-stealing folk.
The time to resist against these policies and technologies was 2-5 years ago.
The future, safety, rights, and freedom of every single person in the US are currently at stake. There is no more time left to wait and see how things play out.
More like 13 years ago, when the Snowden revelations made the reach of this public. Nothing was done, and it kept expanding into today's state of things. No one should be surprised.
And beyond the domestic surveillance, which drew some complaints back then, there is the matter of foreign surveillance and intervention, which saw no slowdown at the time, so you can figure out where that stands today. At least Americans have some say in their government and policies, but for the rest of the world this is just the new normal.
I'm old enough to remember AT&T's Room 641A.
> More like 13 years ago, when Snowden revelations made the reach of this public. Nothing was done, and this kept expanding till today state of things. No one should be surprised.
Yeah, Obama was president at the time.
A lot of fanfare and then nothing happened.
People were also being deported by ICE, in larger quantities, but that didn’t even make the news.
It’s always “weird” when the same action gets a different connotation depending on who’s president…
Flock is a YC company. I don’t think the resistance will be organized on HN, in spite of its ostensible hacker ethos.
When it comes to Flock in particular, I’ve been seeing a lot more resistance and pushback in local Reddit communities. At least in my city’s sub I see posts with anti-Flock messaging or related activities at least once a week now.
There are enough normal people here that it is still worth trying.
Well yeah, YC is a tech incubator plugged pretty deep into the SV hivemind, and its leading figures seem to have decided that fascism is a better alternative than any kind of regulation on their activities.
> time to resist against these policies and technologies was 2-5 years ago
The time to resist the next crop of policies and technologies is today.
And I disagree that the ground was more fertile for action during Covid. The silver lining of the AI companies’ PR and political ineptitude is that there is widespread, bipartisan pushback against tech of all stripes.
The time was 30 years ago. Back then anyone responsible should have been properly dealt with.
And yet people will wait, and things will play out in the worst scenario.
It's been a lot longer than that. You may have forgotten to include the encroachments that you supported; you seem only to have started with the disaster under Biden.
For example, I never hear about how hard librarians* fought against "National Security Letters" after 9/11. How quaint it now seems that people thought there should be a fundamental right to read freely, without disclosing what you read to anyone, especially governments.
Technology has only made this cheap to do at scale.
For people who may not be familiar, the government insisted on the right to go into libraries and get a list of the books you've read. Hell, it's basically just a "pen register**," and the culture not only gave up on resisting that this data be considered private, but forgot why anyone would have ever thought that way.
Now we're arguing about forced digital attestation, but we're barely arguing about digital ID anymore ("of course" we need that), or even remembering that most people were against federal identification in the US. Federal identification failed at every point to gain support; it was pushed hard and failed during the Clinton admin, finally passed with everything else of this nature after 9/11, and then was resisted and ignored enough to force deadlines to be pushed farther and farther back. It's been 30 years of RealID at this point.
There's no evidence that the population ever supported federal ID. The idea was forced upon them, and they just waited a generation for people to forget that the government once didn't even know or care that many people existed. 30 years from now, it will probably be weird trivia that the census was done anonymously: "You mean you didn't have to sign it under penalty of perjury? What would be the point of the data if you didn't know who it belonged to?"
In 5 days, May 27, 2026, you'll have to pay a fee of $45 in order to get on a plane for not having Real ID.
It's so obvious that these claims of necessity are always just excuses for a power grab. British Labour, who spent decades supporting huge amounts of immigration and calling everyone who thought it was too much racist, now, like Trump, use the prevention of illegal immigration as a reason to impose digital ID on everyone. They're xenophobes when it comes to tracking everyone's movements, but xenophiles when they needed to lower wages. Vote Tory, then! Nope, they supported and oversaw every element of all of this. None of this stuff ever sees a ballot.
[*] https://www.library.illinois.edu/ala/2024/10/07/15-years-of-...
[**] https://en.wikipedia.org/wiki/Smith_v._Maryland
Enemy of the State came out in 1998, and the capabilities in that movie were not far-fetched, just lacking in bandwidth.
A surveillance state was always inevitable once wireless networking, GPS, and cameras were ubiquitous. If you say this isn't true, show me anywhere in the world with these technologies that is not headed down this path.
It's inevitable if you do nothing to organize politically against it.
This makes for nice political slogans, but Hank Asher had the entire state of Florida's DMV records in the early 90s and we did nothing. Then we did nothing after 9/11 and the Patriot Act. We did nothing between 9/11 and Snowden. We did nothing after Snowden. We have literally done nothing in 35 years, but now is the time to start? Snowden would probably be in political office in a society that had the will to do something about this.
At a deeper level, I think people would need to care more about the outcomes and higher-order effects of political decisions and not just the emotional weight of political slogans. The fundamental problem is that this is not the society we live in.
Many other reasons to do it too.
>"if you do nothing to organize politically against it"
How does one politically organize against a billion-dollar industry which is friends with, and donates to, the ruling class?
They do whatever they want, and we just post about it online and click 'like' or post emojis.
It was really tiny, inexpensive cameras and wireless networks. Cameras are everywhere now. They're so cheap they're almost free, and it doesn't require an expert to install them.
It’s inevitable that some country would do it, but not inevitable that any given nation would do so, except maybe the CCP.
Uh France? It annoys me when people say this stuff is "inevitable." No, many countries have forcibly "reshaped" their government (French revolution, American revolution, etc etc) and nobody has any basis for saying it won't happen again, perhaps many more times.
The French Revolution is largely regarded as a tragedy. It led first to the Terror, and after that to a series of new monarchies over the following century.
Revolutions in most countries have generally replaced one faction of the ruling class with a competing faction of the ruling class, with little actual change for the people.
> It annoys me when people say this stuff is "inevitable."
"Resistance is futile" is an old slogan of them Borgs.
Europe is, compared to the US, doing a lot more for protection of private data. That includes strict guardrails on what data can be collected and how it is used.
Secret courts still exist, but the phenomenon of random Flock employees spying on children in gym locker rooms is much harder to get away with in a system with a modicum of decency.
Chat Control was actually shot down, and that was the UK, not Europe (anymore).
Laws are different in different places. The world is not composed of America and other-Americas.
Saying something was shot down isn't that strong an argument. The US government has proposed and shot down surveillance laws hundreds of times, until one finally passes.
Chat Control was proposed and rejected in the European Union
A scene from the Chinese 1980s period drama "Like a Flowing River 2":
Lei Dongbao, party secretary of a small village, is courting the owner of a restaurant in a nearby city. He persuades her to let him care for her young son over the weekend.
As he's heading back to his village on his motorcycle with the boy seated behind him, he drives by some women resting in the shade by the side of the road. One of them remarks to another, "Why does the secretary have a child?"
By the time he arrives at his office, all of his subordinates - and one of their wives - have turned out to meet him and say hello to the child.
https://www.basicinstructions.net/basic-instructions/2019/9/...
> Citizens, on the other hand, don’t like red light cameras because they don’t want to be fined. They complain that the cameras are an invasion of their privacy. I don’t buy that because I grew up in a small town, and as such I understand that privacy is a myth.
What’s the fix? What’s a simple rule change that would, at the very least, take these data out of law enforcement’s hands outside the most-necessary situations?
You may not realize it, but this isn't even about law enforcement. It's also about tech companies having the data: what they will do with it, who they will sell or leak it to.
It's about the amount of data. It's about what it can be used for by military-adjacent organizations under a fascist regime. Whether you think the US is headed toward fascism or not, what if it did? That's the point.
> this isn't even about law enforcement. It's also about tech companies having the data
One is a clear and present danger. The other is a hypothetical danger. Both deserve being addressed. But if only one is going to get political capital, it should be the first.
(I've worked on technology privacy issues. My takeaway is the public is broadly fine with the tradeoff. Folks in tech are not. But folks in tech with strong views on privacy are politically useless due to a combination of self-defeating laziness and nihilism.)
> You may not realize it but this isn't even about law enforcement. It's also about tech companies having the data.
This. The lesson of the past decades is: if some organization has the data, eventually it becomes too attractive not to (ab)use it. Even Apple, which sold itself as a privacy-first company, is slowly adding more and more ads. Squeezing out more profit is just too attractive with the pile of data they are sitting on. Similarly, bad governments will require access to the data if they can.
Employees inside companies should push back against data collection as much as possible (the GDPR helps a lot in Europe). If you do not have the data, you cannot use it in a user-hostile way in the future, and governments cannot request data that you do not have. If you have to store data, go for end-to-end encryption.
Citizens should try to escape the Apple/Google duopoly (e.g. by installing GrapheneOS), block trackers, and only install the necessary apps (no app = no easy tracking). For apps that you do need, revoke as many sandbox privileges as possible.
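The end-to-end encryption advice above can be illustrated with a toy sketch: if only ciphertext ever leaves the device, the service holding it has nothing useful to hand over. This uses a one-time pad purely for illustration (a real app would use an audited library such as libsodium); the names and data are made up.

```python
# Toy sketch (NOT production crypto): the client encrypts with a key
# that never leaves the device, so the server's stored copy is useless
# to anyone who seizes or subpoenas it.
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time pad: key must be random and exactly as long as the message.
    assert len(key) == len(plaintext), "key must match plaintext length"
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# Client side: key stays on the device.
note = b"meet at the library at 5"
key = secrets.token_bytes(len(note))
ciphertext = encrypt(note, key)

# Server side: only ciphertext is stored; without the key it is
# indistinguishable from random bytes.
stored = ciphertext
assert decrypt(stored, key) == note  # only the key holder can recover it
```

The design point is the one the comment makes: a provider that architects itself out of access cannot later be compelled, or tempted, to use the data.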
An open source community driven surveillance network that alerts the community when it is accessed by a select list of “trusted” governing officials. Clearly outlined access rules that are policy driven, technically controlled and auditable.
Sure Flock, we buy your safety pitch. We just don’t trust you.
> surveillance network that alerts the community when it is accessed by a select list of “trusted” governing officials
This is the worst of all worlds. Actual criminal investigations get thwarted, or the reporting requirement gets diluted to the point of being useless (“someone looked for something today!”), and a burden of vigilance is shifted onto the public.
None, because they are above the rules. You need actual enforcement.
Or the other guy's community-network idea, but it would have to also publish the real-time activities and whereabouts of all politicians who voted against making this illegal.
Much like the law that stopped video rental companies from disclosing what their customers were renting, which passed after some politicians had their own video rental histories leaked.
> they are above the rules
They’re above the rules for a political cycle because we’re shifting to a system of spoils. That doesn’t change that everything they’re doing right now is legal. (Outside ICE. They’re a warren of criminality right now.)
No "simple rule", I'm afraid. Push money out of politics and aggressively redistribute wealth to curb inequalities, that's the only way to weaken the reactionary and authoritarian ideals currently flourishing. Until then, surveillance is a given.
The straightforward broad brush fix is a US port of the GDPR. Make mass surveillance commercially unlucrative, and most of the data currently available to the government won't be collected in the first place. Furthermore, it's a basic line in the sand that gives individuals an idea that privacy is an actionable right, not just something to powerlessly complain about.
That this culture shift would need time to trickle down into positive bans on surveillance performed by the government (eg Flock), or requiring audit trails for government use of commercial data that still gets collected, shows how far we're behind.
(I use the word "port" to indicate that we need to avoid letting lobbyists stuff it full of loopholes and regulatory capture the way everything else is. Heck, I think we could do worse than copying the text verbatim and letting the courts sort it out.)
Yeah. I really like the main idea behind GDPR, which is that data containing PII is the property of the person it describes, not of the companies that process the data to provide services.
This means that I, as the owner of my data, can refuse to provide it for some use cases, request its deletion, etc. It’s my data after all.
The older and more jaded I get, the more I think that the only way to fix this mess before we all die of climate change is to dump the entire US government off a cliff and write a new constitution.
As the founding fathers intended.
> the only way to fix this mess before we all die of climate change is to dump the entire US government off a cliff and write a new constitution
We don't have public consensus on major questions, in my opinion, to make this a fruitful endeavour.
One thing we need is a political movement to push for Constitutional amendments. My five are, in decreasing order of priority, (1) multi-member Congressional districts, (2) striking the pardon power, (3) abolishing the electoral college and creating a referendum requirement for major legislation, (4) changing the first sentence of Article II to "the President shall execute the laws of the United States," and (5) permitting the Congress to charter independent agencies for up to 20 years.
One idea I haven't seen much discussion on is "provably beneficial surveillance" [1], which builds off of Nick Bostrom's vulnerable world hypothesis. It seems like the best path forward.
>We can turn that conventional wisdom on its head, by reframing it as a question: is it possible to do surveillance and consequent policing in a way that is (a) compatible with or enhances liberal values, i.e., improving the welfare of all, except those undermining the common good; and also (b) sufficient to prevent catastrophic threats to society? I call this possibility Provably Beneficial Surveillance. It's a concept expanding on an old tradition of ideas, including search warrants, due process, habeas corpus, and Madisonian separation of powers, all of which help improve the balance of power between institutions and individuals. In particular, all those ideas help enable surveillance in service of safety, while also taking steps to prevent abuses of that power.
1. https://michaelnotebook.com/optimism/index.html
Salt Typhoon is the refutation to this. Building and enforcing a "lawful intercept" system formally codifies an exploit chain for your adversaries to use. If you don't want your politicians and dignitaries being blackmailed by foreign opposition, don't even consider this type of system for widespread development.
Let America be the canary in this particularly toxic coal mine, and refuse similar systems wherever you are locally.
No discussion because it's a bad idea
Try a little harder. You got this
Nope. That's not how any of this is trending at all. Being optimistic is good for getting through tough times, sometimes. It might help people sleep at night, but sleeping our way into technofascism won't make it any better for us or our children.
Did you have a better path forward?
I point to Michael Nielsen's commentary on Vulnerable World Hypothesis [1] again:
>do you think inexpensive, easy-to-follow recipes for building catastrophic technologies will one day be found, given sufficient understanding of science and technology?
With every increase in technology and science, the probability increases, and as a result, society will necessitate ever more surveillance. The reason provably beneficial surveillance is important to discuss is that we need a careful middle path between totalitarianism and outright catastrophe. It is the opposite of "sleeping our way" into technofascism.
1. https://michaelnotebook.com/vwh/index.html
"Provably beneficial surveillance" is the wrong framing.
What you're trying to say is that the harms of surveillance are diminished when the underlying power is distributed enough that cops have to justify themselves in order to access the surveillance powers. That's why we have a 4th Amendment that demands cops get warrants before doing searches and seizures. Think of the difference between a store with a security camera that records to a local network DVR, and the same store but they bought some Ring cameras and send it to Amazon's servers. The former is the necessary amount of surveillance to prove a crime happened, the latter is just enabling abuse.
I think it is a new framing that merits discussion.
An example is the school shooter in Canada that OpenAI knew about but chose not to warn authorities about (presumably because OpenAI wants to balance safety and privacy).
OpenAI (or any other big tech) has extreme concentration of power and knows more about its users than any government authority.
At what point should OpenAI alert authorities?
I would much rather have "provably beneficial surveillance" than OpenAI having an arbitrary black box policy or for government authority to have direct backdoor to all OpenAI data.
All of known human history is evidence against the possibility of "beneficial surveillance" existing.
This is a utopian idea of the same kind as the idea of theoretical communism.
The communist theory argued that because the owners of assets can use their power in nefarious ways against the others this can be easily solved by dispossessing them of their assets and transforming all such private assets into assets that belong to the common property owned by all people. Then all assets will be used for the welfare of the entire society.
The fallacy of this theory was that when something belongs to all people it is impossible for all people to manage it directly. So there must be a layer of relatively few middlemen who manage the assets directly.
In all the communist societies, instead of managing the assets for the common good, those middlemen succeeded in becoming the de facto owners of the assets, despite not being their owners de jure. And then they managed the assets according to their personal interests, like any capitalist billionaire.
The only difference was that the communist elite was much less secure in their positions than rich capitalists, because not being the legal owners of a company or of other such valuable assets meant that they could lose their privileges at any time if their boss in the communist party hierarchy no longer liked them and sent them to an inferior position.
This hierarchical dependence ensured that the communist elite had to obey more or less whatever the supreme leader ordered. Except for this obedience, there was no real difference between a communist economy and the extreme stage of monopolistic capitalism, despite what the naive theory of communism hoped to achieve by nationalizing everything of value.
Similarly, I see no hope for a theory of "beneficial surveillance". Such beneficial surveillance could exist only if it were controlled by well-meaning people. But this will never happen; as in practical communism, some of the worst people will be those who succeed in controlling it.
I'm intrigued by Michael Nielsen's thoughts on cryptography applied to synthetic biology risk.
I'll quote his notes on using cryptography to maintain a balance of privacy and safety:
>To help address such concerns, it's been proposed that synthesis screening should use cryptographic ideas to help preserve customer privacy, while still ensuring safety. Let me mention three such ideas, some of which have already been implemented in a prototype system built by the SecureDNA collaboration. The first idea is that the screening itself should be done with an encrypted version of the sequence data, to help preserve customer privacy. The synthesis step would still require the raw sequence data, but such encryption would at least prevent centralized screening services from learning the sequence being synthesized. Second, as mentioned above, screening for exact matches and homologous sequences won't catch everything, especially as de novo design becomes possible. So it's also been proposed that an encrypted form of the sequence data should be logged and kept after synthesis. That data could not routinely be read by the synthesis company or screening service. However, suppose some later event occurs – say, some new pandemic agent is found in the wild. Then it should be possible to check whether that agent matches anything in the encrypted synthesis records. In the event such a check was needed, a third party authority could provide a kind of "search warrant" (a private key of some sort) to decrypt the data, and identify the responsible party. The third idea is to use cryptography to ensure the screening list remains private, and can even be updated privately by trusted third parties, without anyone else learning the contents of the update. Taken together, these three ideas would help preserve the balance of power between customers and the synthesis companies, while contributing to public safety and enabling imaginative new synthesis work to be done.
>Indeed, cryptographers are so clever that they've devised many techniques you might a priori deem impossible, or not even consider at all. Ideas like zero knowledge proofs, homomorphic encryption, and secret sharing are remarkable. As software (and AI) eats the world, cryptography will increasingly define the boundaries of law.
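The first idea in the quoted passage, screening against an encrypted or hashed hazard list rather than raw sequences, can be sketched in miniature. This is a toy illustration only: the salt, window size, and sequences are invented, and real systems like SecureDNA use far stronger machinery (oblivious protocols, homomorphic encryption) precisely because plain salted hashes of short sequences are vulnerable to dictionary attacks.

```python
# Toy sketch of hash-based screening: the screening service holds only
# salted hashes of hazardous subsequences, and orders are checked by
# hashing sliding windows, so neither side exchanges raw sequence data.
import hashlib

SALT = b"shared-protocol-salt"  # hypothetical shared protocol parameter
WINDOW = 6                      # hypothetical window size; real screens use longer k-mers

def h(subseq: str) -> str:
    return hashlib.sha256(SALT + subseq.encode()).hexdigest()

# Screening service: knows only the hashes of the hazard list.
hazard_hashes = {h("ATGCGT"), h("TTGACA")}  # invented example sequences

def screen(order: str) -> bool:
    """Return True if any window of the order matches the hazard list."""
    windows = {order[i:i + WINDOW] for i in range(len(order) - WINDOW + 1)}
    return any(h(w) in hazard_hashes for w in windows)

assert screen("CCATGCGTAA") is True   # contains the ATGCGT window
assert screen("CCCCCCCCCC") is False  # no hazardous window
```

The point of the design is the balance of power the quote describes: the service can flag matches without learning the customer's full order, and the hazard list itself is never published in the clear.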
You mentioned communism, and I'll add to that since I've lived under communism. It's a great idea in theory that doesn't work in practice because of human limitations.
It doesn't work, because A) the reason you said: government officials favor themselves, and B) the knowledge problem: the economy is far too complex for a small group of officials to plan what everyone else should be doing.
An interesting idea that emerges now is an AI-moderated socialism. If A) AI can be trusted to not favor itself, and B) AI has perfect knowledge of each human (our needs, what we're good at, etc.), I can imagine an AI-moderated socialism to work.
An ideal future I can imagine is a world with many AI-moderated polities, and humans have freedom to move between them. AI-moderated polities share some global standards on safety, trade, and conflict resolution but otherwise have differing policies so humans have the freedom to find the one that they most prefer.
Text-only, no Javascript, HTTPS optional:
https://assets.msn.com/content/view/v2/Detail/en-in/AA22egkH...
This is a global phenomenon, not just the US. It's also accelerating, because precedent in one Commonwealth or EU country spreads very quickly. It feels like lawmaking in general, legislation, regulation is accelerating globally.
Fear sells. Everyone is afraid of getting sued and being denied insurance. The answer is cameras!
Dividing people to vilify each other over race, religion, gender, ethnicity and even politics is incredibly profitable. Once they're afraid of their neighbors, they'll happily pay someone to protect them at every turn.
Let's just stop with the illegal data mining of Americans, okay?
Somehow they seem to miss the criminals at the top...
And soon from space? A radio-engineering breakdown of Starlink's radar capabilities; it's a pretty impressive bird if you were designing it only for that: https://youtu.be/jbp3kdJZ1_A
you know it's bad when even ol' Rupert is worried
Land of the fee.
Age verification is part of this. Submit your ID to use AI. Now they know all about you. All done for “safety”, but we know that's an excuse.
https://reclaimthenet.org/senate-panel-backs-guard-act-ai-ag...
So is AI, and the push for more data centers than the country can afford or supply power to.
And the ever increasing desire to break encryption.
And the increase in technology companies who have metadata about us citizens becoming offense and defense contractors.
And... The list is so long.
They love control almost in a fetishistic way. It gets them off.
Psychopathy is a serious disease. Left unchecked, it has proven to be the most destructive force human beings have ever faced. We have to keep it in check or we risk everything. These personalities are, by definition, never satisfied by any level of suffering. They can't feel anything.
I know someone who claims they were experimented on by an agency. Who do you reach out to for help if that was you? Most people don't take you seriously or think you need mental help.
Now you are beginning to feel the pain?
Hopefully this will translate into less petty crime; most theft now goes unpunished. I want to live in a society where bikes aren't stolen.
If you want less petty crime, bring back social safety nets. Pay people better.
I'm dead serious.
- Addendum: People generally don't resort to petty crime for no good reason. They do it because some need is not being met, or they have become socially outcast due to some systemic failure. When people feel they have little autonomy to exist in a meaningful way, and even being poor is expensive and criminalised, of course you'll see petty crime everywhere. Cracking down on the "undesirables" won't make them go away, it'll just make the issue more pronounced.
If this were true, then the USSR would have been crime-free.
> I'm dead serious.
Well duh. Anyone with half a brain knows that the root causes of crime lie in poverty, systemic exclusion, and the erosion of social safety nets, rather than in inherent criminality or a lack of "moral character."
For individuals left without access to stable, decently paid jobs or a viable social safety net, black markets—eg the drug trade or sex work—become the "employers of last resort".
To truly reduce youth offending and petty crime, the formula is simple: jobs, jobs, and jobs. Most people would gladly choose a stable, decent-paying job over participation in the illicit economy if given the opportunity.
Surely you realize that police states exist to protect the ones on top, and have no incentive to give a shit about the ones on the bottom.
A better economy would help more than surveilling every single person's every move and all of their communications.
I would literally buy you a bicycle to change your mind. Or sit down and review countries where theft is minimal so we could brainstorm real solutions.
1984 was not an instruction manual.
Doubtful; it's never really deemed worth LEOs' time to pursue bike thieves.
Then we need to make it worth their time by bringing back broken-windows policing.
Maybe the U.S. could stop normalizing and modelling blatant criminality as a first step, in lieu of mass warrantless surveillance. Just yesterday, the U.S. president was giving what could be generously construed as a speech, in which he said of U.S. naval activities around the Strait of Hormuz: “We’re taking the cargo. We’re taking the oil. We’re like pirates.”
Wait until you get targeted politically.
Your goals are petty and short-sighted. One nice thing about the current state of economics, technology, labor, and inflation is that we'll have fewer people who can only imagine suffering to the extent of having a bicycle stolen, and who would give the worst people in the world an infinite amount of power in order to prevent it from happening to them again.
The 20% of the country that thinks that shoplifting is the real problem are a problem. They will always vote for the biggest liar.
I'm right now imagining a counterfactual world where there is no property crime or physical assault, and petty reactionaries are demanding surveillance in order to keep people from swearing.
You don't have control over whether petty reactionaries exist. Model them as non-sentient beings if it helps you analyze it dispassionately. They're going to react to public disorder by voting in public-safety authoritarians like Bukele or Duterte, with or without your permission. Thus everyone should care about shoplifting; the only disagreement is whether you care about the first- or second-order effects of it.
Also, all property crime is a drop in the bucket compared to white collar crime. The people who are super concerned about petty theft are often the ones stealing massive amounts of money from everyone else and creating the situations that lead to petty theft.
Wage theft (minimum wage violations, forced off-the-clock work, withheld pay, etc.) by itself dwarfs robbery, burglary, and auto theft in dollar value. And that's just one kind of white-collar crime.
We also have market manipulators, embezzlers, cons selling "wellness" bullshit, companies like Flock and Palantir conspiring to break constitutional amendments, Polymarket grifters, what have you.
I'd be happy with unlimited bike theft if those fucks all ended up in prison, but realistically it would lower the bike theft.
You're free to move to Singapore/South Korea/Japan whenever you want. Your USD (assuming you are one) will go far there, and if you are lucky enough to be white you will get treated like a king/queen there.
As it turns out, society is a lot more fun when there is just a bit of risk of crime. I'll 1000000000% take the additional freedom to do "stupid shit" in the USA over living in one of these boring dystopias.
If it's a choice between stealing a bike and homelessness, I'll steal a bike. So the problem is the threat of homelessness. Right?
> If it's a choice between stealing a bike and homelessness
This is a vanishingly-rare hypothetical in America. (Stealing food? Sure. A bike? No.)