Comment by fixermark

8 years ago

Should they? The vast majority of users find it incredibly useful and have no reason to be concerned about governments or third parties being able to determine their geographic location, because governments and third parties don't generally care.

>have no reason to be concerned about governments ...

Many aren't, but everyone has reason to be.

Governments change. Telling your government your religion in 1920s Germany was harmless; by 1940, many would have preferred that the government didn't have their religion on file.

Circumstances change too. In 1920, being Japanese in the US wasn't special. After Pearl Harbor came the internment camps.

And then there's the mundane stuff: you protest a government policy, someone in the government takes issue and tries to put some of those annoying people in jail.

Given that you don't know when you might become an enemy of the state, it's always a good idea to keep the power of the state over its citizens in check.

You can be upset about an aspect of a product, and seek to change that aspect, without abandoning use of the product. For example, 1.3 million people are killed by cars every year, and while we recognize the risk, we also constantly improve cars through safety regulations, training, and better technology. Just because people use cell phones and apps today doesn't mean we're okay with the downsides and should stop trying to improve them.

  • It's an interesting example you've chosen, since one of the dimensions along which car safety improvement is being researched is ubiquitous GPS signalling to share data about road and traffic conditions (and since every self-driving car is basically a panopticon and recording device rolled into one).

Mass surveillance is not really for investigating individuals.

The game being played is not '1984'; it is 'Foundation'.

It is for steering entire societies, and this works far better on the boring people who think they have nothing to hide, as they are the easiest to model.

  • I agree the greater emphasis is Foundation-style analysis, but really, it's for both.

    • I've been working on a theory that what we are seeing in the last 10 years or so is the escape of these techniques from government into private industry.

      With a single powerful player, you get a consistent but slightly false narrative. If you have lots of players, though, you get multiple competing narratives and the news stops making sense.

      It's partly why I still think Gibson is one of the people who got it closest to the mark.

Cambridge Analytica did far more with far less.

  • Did they? Their sales pitch claimed they could, but what we've heard of their actual methods and impact didn't appear any more effective than regular FB ads.

It's not about being able to track everybody. You're right, nobody cares about that.

It's about being able to track anybody.

Any source for this claim?

  • The widely held and repeatedly-reported-upon understanding of how data collection can be leveraged to find unexpected insights not obvious from the data, coupled with the Snowden leaks, coupled with the ever-increasing user count for cellphones, Facebook, Twitter, and the Internet in general.

    If people were deeply individually concerned about the risks vs. rewards of these technologies, they'd stop using them. That's the rubber-meets-the-road calculus I see.

    • Do you trust the public is informed about these technologies? I think you might be overestimating individuals... most folks still don't know about Cambridge Analytica.

    • > "If people were deeply individually concerned about the risks vs. rewards of these technologies, they'd stop using them."

      Why do you think that? It clearly doesn't apply to stuff like oil, for instance.

      I could give up my phone, but I would be in deep shit if I did it tomorrow. It would take a lot of arrangement to do so and it would piss off my family and lose me work.

    • Kind of like how automobiles are a luxury, and if people cared about the 4th Amendment they just wouldn't drive anywhere. Never mind that our way of life is literally not possible without the technologies in question.

      Every single one of the revelations you've mentioned was met with public backlash, followed by either a misinformation campaign or intense dog-wagging. This is called manufactured consent. For example, let's look at Cambridge Analytica. When it was revealed that a military contractor was hired to subvert the 2016 Presidential election, the dominant story in the alphabet-soup media was a Twitter tantrum from Trump. As it became clear over the next few days that the story wasn't going to be buried easily, the narrative was quickly shifted away from the subversion of democracy to blaming Facebook for leaking user data, culminating in parading The Zuck before Congress. He played his part perfectly: no bread, but enough circus to keep the masses from thinking too hard about what it means for an election to be free.

Several recent HN stories have had this kind of comment (first noticed with the Securus submission), a weird mix of "You have nothing to fear if you have nothing to hide" and "They will never come for you, you're too unimportant." Is this a sustained campaign, or just a way for folks who have contributed to these issues to feel good about themselves?

  • > Is this a sustained campaign

    This breaks the site guidelines. Could you please read and follow them when commenting here? https://news.ycombinator.com/newsguidelines.html

    Insinuations of astroturfing or shilling without evidence (an opposing view does not count as evidence) are an internet toxin that turns out to be worse than the things it insinuates, because it's so widespread. I've written a ton about why we don't allow that here, if anyone wants to read more: https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme...

  • It's just how a lot of people feel about the issue.

    I'm not sure why you would jump to concluding that it's a sustained campaign or some kind of reaction to guilt.

  • Wilsonnb hit the nail on the head; it’s just how some people feel. Though I don’t doubt that some people involved in the creation of this phenomenon use the argument to justify their work.

    I had a hard time understanding why people wouldn’t be more conscious of their privacy, until I had discussions about the issue with people close to me.

    My folks had a very similar sentiment to the typical “if you have nothing to hide, then why do you worry about it”. My girlfriend had the same thought, but took it a step further: she asked why I cared so much about people uninvolved in my life knowing personal details about it, then said I was “the most paranoid person [she’d] ever met”.

    Once the Cambridge Analytica scandal broke, they all understood my point. I think the majority of people who don’t work in tech don’t understand the massive implications that our lack of privacy has. They don’t know how cookies or backends or tracking pixels work, and may not even know they exist. They imagine an NSA agent sitting in a room looking for keywords, not companies that they entrust their digital lives to selling off every little piece of info about them. It’s so much more than your Facebook or Twitter posts being public; it’s data that we might not even know about ourselves being kept in the hands of unknown entities.

    To sum up this rant: some people have to see it to believe it, because this is outside their scope of knowledge.

    • I'm surprised you've had conversations with tech laymen who understand what Cambridge Analytica is guilty of. Everyone I talk to, even reasonably tech-literate people, still doesn't understand the repercussions. I even pointed out the possibility of throwing a presidential election, and my mother said, "so what, isn't that just people pushing for the guy they want?"

If they "don't generally care", they wouldn't be collecting that data to begin with.

  • It's possible that they care about the aggregated data and not about the individual data.

  • They collect the data because they may find themselves needing to care in the future, at which point nobody wants to be kicking themselves for having failed to collect it.

1) Users get no benefit from information resale. 2) COINTELPRO

  • Keep in mind: most users are not part of a domestic political organization targeted by the FBI, so again, when the rubber hits the road, they'd rather not be inconvenienced for a risk that applies to other people. They don't care about COINTELPRO (disregarding, of course, the percentage of the population that actually thinks the FBI digging into "subversive" groups is part of its job).

    Users get no benefit from the information resale directly, but they also aren't generally harmed by it. And the benefit they get from having a ubiquitously-connected device in their pocket outweighs the (apparently calculated to be low) per-person cost of having their information resold. The fact that you or I may do the calculus differently for ourselves (because we have different risk sensitivity) doesn't impact those who don't reach the same conclusions.

    • What I'd say is that until somewhat recently, I was interested in politics but not engaged. I took your position during that part of my life. Now that I'm actually engaging in political activities, COINTELPRO and its current incarnations scare the bejesus out of me, and I'm not doing anything that radical, just left of the Democratic Party. YMMV.

      There may come a time in your life when you wish to have a say in the political system or are wronged by a powerful corporation. You'd care in that case. When your political rights disappear, they aren't easy to get back.

    • A potential victim's ignorance of their risk doesn't mean they aren't at risk.

      Because I'm not specifically aware there's a cross-town bus with my name on it, I'm somehow not about to get pancaked?