
Comment by ipsum2

10 hours ago

Niantic made 700 million dollars last year, mostly selling virtual game items.

But some numbers pusher somewhere saw an opportunity to make even more money and post a good quarterly number, patting themselves on the back for a job well done, without ever wasting a thought on any such unimportant thing as ethical implications...

Why would anyone think Niantic would protect user data instead of profiting from it?

  • Sarcastically, no one should.

    Unsarcastically, a lot of people believe user data belongs to users, and that they should have a say in how it's used. Here, I think the point is that Niantic decided they could use the data this way and weren't transparent about it until it was already done. I'm sure I would be in the minority, but I would never have played - or never have done certain things like the research tasks - had I known I was training an AI model.

    I'm sure the Po:Go EULA that no one reads has blanket grants saying "you agree that we can do whatever we want," so I can't complain too hard, but I'm still disappointed I spent any time in that game.

    • > Unsarcastically, a lot of people believe user data belongs to users, and that they should have a say in how it's used

      I can understand that people believe this, but why do they? Nothing in our society operates in a way that might imply this.


    • >>>>> I have been tricked into working to contribute training data so that they can profit off my labor.

      > Unsarcastically, a lot of people believe user data belongs to users, and that they should have a say in how it's used.

      At some point this stops being a fair complaint, though. Most of the AI-related cases IMO are such.

      To put it bluntly: expecting to be compensated for anything that can be framed as one's labor is a level of greed that even Scrooge McDuck would be ashamed of. In fact, trying to capture all the value one generates is at the root of most, if not all, underhanded or downright immoral business practices in companies both large and small.

      The way society works best is when people stop trying to capture all the value they generate. That surplus is what others can use to contribute to the whole, and then you can use some of their uncaptured value, and so on. That's how symbiotic relationships form; that's how ecosystems work.

      > I'm sure I would be in the minority, but I would never have played - or never have done certain things like the research tasks - had I known I was training an AI model.

      I have a feeling you wouldn't be in the minority here, at least not among people with any kind of view on this.

      Still, with AI stuff, anyone's fair share is $0, because that's how much anyone's data is worth on the margin.

      It's also deeply ironic that nobody cares when people's data is used to screw them over directly, such as for profiling or ad targeting; but the moment someone figures out how to monetize this data in a way that doesn't screw over the source, suddenly everyone is up in arms because they aren't getting their "fair share".

  • I'm not a fan of the way you moved the goalposts here. You argued that Niantic would obviously use user data to fund game operations. Then we see that they don't actually need to do that, and that the game could fund itself. Then you argue that, well, we shouldn't assume they wouldn't try to monetize user data, shame on us. I agree that those who know how tech companies operate should be extremely pessimistic about how users are treated, but I don't think that pessimism has permeated the public consciousness to quite the level you think it has. Moreover, I don't think it's a failing on the part of the user to assume that a company would act in their best interest. It's a failing of the company to treat users as commodities whose only value is to be sold.