Comment by munificent

6 years ago

Bias up front: I work at Google but am not speaking for Google.

> involuntary, unpaid guinea pigs.

I don't see how this is involuntary. You are choosing to use the product. If you choose to use the product, yes, you may be exposed to features that the product has. If you don't want to be exposed to those features, the way to opt out is to not use the product.

> What's the motivation?

It lets the company incrementally roll out and test features in real-world network configurations at scale. As far as I know, almost all tech companies do this.

Let's say you're Fapplebooglezon and you have an idea to put kitten emojis on the "Buy Now" button. Before you ship that, you want to make sure that:

1. The feature works correctly. It doesn't crash or have significant performance problems.

2. Users, in aggregate, like the change. No one wants to ship a "New Coke" debacle. It's bad for the company (they lose money) and bad for users (they don't like the product).

3. Your servers and network can handle the consequences of that change. Maybe users will be so excited that they all click "Buy Now" twice as much. You need to make sure your servers don't crumble under the increased load.

These are reasonable things that benefit both the company and users. So the way features and changes are usually shipped is like:

1. The feature is implemented behind some kind of flag. [0]

2. "Fishfooding" [1]: The team developing the feature starts using it. This gives you some feedback on "does the feature work correctly" but that's about it. The team owns the feature, so they are biased in terms of its usability. And they are on a privileged network and not a large enough population to verify how this affects the distributed system.

3. "Dogfooding": The entire company starts using it. This starts to give you some usability feedback because now people who don't have a stake in the feature are being exposed to it. But it's still skewed since employees are likely not a representative user population.

4. "Canary": The feature is enabled for a randomly selected small population of external users. Now you start getting feedback on how the feature performs in the wild on real-world machines and networks. The percent of users is kept small enough to not crush the servers in case anything goes awry, but you can start getting some performance data too.

5. "A/B testing": Now you start collecting data to see how behavior of users with the feature compares to users without it. You can actually start to get data on whether the feature is good or not.

6. Assuming everything looks OK, you start incrementally rolling it out to a larger and larger fraction of users. All the while, you watch the servers to make sure the load is within expected bounds.

7. Once you get to 100% of users and things look good, you remove the flag and the feature is now permanently enabled.
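The canary and incremental-rollout steps above usually come down to deterministically bucketing each user and comparing the bucket against a rollout percentage. Here's a minimal sketch of that idea in Python — the function name, the hash-based bucketing scheme, and the `kitten_buy_button` flag are all illustrative assumptions, not how any particular company's flag system actually works:

```python
import hashlib

def in_rollout(user_id: str, feature: str, rollout_percent: float) -> bool:
    """Deterministically bucket a user into [0, 100) for a given feature.

    Hashing user_id together with the feature name keeps bucket
    assignments independent across features, so the same users
    aren't the guinea pigs for every experiment.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).digest()
    # First 8 bytes -> integer -> a stable bucket in [0.0, 100.0).
    bucket = int.from_bytes(digest[:8], "big") % 10000 / 100.0
    return bucket < rollout_percent

# Canary: only ~1% of users see kitten emojis on "Buy Now".
canary = in_rollout("user-42", "kitten_buy_button", 1.0)

# Step 7, full launch: everyone is in.
launched = in_rollout("user-42", "kitten_buy_button", 100.0)  # always True
```

Because the bucket is a pure function of the user and feature, a user who's in at 1% stays in as the percentage ramps up, which is what makes the incremental rollout in step 6 stable rather than reshuffling the test population at every stage.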

> Is it simple laziness because they don't want to deal with wetware?

Google, like most other companies, also does lots of user testing and user surveys. But those don't give you insight into the technical side of the question — how the feature impacts the behavior of your distributed system.

You may not be aware of this, but this kind of in-the-wild product testing is something almost all businesses do, all the time. Food companies test new products in grocery stores in selected cities [2]. Car manufacturers drive camouflaged prototypes on the road [3]. Restaurant chains tinker with recipes to see how sales are affected. There is absolutely no guarantee that the Coke you're drinking today has the same ingredients as the one you had yesterday.

You seem to think this is some nefarious scheme, but it's just basic marketing. You want to make a thing people like, so you make two things and measure which one people like more. People "opt in" and "consent" by using the product. If you don't want to be a "guinea pig" when McDonald's changes their French fry recipe, don't buy the fries. If you don't want to test out new Chrome features, don't use Chrome.

[0]: https://martinfowler.com/articles/feature-toggles.html

[1]: https://www.reddit.com/r/google/comments/3qpdnn/anyone_knows...

[2]: https://smallbusiness.com/product-development/best-u-s-citie...

[3]: https://www.cnbc.com/2017/01/20/camouflage-the-incognito-way...

> I don't see how this is involuntary. You are choosing to use the product

It's involuntary because it's not informed consent. Google doesn't tell people up front or in any meaningful way that this is happening.

That's like saying "Oh, that steak was covered in the chef's experimental hot sauce that we didn't list on the menu? Well, too bad, you chose to come to this restaurant."

  • > It's involuntary because it's not informed consent.

    I think you're making an analogy that doesn't logically apply. "Informed consent" is a property of healthcare administration. When you're putting drugs into someone's blood stream or cutting them open while anaesthetized, yeah, you need to make damn sure you're doing the right thing for them.

    > the chef's experimental hot sauce that we didn't list on the menu?

    Likewise, when you're serving food that someone will ingest and which may cause allergic reactions or food poisoning, again the bar is pretty high to make sure you are treating people safely.

    But we're talking about using a free piece of software. If Chrome changes the color of their tab bar, no one is going into anaphylactic shock. When Facebook adds a new button on the sidebar, there is little risk of that inadvertently severing someone's carotid artery.

  • > I think you're making an analogy that doesn't logically apply. "Informed consent" is a property of healthcare administration.

      No, it is used in healthcare, but it is by no means exclusive to that domain.

      And the European Union clearly has a different opinion when it comes to the use of personal data:

      "Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement. This could include ticking a box when visiting an internet website, choosing technical settings for information society services or another statement or conduct which clearly indicates in this context the data subject's acceptance of the proposed processing of his or her personal data. Silence, pre-ticked boxes or inactivity should not therefore constitute consent. Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. If the data subject's consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided."

      Recital 32 of the GDPR

If I use a bunch of older Chromes from portableapps, are those affected by feature testing, provided I've disabled Google Update and I'm not behind a firewall?

In other words, is feature polling just hard-coded, or is it bound to a specific installation?