Comment by gowld

6 years ago

Note: "Zoom" is a videoconfrerencing app, not a built-in Mac OS accessibility feature for "zoom".

The article does not clearly state this, ceding a plain English word to a corporation, enabling a takeover of human language.

P.S.: This part

> Apr 26, 2019 — Video call with Mozilla and Zoom Security Teams

is funny, and would be way funnier if it were a non-consensual video call.

Finally, note that Zoom effectively does not pay for bug bounties, so researchers should think twice about donating their expertise to a selfish for-profit corporation, and users should think twice about using a videochat product that allows its entire security team to take blackout vacations, and also doesn't pay its outsourced security researchers.

> Finally, note that Zoom effectively does not pay for bug bounties, so researchers should think twice about donating their expertise to a selfish for-profit corporation

I've read this a few times and am curious if this has really become the prevailing view about what security researchers are doing (i.e., uncompensated labor) when they notify vendors about security vulnerabilities.

The traditional view (which I think was widespread in the 90s or whatever) was that engineers who find vulnerabilities in products have a special responsibility to the public, and owe a duty to the people at risk: the users of the product (or whoever would be harmed if the vulnerability were exploited to malicious ends). Just like if you used your training as an engineer to discover that the Bay Bridge had a structural flaw and that drivers were at risk (or, in the case of Diane Hartley, that the new Citicorp Center had a design flaw and office workers were at risk).

This duty can be discharged a few ways, but often the most efficient way to help the people at risk is to educate the vendor and keep on their ass until they fix the problem in a free update. If the vendor pays you, fantastic, but you shouldn't accept payment that would prevent you from discharging your duty to the people actually harmed by the vulnerability's existence (e.g., if you take the vendor's money and it comes with an indefinite NDA, and they never fix the problem and the users remain at risk of being harmed by bad actors forever, you have not behaved acceptably as an engineer). This view probably emerged at a time when bug-finders mostly had salaried jobs and were privileged not to have to depend on payments from the same vendors they were annoying with information about their products' flaws.

A newer view (probably informed by bug bounties, etc., and also by a broader community of people doing this stuff) seems to be "no more free bugs for software vendors" -- that researchers who find vulnerabilities in commercial products are producing knowledge that's of value to the vendor, and the vendor ought to compensate them for it. If the vendor doesn't want to do that, the researcher would basically just be doing uncompensated labor by giving it to the vendor, and is free to go sell the fruits of their discovery to somebody who does value their labor instead -- even if that means selling the bug to unknown counterparties at auction and signing a forever NDA not to tell anybody else.

The first view is mostly what we teach students in Stanford's undergrad computer-ethics course, and what I think is consistent with the rest of the literature on engineering ethics (and celebrated examples like Diane Hartley and William LeMessurier, etc.), but it does seem to be out of step with the prevailing view among contemporary vuln-finders. I'd love to find some reading where this is carefully discussed that we could assign to students.

  • I can't imagine selling bugs to the highest bidder ever becoming ethically acceptable. You can't pretend not to know that the high bidder is probably a cybercriminal. If you do this, your hat is clearly black.

    Once upon a time, vulnerabilities were just nuisances and people could justify some gray-hat casuistry when the damage was just some sysadmin overtime to clean up. But now there are serious organized crime rings and rogue nation-states using vulnerabilities to steal and extort billions and ruin people's lives.

    It's OK to choose not to work on products with no bug bounties, but if you do find a bug in one you must disclose it responsibly.

    • >you must disclose it responsibly.

      While most people agree selling a vulnerability is immoral, there is much debate on whether "full disclosure" is ok, and whether "responsible disclosure" is a term anyone should ever say (some argue the correct term is "coordinated disclosure").

      https://news.ycombinator.com/item?id=18233897

  • The first view meets some sort of ideal (I guess) but causes all sorts of free-rider problems. In larger society these sorts of problems are solved through regulation. For example, if someone identifies a structural vulnerability in a bridge, the agency in charge of the bridge has a legal obligation to take steps to fix it. That sort of regulation doesn't exist in software land.

    The second view as you describe it (selling to the highest bidder) is clearly black hat, but it is completely ethical for a researcher to disclose a vulnerability to the public if the vendor doesn't fix it in a reasonable amount of time. So Project Zero and this disclosure are both fine. Yes, ordinary users may be harmed in the crossfire, but the vendor should be liable for damages.

  • Beyond just a prevailing "view", this duty to public safety is explicitly codified in the codes of ethics and licensing regulations of most professional engineering bodies. To act otherwise would be a) unethical and consequently b) grounds for loss of one's license to practice.

  • I would say the 'first view' you've described is what the bulk of professionals in the information security industry would still espouse as the ideal.

    In my opinion this second view you are observing is carried by a vocal minority of participants in bug bounty programs and would be good fodder for a computer-ethics course.

  • They’re donating their expertise: yes, this research is extremely valuable and important, but the vendor should obviously be paying for it.

  • I feel like selling bugs to the highest bidder is usually ethically questionable, no matter how “new” your viewpoint is.

>The article does not clearly state this, ceding a plain English word to a corporation, enabling a takeover of human language.

The English language can handle it:

proper noun

- A noun belonging to the class of words used as names for unique individuals, events, or places.

- A noun denoting a particular person, place, organization, ship, animal, event, or other individual entity.

- A noun that denotes a particular thing; usually capitalized

>The article does not clearly state this, ceding a plain English word to a corporation, enabling a takeover of human language.

I agree with your outrage, but you have a long way to go. That sort of behavior has been the soup du jour in SV for the past ten years or so.

Keep fighting the good fight. I've given up, but I hope you win.

  • 10 years is nothing on the scale of language development, and SV is nothing on the scale of the English-speaking world :) Fear not, I bet there aren't enough words to go around for this to be a big deal long-term.

> Offered and declined a financial bounty for the report due to policy on not being able to publicly disclose even after the vulnerability was patched.

They seem to pay bug bounties if you agree to keep it quiet.

> The article does not clearly state this, ceding a plain English word to a corporation, enabling a takeover of human language.

English usually wins.

It's pretty well known in white-hat circles that Zoom has a paid private bounty program through one of the "big 2". I know several people who have been paid. Say what you like about non-disclosure, but it is the reality for most programs. We can disclose for pay, or disclose for fame, but usually not both.