Comment by robinhouston
6 years ago
Zoom’s response to this[1] is a wonderful example of how not to respond to security issues. It includes the classic tropes:
* Our users don’t care about security.
> Our video-first platform is a key benefit to our users around the world, and our customers have told us that they choose Zoom for our frictionless video communications experience.
* We have no way of knowing if this has been exploited in the wild, so it’s probably fine
> Also of note, we have no indication that this has ever happened.
* Other products have the same vulnerability
> We are not alone among video conferencing providers in implementing this solution.
* We decided not to fix it
> Ultimately, Zoom decided not to change the application functionality
And also a lovely one I haven’t seen before:
* We tried to buy the researcher’s silence, but he refused
> Upon his initial communication to Zoom, the researcher asked whether Zoom provides bounties for security vulnerability submissions. Zoom invited the researcher to join our private paid bug bounty program, which he declined because of non-disclosure terms. It is common industry practice to require non-disclosure for private bug bounty programs.
1. https://blog.zoom.us/wordpress/2019/07/08/response-to-video-...
> Zoom invited the researcher to join our private paid bug bounty program, which he declined because of non-disclosure terms. It is common industry practice to require non-disclosure for private bug bounty programs.
Is an NDA really "common industry practice" for bug bounty programs? I know NDAs are common for pen-testing, but it seems like an odd (and kind of dishonest) requirement for a bug bounty program.
Some kinds of NDA terms are not unheard of, like a 1-3 month period during which the vendor can work on a fix and disclosures won't go out.
That said, there's a slight disconnect between Zoom's two statements here. The first is that the researcher declined out of concerns over Zoom's NDA. The second is that NDAs are common. What this doesn't say is whether Zoom's NDA is cookie-cutter, or what its specific terms are.
If I were to guess, Zoom was using some unusual NDA and attempting to buy permanent silence.
Thanks for the explanation. That makes sense and seems pretty reasonable. The company should certainly have the opportunity to fix the vulnerability before it's made public and could be exploited.
> If I were to guess, Zoom was using some unusual NDA and attempting to buy permanent silence.
Considering that Zoom ultimately decided not to correct the issue I suspect you're right.
I'd have to guess this as well. I have dealt with a number of public and private bounties, and not one of the researchers has ever rejected an NDA or refused to give us time to remediate before disclosing to third parties. Unless you count Tavis tweeting critical findings, I guess.
And to be fair, none of the private bounties I've engaged have been over a massively critical bug that impacted privacy or could hijack parts of client systems. I could see a researcher who worked with Zoom and didn't feel they took it seriously refusing the NDA and just disclosing, given the impact this has.
Will they pay the rest of your team and your spouse as well? "I've already sent these results to a few colleagues around the world to test out, but don't worry, they won't disclose anything for 90 days".
Also, they seem almost entirely focused on "unwittingly joining a meeting" as the real problem here, ignoring the fact that they have made the extremely poor choice of exposing a dodgy control API on your Mac to the entire internet. What are the odds there are no bugs in this shitty little HTTP server they snuck onto everyone's machine? The fact that they came within five days of losing control of one of the domains that has the power to install arbitrary code on every Mac running this thing is absolutely insane, and they should be asking themselves 1) how that happened, and 2) how utterly screwed they would have been if they lost control of that domain.
In a more amusing alternate universe, someone discovered the zoomgov.com issue, waited until the domain expired, snapped it up, then published an "update" that uninstalls Zoom entirely. In a nastier one, they used this idiotic design flaw to pwn every Zoom client machine out there.
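If you want to see whether that little web server is still sitting on your machine, a quick probe is enough to make the point. This is a minimal sketch, assuming the port reported in the public disclosure (19421); the check itself is trivial, the point is that anything listening on localhost like this is reachable by every web page your browser loads.

```python
# Minimal probe for a local listener. Port 19421 is the one reported in the
# public disclosure for Zoom's helper web server; treat it as an assumption.
# The broader issue: any page you visit can make your browser fire requests
# at localhost, so a listener like this is effectively internet-facing.
import socket

def local_server_listening(port: int = 19421, host: str = "127.0.0.1") -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if local_server_listening():
        print("Something is answering on localhost:19421 (possibly the Zoom helper).")
    else:
        print("Nothing listening on localhost:19421.")
```

A bare TCP connect doesn't prove it's Zoom's process, of course; `lsof -i :19421` will tell you which binary actually owns the port.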
It's exactly how you want to respond if you plan on sharing it publicly on Twitter in the hopes of fooling those not in tech.
If my mom stumbled onto that article, she would likely think they explained everything perfectly (well... she would likely contact me, but still).
Given this news is already not sticking near the top of Hacker News and is barely reported elsewhere, it feels like they are already getting away with it for the most part.
Problem is, your mom is not in their market; their paying customers have paid IT people who do pay attention.
And hence Zoom just caved: https://www.theverge.com/2019/7/9/20688113/zoom-apple-mac-pa...
>> Ultimately, Zoom decided not to change the application functionality
Yeah "functionality".
> All first-time Zoom users, upon joining their first meeting from a given device, are asked whether they would like their video to be turned OFF. For subsequent meetings, users can configure their client video settings to turn OFF video when joining a meeting.
> Additionally, system administrators can pre-configure video settings for supported devices at the time of install or change the configuration at anytime.
TBH, they're not as dismissive as you're making them out to be.
That part just doesn’t seem very responsive. Unless Zoom is recommending that everyone should turn it OFF, and urgently releasing a patch to make OFF the default, why does it matter that the vulnerability is in an optional feature rather than a mandatory one?
The Zoom admin for an org can switch cameras to be off by default.
I agree it should be the default, though if you're worried you can open your Zoom app and change the setting yourself as well.
That is a pre-existing feature, and while it mitigates one specific aspect of the issue, it doesn't represent a security-focused response. Yes, I am saying that's not good enough: an appropriate, non-dismissive response would commit to writing code to deal with the issue raised, subject to the industry-standard 90-day embargo, depending on how much importance they place on their users' security.