Comment by SulfurHexaFluri
5 years ago
The scary thing is that even though this sounds like a monstrous effort to pull off this hack, it's not out of reach for large governments. It's basically accepted as fact that they have loads of these exploits sitting in their toolbox, ready to use when they have an enticing enough target.
Short of rewriting the whole of iOS in a memory safe language I'm not sure how they could even solve this problem. Assigning a researcher to search for 6 months only to find one bug is financially prohibitive.
The research would've been much shorter if Apple would actually provide researchers with debug symbols. Or you know, if Apple open sourced their security-critical software.
> One of the most time-consuming tasks of this whole project was the painstaking process of reverse engineering the types and meanings of a huge number of the fields in these objects. Each IO80211AWDLPeer object is almost 6KB; that's a lot of potential fields. Having structure layout information would probably have saved months.
> Six years ago I had hoped Project Zero would be able to get legitimate access to data sources like this. Six years later and I am still spending months reversing structure layouts and naming variables.
It’s intensely frustrating, because for some reason Apple thinks it’s a good idea to strip out security code from the source that they do release (months late), and they tend to strip (and until recently, encrypt) kernel code. This is what a company from the last decade might do to hide security issues, except it’s coming from the world’s largest company with a highly skilled security team. Is there some old-school manager with so much influence that they’re able to override any calls from internal and external sources? It’s gotten to the point where Apple engineers privately brag about their new proprietary security mitigations after researchers who scrounge for accidentally symbolicated kernels (thank you, iOS 14 beta) do the work to find them. Why does this situation exist?
There were some Hacker News threads the other day about Marcan's Patreon campaign for porting Linux to Apple Silicon. Everyone basically expects that Marcan will need to reverse engineer everything on his own, and my gut tells me they're right.
But, if you actually stop and think about it for a moment... isn't this situation completely bizarre? Apple Silicon Macs explicitly support booting alternate OSs, because Apple went out of their way to add a `permissive-security` option to the boot-loader. They know Linux is important—the initial Apple Silicon reveal included a Linux VM demonstration—and now a well-known and talented developer is planning to do a native Linux port, at no cost to Apple, and we all fully expect that Apple won't make any documentation available or answer any questions? And, we're probably right?
The more I consider it, the more crazy it all seems. Why is Apple so private about the internals of their products? It won't affect marketing—normal consumers don't care—and I can't think of a plausible scenario where this type of information could help a competitor.
Is Apple using source code stolen from Oracle? Are they scared someone will discover an internal library written in COBOL and make fun of them? Are they worried their documentation could revive Steve Jobs as a vengeful ghost? I just don't get it.
I can only speculate, but Apple seems to have very tightly coupled software and hardware. Since this coupling probably holds trade secrets (which we don't know about by definition), it seems likely to me that they are controlling access to as much of the stack as they can while still protecting those secrets.
Because you're missing the other half of the exploit market: Selling vulnerabilities for big cheques.
https://zerodium.com/program.html
But doesn't it work in some ways? It's not going to save them, but it seems to significantly increase the time/cost of exploiting the vulnerability. One more layer to the security system.
> The research would've been much shorter if Apple would actually provide researchers with debug symbols.
I believe they're about to do this: https://www.theverge.com/2019/8/8/20756629/apple-iphone-secu...
And Google Project Zero won't get them.
https://twitter.com/benhawkes/status/1286021329246801921
> It looks like we won't be able to use the Apple "Security Research Device" due to the vulnerability disclosure restrictions, which seem specifically designed to exclude Project Zero and other researchers who use a 90 day policy.
These are just phones that you are officially permitted to attach a root shell and kernel debugger to, like any other device that's not an iPhone. Researchers have been working around that for years by using private jailbreaks/exploits to get similar levels of access, and with checkm8/ktrw you can get similar access yourself on any vulnerable iPhone 7/8/X.
No sources, structure layouts, or symbols, so you're still stuck wading through megabytes of compiled code to reverse-engineer everything from scratch.
It's Apple drumming up excitement over essentially nothing, and from my point of view it's mostly a PR stunt.
Believe it or not, open-sourcing the security code is actually not a great idea. Most of the world's botnets run on WordPress, which is open source. Most of the time, legitimate actors are not going to read through an entire code base because they have better things to do. Illegitimate actors, however, have a very high incentive to read through a widely used public code base, and they do.
OpenBSD [0] is OSS, practices full disclosure, and is considered highly secure by... everyone.
Wordpress is a mess, but being OSS does not inherently make something less secure.
[0] https://www.openbsd.org/security.html
He could have just sent in a bug report saying that the length wasn't validated.
No need to dig so much if you just want the problem fixed.
But he wanted to prove something. That is a different thing.
By 'wanting to prove something', he caused the vendor to act urgently, instead of dismissing this as a maybe-exploitable-maybe-not bug that would get lazily patched whenever.
By 'wanting to prove something', he showed the shortcomings of multiple security mitigations, all defeated by simple bugs.
By 'wanting to prove something', he also discovered two other exploitable 0-days that wouldn't have been discovered otherwise. Those 0-days were likely already in the hands of bad actors, too.
Finally, the reason he even discovered the original bug is that Apple, once or twice, accidentally forgot to strip function names from a binary. If that hadn't happened, the bug very likely would still be out there in the wild.
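To make "the length was not validated" concrete, the bug class looks roughly like this. This is a schematic sketch in Python, not Apple's actual AWDL parser: a TLV field carries an attacker-controlled length byte, and the vulnerable version trusts it while the fixed version checks it against the destination buffer first.

```python
# Hypothetical model of the bug class (NOT Apple's code): parsing a
# type-length-value record whose length byte the attacker controls.

def parse_tlv_unsafe(dest: bytearray, payload: bytes) -> int:
    """Trusts the length byte. In Python the bytearray silently grows;
    in C, the equivalent memcpy writes past the buffer: a heap overflow."""
    length = payload[1]                  # attacker-controlled length field
    body = payload[2:2 + length]
    dest[0:len(body)] = body             # no check against len(dest)
    return length

def parse_tlv_safe(dest: bytearray, payload: bytes) -> int:
    """Validates the length against both buffers before copying."""
    length = payload[1]
    if length > len(dest) or 2 + length > len(payload):
        return -1                        # oversized or truncated: reject
    dest[0:length] = payload[2:2 + length]
    return length

oversized = bytes([0x10, 8]) + b"A" * 8  # claims 8 bytes for a 4-byte dest
assert parse_tlv_safe(bytearray(4), oversized) == -1
```

The one-line bounds check is the entire fix; everything else in the exploit chain existed to demonstrate what its absence makes possible.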
I'm not sure you understand how security research works.
This is a weird statement, since the premise of this blog post is that these kinds of attacks aren't out of reach for a single talented researcher on a Google salary. It's not out of reach for any government. Nauru, Grenada, Tonga, the Comoros --- they can all afford this.
I believe the point of SulfurHexaFluri's final sentence is that it is cost prohibitive for Apple to dedicate a bunch of employees to search for bugs in order to fix them all. That is, it's cost-effective to find 1 bug, but not to find all of them. The sentence could have been worded better.
I'd personally phrase it a bit differently: an _individual_ was able to pull this off while surrounded by screaming children. A large government, with all its resources and hundreds of people, could pull this off regularly without breaking a sweat.
> Short of rewriting the whole of iOS in a memory safe language I'm not sure how they could even solve this problem. Assigning a researcher to search for 6 months only to find one bug is financially prohibitive.
Note that memory-safe languages won't solve security outright. They eliminate a whole class of security bugs, which would be amazing progress, but not every security bug is a memory-safety bug.
The bug in the OP does sound like a memory-safety bug, though, so this is a bit pedantic.
Didn't they move WiFi drivers, among other things, into userspace in macOS Big Sur? I've heard somewhere that they're moving toward a microkernel design for this particular reason of reducing the attack surface.
(Yes, I know I'm talking about macOS while the vulnerability was found in iOS, but there's a lot of shared code between them, especially at levels this low.)
>It's basically accepted as fact that they have loads of these exploits sitting in their toolbox, ready to use when they have an enticing enough target.
Do you have a source for this?
Google "NSA TAO" -- Tailored Access Operations. AIUI, among other things, they're responsible for developing, discovering, and weaponizing exploits used to access high-value targets -- sometimes through fun techniques like "Quantum Insert", a sort of faster man-in-the-middle attack. The wealth of exploits released in the Equation Group hack should put all doubts to rest.
Spot on. I expect this was a designed-in feature, but if I could prove it, I wouldn't be able to do so without going to jail.
There's a market for exploits that pays pretty well. Someone is throwing millions of dollars at them, and from what we can glean from investigations, leaks, and whistleblowers, it's states that are buying. One company in that space made worldwide news[1] by selling to governments.
[1] https://en.wikipedia.org/wiki/Hacking_Team
>[1] https://en.wikipedia.org/wiki/Hacking_Team
Also a good idea to DDG Phineas Phisher. You should turn up an interesting read on pastebin iirc.
Edit: found it on exploit-db
[0] https://www.exploit-db.com/papers/41915
The whole NSA leaks saga proved it. They had a tool built for exploiting Windows boxes which was leaked and converted into the WannaCry ransomware that spread globally a few years ago.
NSO Group, the Israeli company behind the Pegasus iOS spyware, has been accused of selling it to the UAE government.
https://www.haaretz.com/middle-east-news/.premium-with-israe...
An interview with a nation-state hacker who worked in TAO at the NSA:
https://podcasts.apple.com/us/podcast/darknet-diaries/id1296...
I believe they described their toolbox as "Metasploit on steroids". Some other episodes of Darknet Diaries also interview former and current government hackers.
Official website with full transcript + some nice pixel art: https://darknetdiaries.com/episode/10/
Who do you think the customers of ZDI, Zerodium, Azimuth and others are?
https://en.wikipedia.org/wiki/EternalBlue
It is not just not out of reach for large governments; it is probably not even out of reach for most organizations with 5-10 people. As the author says, in six months "one person, working alone in their bedroom, was able to build a capability which would allow them to seriously compromise iPhone users they'd come into close contact with". Even if we assume the author is paid $1,000,000 a year, that is still only $500,000 of funding, which is an absolute drop in the bucket compared to most businesses.
The average small business loan is larger than that, at $633,000 [1]. Hell, a single McDonald's restaurant [2] costs more than that to set up. In fact, it is not even out of the reach of vast numbers of individuals. Using US net worth percentiles [3], $500,000 is only the 80th percentile of household net worth. With 129 million households in the US alone, that means there are literally 25.8 million households with the resources to bankroll such an effort (assuming they were willing to liquidate their net worth). You would need to increase the cost by 1,000x to 10,000x before you get to a point where it is out of reach for anybody except large governments, and by 100,000x to 1,000,000x before it actually becomes infeasible for any government to bankroll such attacks.
tl;dr It is way worse than you say. Every government can fund such an effort. Every Fortune 500 company can fund such an effort. Every multinational can fund such an effort. Probably ~50% of small businesses can fund such an effort. ~20% of people in the US can fund such an effort. The costs of these attacks aren't rookie numbers, they are baby numbers.
[1] https://www.fundera.com/business-loans/guides/average-small-...
[2] https://www.mcdonalds.com/us/en-us/about-us/franchising/new-...
[3] https://dqydj.com/average-median-top-net-worth-percentiles/
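The back-of-envelope arithmetic above can be checked directly. All inputs are the comment's own stated assumptions (a $1M annual salary, 129 million US households, and $500k as roughly the 80th-percentile household net worth), not independently verified figures:

```python
# Sanity-check of the comment's arithmetic; every input is an
# assumption taken from the comment itself, not verified data.
researcher_salary = 1_000_000          # assumed annual salary
effort_cost = researcher_salary // 2   # six months of work

us_households = 129_000_000
share_above_cost = 0.20                # ~$500k ~= 80th percentile net worth

households_that_could_fund = int(us_households * share_above_cost)
print(f"effort cost: ${effort_cost:,}")                      # $500,000
print(f"households above that: {households_that_could_fund:,}")  # 25,800,000
```

The striking part is less the exact figures than the orders of magnitude: the cost would have to grow by a factor of thousands before only governments could afford it.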
For those who don't see why a company would want to use such exploits, consider how valuable it would be to know if a company's employees were planning to organize or strike.
There are also paranoid people in positions of power, and bureaucracies that can justify spying on employees. One of the interesting things about this lockdown was finding out that many companies put spyware on the computers they issue to employees in order to monitor their usage.
How is it financially prohibitive to pay a researcher a salary to find a 0-day like this, when the bounty programs pay $100k-$500k for comparable 0-days? Source: https://developer.apple.com/security-bounty/payouts/