Reading between the lines of TFA, it seems the researchers may also suspect that to be the case:
> Our guess is that this unknown hardware feature was most likely intended to be used for debugging or testing purposes by Apple engineers or the factory, or that it was included by mistake. Because this feature is not used by the firmware, we have no idea how attackers would know how to use it.
However, keep in mind that this level of "bugdooring" is possible without Apple's explicit cooperation. In fact, the attackers don't even need to force a bug into the code. It would probably be sufficient to have someone on staff who is familiar with the Apple hardware development process (and therefore knows about the availability of these tools), or to simply get a copy of the firmware's source code. Sophisticated attackers likely have moles embedded within Apple. But they don't even need that here; they could just hire an ex-Apple employee and get all the intel they need.
well of course nobody would have NSA_friendly_override() in the source
plausible deniability is essential in such cases, hence the term bugdoor
This is the same conspiracy mindset as flat earthers, and you deserve your own Netflix mockumentary for it.
Because a bug is a bug, its very nature means you cannot prove it isn't malicious, so you take that as positive proof of malice and sit pretty because no one can prove a negative.
I always get weird consultants reaching out to me on LinkedIn asking for deets on my org's layout and - curiously - our tech stack. They offer something like $500+ an hour but I don't want to be complicit in some compromise. Private intelligence is such a fascinating industry.
Since they've gone to the trouble of protecting it with an insecure hash, couldn't they also have designed this hardware feature so that it could be completely disabled until the device is rebooted? This vulnerability doesn't persist through reboots, so it would be sufficient to have the firmware lock the feature out during startup outside of development or manufacturing contexts.
> This vulnerability doesn't persist through reboots
I suspect, once you stop receiving data from the device, you just text it the invisible message every few minutes until you start getting data again.
I just don't get this mentality. Here is positive proof (if you believe the attribution) that the NSA is using exquisite and exotic techniques to force their way into iPhones, and you look at it and come to the exact opposite conclusion: that Apple is letting them into the iPhone. It's not a backdoor if you're smashing in the window.
Sure, it's not like we've been made aware of a history of backdoors through the Snowden or Shadow Brokers leaks...
Yeah I don’t understand these conspiracy theories.
If the NSA had partnered with Apple, they sure as hell would have asked for something much more convenient and resilient.
I think it’s just down to a lot of people not understanding hardware and falling back to “magical” thinking
Based on history, it would be more surprising if Apple wasn't actively cooperating with the NSA; that was the case with PRISM (from Wikipedia):
> "The documents identified several technology companies as participants in the PRISM program, including Microsoft in 2007, Yahoo! in 2008, Google in 2009, Facebook in 2009, Paltalk in 2009, YouTube in 2010, AOL in 2011, Skype in 2011 and Apple in 2012. The speaker's notes in the briefing document reviewed by The Washington Post indicated that '98 percent of PRISM production is based on Yahoo, Google, and Microsoft'"
The rise of end-to-end encryption in the wake of the Snowden revelations put large tech corporations in a bind, given the conflict between consumer desire for secure, snoop-proof devices and government desire for backdoor access. Pressure might have been applied through government contracting decisions: no cooperation == no big government contract. The general rise of end-to-end encryption also meant that things like deep packet inspection along the trunk no longer worked, putting a premium on breaking into devices to install keyloggers etc.
All the fear of China doing this with Huawei (probably well-justified fear) may have arisen in part as projection by politicians and insiders who knew the US government was already doing it with Apple, Android, Intel, ARM, etc. The US government has certainly retained legalistic justification for such behavior, even though the Act expired in 2020 [1]. Corporations have also been given retroactive immunity for similar illegal activities before [2], so Apple has that precedent to go by.
[1] https://www.cjr.org/the_media_today/section_702_renewal_pres...
[2] https://www.aclu.org/news/national-security/retroactive-tele...
NSA will have the same special relationship with Apple as they do with AT&T.