Kaspersky finds hardware backdoor in 5 generations of Apple Silicon (2024)

1 year ago (xstore.co.za)

Thread at the time: https://news.ycombinator.com/item?id=38783112

  • That thread also has the benefit of using the original source article from Kaspersky, which is worth a read. This blog post notably disagrees with Kaspersky's own conclusions about whether it's an intentional backdoor, instead citing Steve Gibson, who has clearly learned nothing from his previous "WMF backdoor" debacle.

Anyone who is paranoid about hardware backdoors might enjoy this:

https://www.contrib.andrew.cmu.edu/~somlo/BTCP/

  • Thanks for sharing. But I think there are easier solutions if one works on sensitive projects, e.g. as an Apple hardware designer or a Google Android kernel programmer:

    1. Complete separation of work and personal computers and cellphones.

    2. Company cellphones only stay in the facility and are checked for vulnerabilities from time to time.

    3. No bragging in public about your project, and no affairs with women/men other than one's own partner, so one does not expose personal vulnerabilities.

I read the original Kaspersky analysis and found it very weird that such a cybersecurity company, one that works with the Russian government closely, allowed US-made phones to access their networks as late as December 2023.

  • >that works with the Russian government closely

    There's never been any real, substantial evidence for this, and much to the contrary: they've moved a lot of their infrastructure out of Russia and have for ages been early in reporting malware that originated from Russia and its allies.

    It's one of those things where, if American media writes about it long enough, people somehow just assume it's true.

    • Be that as it may, I still highly suspect it is the case, given the effort the other party spent to make this work: multiple zero-days, plus maybe some string-pulling inside Apple.

  • Your options are iPhone or Android if you want a reasonably usable phone in 2025. And iPhone is considerably more secure than Android against both script kiddies and nation state attackers.

    • Just make sure nobody ever sends you an SMS or iMessage, as those have a wild history of enabling remote 'zero-click' takeovers. If you doubt this, just search for 'imessage vulnerability' or 'imessage cve'. Android has far fewer of these problems, partly because it's a more diverse ecosystem where any single vulnerability is less likely to apply to all Android installs. Of course, this diversity also means there are more chances to find problems, but the reach of each problem is smaller.

    • > And iPhone is considerably more secure than Android against both script kiddies and nation state attackers.

      Posting this in a thread about a hardware backdoor in the iPhone seems strange. And there are also a lot of zero-click exploits in the Apple ecosystem: NSO's Pegasus comes to mind.

      My main issue with Apple is that, internally, they do not do any security research. They just close holes if and when they are discovered.

    • If they really need the security (and considering the trouble the other party went to in hacking their phones, they probably do), then they should not allow any smartphone into the facility.

      This has been done many times before by other companies. Huawei used to do a lot of closed-door development: everyone on the team lives in a hotel for a few months without phones and cannot leave. If your adversary burnt that many zero-days, and maybe also pulled strings, to hack you, you absolutely should do this.

According to this blog it has been patched. But it really does raise the question of how much we trust Apple, Google, and other large tech companies.

I always assumed (not having worked at Apple, but judging from the observed functionality and the fact that they could patch it) that this was a debug backdoor that didn't get kill-switched before release builds, and that they then decided killing it after the fact would only draw attention to it.

You have to wonder whether the only reason the iPhone 16 isn't included in this article is that the article was written before the iPhone 16 existed.

  • The iPhone 16 shipped with iOS 18. The vulnerability in question (CVE-2023-38606) was patched in iOS 16.6, released in July 2023, months before Kaspersky's write-up that prompted this blog post. There, now you don't have to wonder any more.

  • It's because Apple fixed the issue on all affected devices with OS updates released in July 2023.

    • Has anyone disassembled that update to figure out how they patched this?

      If it is some device sitting on the memory bus, how did they disable it in a way that couldn't be re-enabled by the OS kernel? Most hardware that sits on a CPU bus doesn't have such an ability.
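For context on the question above: Kaspersky's follow-up write-up indicates that Apple mitigated CVE-2023-38606 in iOS 16.6 by adding the offending MMIO ranges to the device tree's pmap-io-ranges list, so the kernel's pmap layer knows about those undocumented registers and refuses to map them; the hardware itself was not changed. The general idea of gating MMIO mappings on a device-tree-derived table can be sketched as an allowlist check. This is an illustrative sketch only; the addresses, names, and logic below are assumptions, not Apple's actual code.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch (not Apple's code): gate MMIO mappings on a
 * table of known physical I/O ranges, in the spirit of XNU consulting
 * device-tree "pmap-io-ranges" entries. Addresses are made up. */

typedef struct {
    uint64_t base;  /* physical base address of the I/O region */
    uint64_t size;  /* region length in bytes */
} io_range_t;

/* Hypothetical allowlist; a real kernel would build this from the
 * device tree at boot. */
static const io_range_t allowed_io_ranges[] = {
    { 0x200000000ULL, 0x100000ULL }, /* example peripheral block */
    { 0x230000000ULL, 0x004000ULL }, /* example UART block */
};

/* Return true only if [pa, pa+len) lies fully inside a known range;
 * unknown MMIO (e.g. undocumented debug registers) is refused. */
bool pmap_io_range_allowed(uint64_t pa, uint64_t len) {
    for (size_t i = 0;
         i < sizeof(allowed_io_ranges) / sizeof(allowed_io_ranges[0]);
         i++) {
        const io_range_t *r = &allowed_io_ranges[i];
        /* Overflow-safe containment check. */
        if (pa >= r->base && len <= r->size &&
            pa - r->base <= r->size - len)
            return true;
    }
    return false; /* not in the table: deny the mapping */
}
```

In the reported fix the table comes from the device tree rather than being compiled in, and entries carry per-range policy flags; the sketch only captures the gating idea, which is why a software update could neutralize a hardware feature.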

I wonder if something like this is behind Microsoft's push to obsolete a lot of hardware with the Windows 11 release. Perhaps the NSA pushed them to require a hardware upgrade so that people replace devices with old processors with new ones featuring the latest bleeding-edge backdoors.

  • What if your comment is a part of a psyop to keep paranoid people (NSA's true target) on their old devices which are even easier to breach?

    • I do notice that a lot of enemies of the state seem to use poorly secured platforms: everything from Hezbollah using pagers, to widespread use of unencrypted Telegram groups and Discord, to the ANOM sting with a non-E2E app.

      Yet platforms with apparently secure E2E messaging (e.g. WhatsApp) never seem to be used by criminals.

      I wonder if this is just selection bias in the criminals caught, or if there is some forcing factor persuading criminals to make poor security choices.
