
Comment by fsflover

4 years ago

You are of course right that technically it has access to the RAM.

> Supposedly it should be quiesced after training, but I haven't seen any security analysis that claims that firmware couldn't just take over the system while it runs.

I was under the impression that it was the whole point of the exercise. It would be interesting to know otherwise.

> even though the blob ends up running on the same CPU with the same privileges in the end anyway

This is not how I understood it. The Librem 5 stores these binary blobs on a separate Winbond W25Q16JVUXIM TR SPI NOR Flash chip, and they are executed by U-Boot on the separate Cortex-M4F core. From here: https://source.puri.sm/Librem5/community-wiki/-/wikis/Freque....

> I was under the impression that it was the whole point of the exercise. It would be interesting to know otherwise.

It absolutely wasn't. Look into it. In every case, the blob ends up running on the RAM controller CPU, where it supposedly finishes and is done. The whole point of the exercise was obfuscating the process used to get to that point, such that the main CPU avoided physically moving the bits of the blob from point A to point B. Really.

> This is not how I understood it. The Librem 5 stores these binary blobs on a separate Winbond W25Q16JVUXIM TR SPI NOR Flash chip, and they are executed by U-Boot on the separate Cortex-M4F core.

That is incorrect (great, now they either don't know how their own phone works or they're lying - see what I said about obfuscation? It's great for confusing everyone).

The M4 core code is not proprietary; it's the pointless indirection layer they wrote and it is not loaded from that SPI NOR flash. It's right here:

https://source.puri.sm/Librem5/Cortex_M4/-/tree/master

That open source code, which is loaded by the main CPU into the M4 core, is responsible for loading the RAM training blob from SPI flash (see spi.c) into the DDR controller (see ddr_loader.c).
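That flow can be sketched in miniature. This is a toy model only: the buffer names, sizes, and byte values are invented for illustration, and the real spi.c/ddr_loader.c code does memory-mapped register I/O on hardware rather than copying between Python buffers.

```python
# Toy model of the M4 loader's job: read the training blob out of
# SPI NOR flash and copy it into the DDR PHY's instruction memory.
# All names and contents here are hypothetical stand-ins.

spi_flash = bytes.fromhex("deadbeefcafebabe")  # pretend blob sitting in external flash
pmu_imem = bytearray(len(spi_flash))           # pretend PMU instruction memory window

def load_training_blob(flash: bytes, imem: bytearray) -> None:
    """Copy the blob from flash into IMEM, one 32-bit word at a time."""
    for i in range(0, len(flash), 4):
        imem[i:i + 4] = flash[i:i + 4]

load_training_blob(spi_flash, pmu_imem)
assert bytes(pmu_imem) == spi_flash  # the PMU now holds the blob
```

The point of the sketch is how little is going on: it is a straight copy from one place to another, which is exactly why the choice of which core performs the copy doesn't change what ultimately executes the blob.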

The actual blob then runs on the PMU ("PHY Micro-Controller Unit") inside the DDR controller. This is an ARC core that is part of the Synopsys DesignWare DDR PHY IP core that NXP licensed for their SoC. Here, cpu_rec.py will tell you:

  firmware/ddr/synopsys/lpddr4_pmu_train_2d_imem.bin
      full(0x5ac0)   ARcompact                          chunk(0x4e00;39)    ARcompact 

The normal way this is done is to embed the DDR training blob into the bootloader like any other data and have the bootloader load it into the PMU. Exact same end result, minus involving a Cortex-M4 core for no reason and minus sticking the blob in external flash for no reason. Here, this is how U-Boot does it on every other platform:

https://github.com/u-boot/u-boot/blob/master/drivers/ddr/imx...

Same code, just running on the main CPU because it is absolutely pointless running it on another core, unless you're trying to obfuscate things to appease the FSF. And then the blob gets appended to the U-Boot image post-build (remember this just gets loaded into the PMU, it never touches the main CPU's execution pipeline):

https://github.com/u-boot/u-boot/blob/master/tools/imx8m_ima...
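The post-build append step can be sketched roughly like this. It is a hypothetical illustration: the 4 KiB alignment value, function name, and contents are made up, not U-Boot's actual imx8m image layout.

```python
# Rough sketch of appending a DDR training blob to a bootloader image
# post-build, so the bootloader can later find it at a known aligned
# offset and copy it into the PMU. Alignment and names are illustrative.

ALIGN = 0x1000  # hypothetical alignment boundary for the appended blob

def append_blob(image: bytes, blob: bytes) -> bytes:
    pad = (-len(image)) % ALIGN          # zero-fill up to the boundary
    return image + b"\x00" * pad + blob  # blob starts at an aligned offset

image = b"bootloader" * 100              # pretend U-Boot binary
blob = b"\xde\xad\xbe\xef" * 8           # pretend training firmware
combined = append_blob(image, blob)

assert combined.index(blob) % ALIGN == 0  # blob lands on the boundary
assert combined.startswith(image)         # image itself is untouched
```

Again, the takeaway is that this is plain data plumbing: the blob rides along inside the bootloader image and gets copied into the PMU, never entering the main CPU's execution pipeline either way.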

Purism went out of their way and wasted a ton of engineering hours just to create a more convoluted process with precisely the same end result, because somehow all these extra layers of obfuscation made the blob not a blob any more in the FSF's eyes.

The security question here is whether that blob, during execution, is in a position to take over the system, either immediately or somehow causing itself to remain executing. Can it only talk to the RAM or can it issue arbitrary bus transactions to other peripherals? Can it control its own run bit or can the main CPU always quiesce it? Can it claim to be "done" while continuing to run? Can it misconfigure the RAM to somehow cause corruption that allows it to take over the system? I have seen no security analysis to this effect from anyone involved, because as far as I can tell nobody involved cares about security; the whole purpose of this exercise obviously wasn't security, it was backdooring the system into RYF compliance.

  • > great, now they either don't know how their own phone

    Who's "they"? This is an unofficial community wiki.

    • If it's a random community member then this is just further evidence that the way Librem presented things and what they did confused people into thinking it actually had a practical purpose, when it was purely a way to rules-lawyer their way into getting RYF.
