Comment by nerdsniper
1 day ago
The cable can report what it "thinks" it is, and in fact modern USB-C cables do this: they have "e-Marker" chips inside the plugs which communicate with whatever they're plugged into and report what they believe their capabilities to be. The thing is, manufacturers can set the e-Marker chips to spew lies, or a cable that used to support 80Gbps may have gotten slightly damaged after 6 months of use and now only reliably transmits 10Gbps.
Power capacity is relatively easy to measure ad-hoc via voltage drop from one end to the other...USB-PD controllers already do this and can even fine-tune the voltage to make sure that if the device receiving (sinking) power needs 20V they'll send 20.4V or 20.9V to compensate for voltage drop so that the charging device gets 20V on its end.
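To make that compensation concrete, here's a toy sketch in Python. It's just Ohm's law; the cable resistance and current figures are illustrative assumptions, not spec numbers:

```python
# Sketch of the voltage-drop compensation a USB-PD source performs.
# The resistance/current values are made up for illustration.

def compensated_source_voltage(v_target, current_a, cable_resistance_ohm):
    """Voltage the source must output so the sink sees v_target volts.

    V_source = V_target + I * R, where R is the round-trip resistance
    of the VBUS/GND path through the cable.
    """
    return v_target + current_a * cable_resistance_ohm

# A sink requesting 20 V at 3 A over a cable with ~0.15 ohm round-trip resistance:
v_out = compensated_source_voltage(20.0, 3.0, 0.15)
print(f"{v_out:.2f} V")  # 20.45 V
```

A real PD controller closes this loop by measuring the drop rather than assuming a resistance, which is why the compensation tracks cable wear automatically.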
But actual maximum data throughput is hard to know. The only way to really "know" how much data can flow through a cable is with an expensive oscilloscope or cable tester. 80Gbps cables run at ~13GHz, so at minimum you need a 26GHz scope (Nyquist–Shannon sampling theorem), or more practically a 52GHz scope. And it turns out it's really expensive to measure electrical signals 52 billion times per second. The necessary devices start at $15,000 (cable signal integrity tester) [0] on the very low end, and those only work for 10Gbps USB 3.2 cables at most; for 80Gbps USB4 cables you're past $270,000 (proper 60GHz oscilloscope) [1].
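Spelling out the arithmetic (the 2x factor is the Nyquist minimum; the 4x "practical" multiplier is a common rule of thumb, not a spec number):

```python
# Back-of-envelope bandwidth arithmetic for sampling an 80 Gbps USB4 signal.
# 13 GHz is the approximate fundamental quoted above; 4x is a rule of thumb
# to capture harmonics and edge rates, not a figure from any spec.

fundamental_ghz = 13.0
nyquist_min_ghz = 2 * fundamental_ghz   # absolute Nyquist minimum
practical_ghz = 4 * fundamental_ghz     # margin for harmonics / rise times

print(nyquist_min_ghz, practical_ghz)  # 26.0 52.0
```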
On the high end, a signal integrity test setup can cost $1-2 million [2]: the base unit starts at $670,000, plus additional money for hardware-accelerated analysis, specialized active probes, and the specific PAM-3 / USB4 compliance software packages.
0: https://www.totalphase.com/products/advanced-cable-tester-v2...
1: https://www.edn.com/12-bit-oscilloscope-operates-up-to-65-gh...
2: https://www.eevblog.com/forum/testgear/uxr1104a-infiniium-ux...
This is overthinking it a bit. You mostly only need that stuff to tell you why it isn't working. If you want to know if it's up to the job, you can just measure the error rate, which just means sending a lot of data across and counting the errors. There might be some faults which only occur when the cable is in a particular position, but you can at least detect it when it happens.
The interface IC almost certainly also estimates signal quality, but it's likely hard to get that information out of it.
> modern USB-C cables [...] have "e-Marker chips" inside the plugs
If only they all did. I have a significant percentage in my pile with no e-Marker chip. They'll be the first to be culled once I get around to that, mind.
I get that to properly test a cable, you need that level of accuracy, but for home use, couldn’t you get away with a source and a receiver that are far cheaper?
If a USB4 device can output a USB4 stream and the receiver can check that stream for errors, isn’t that sufficient?
At some point you end up testing the peripheral and/or host rather than the cable. For example, cables often state that they can handle up to 240W ... but no 240W USB-PD chip has ever gone into production -- you won't even find one at the hottest USB-PD trade shows[0] in China.
It could be reasonable for computers to be allowed to trigger a data throughput test: the peripheral would state "I support up to 40Gbps of receiving/sending" and then send a simple pattern that can be generated on the fly. But a lot of devices can't receive/send that 80Gbps of data for long enough to perform a decent test -- the storage, RAM, buffers, etc. get depleted or act as bottlenecks.
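A "pattern generated on the fly" usually means a pseudo-random bit sequence that both ends can produce independently, so neither side needs storage. Here's a sketch of PRBS-7, one of the standard test sequences (polynomial x^7 + x^6 + 1; the seed choice is arbitrary):

```python
# PRBS-7 linear-feedback shift register: both link partners generate the
# same sequence locally, and the receiver just compares bit-by-bit.

def prbs7(n_bits: int, seed: int = 0x7F):
    """Yield n_bits of the PRBS-7 sequence from a 7-bit LFSR."""
    state = seed & 0x7F
    for _ in range(n_bits):
        new_bit = ((state >> 6) ^ (state >> 5)) & 1  # taps at x^7 and x^6
        state = ((state << 1) | new_bit) & 0x7F
        yield new_bit

bits = list(prbs7(127))
# A maximal-length 7-bit LFSR repeats with period 2**7 - 1 = 127:
assert bits == list(prbs7(254))[127:]
```

Because the sequence is deterministic, the receiver checks it against its own generator in real time -- no buffers to deplete, which sidesteps the bottleneck problem for the pattern itself (though not for the device's PHY).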
If you know enough to accurately interpret the measurements from such a test, you know enough to write your own program that sends 80Gbps from one computer to another and uses DMA to process it in real time without hitting storage (something a lot of peripherals likely lack the CPU to accomplish).
If you don't know enough to write those test applications, you probably don't know enough to interpret the results of a built-in test function and the measurements would confuse and frustrate a lot of well-meaning, nerdy, but under-educated consumers who make assumptions about why they're not actually getting the rated speed.
Idk, my opinion doesn't go one way or the other here. Perhaps I myself don't quite know enough to be a good judge of that concept.
0: https://asiachargingexpo.com
> For example, cables often state that they can handle up to 240W ... but no 240W USB-PD chip has ever gone into production -- you won't even find one at the hottest USB-PD trade shows[0] in China.
Your information is out of date. You can buy 240W chargers from Framework, which I assume are just rebranded Delta chargers:
https://frame.work/products/power-adapter-240w
The Framework 16 supports this 240W charging input, as well.
I think you’re overthinking the bottleneck side of things: a RAM-to-RAM transfer would be sufficient to tell whether the cable is capable of 40Gbps.
If you think you have known-good devices, all an end user cares about is whether the cable is the bottleneck. If I have a MacBook and a good NVMe enclosure, I want to know if my cable is fast enough, rather than have it quietly fall back to USB 3.2 speeds or worse.
You don't need to test at 240W. You primarily need to test that it can handle 5 amps with limited voltage drop. You can also test that it handles 48 volts but basically any cable can handle 48 volts. The chance that either one of those very mild operating conditions compromises the other when you combine them is minimal.
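That "5 amps with limited voltage drop" check reduces to inferring the cable's resistance from the measured drop. A minimal sketch, where the 0.17 ohm threshold is an illustrative pass/fail line rather than a number from the USB-PD spec:

```python
# Toy pass/fail check: infer round-trip resistance from the voltage drop
# at a known current and compare against a threshold a healthy 5 A cable
# should stay under. The 0.17 ohm default is a made-up illustrative limit.

def cable_drop_ok(v_source: float, v_sink: float, current_a: float,
                  max_resistance_ohm: float = 0.17) -> bool:
    """True if the cable's inferred resistance is within the limit."""
    resistance = (v_source - v_sink) / current_a
    return resistance <= max_resistance_ohm

print(cable_drop_ok(20.0, 19.4, 5.0))  # 0.12 ohm -> True
print(cable_drop_ok(20.0, 18.5, 5.0))  # 0.30 ohm -> False
```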
>no 240W USB-PD chip has ever gone into production
This is because the cross-sectional area of the conductor would create an inflexible cable -- and even then the connector (even though rated for it) could never handle a sustained 240W in the real world.
Fires. Fires everywhere... this is why no 240W chip exists.
src: electrician
Super helpful -- integrated this into the guide. Thank you.