Having a standard plug is great, I hope we stick with it for decades and gradually the situation will improve as everyone gets used to the standard.
USB-C gets rid of all the stupid previous decisions on the physical connectors (orientation required but not obvious, fragile clips, too large, too small), the physical side of things is now set and hopefully all devices, chargers and outlets will now converge on usb-c.
Yes, getting the right cable can make a difference, but the situation is so much better than before, partly because phone manufacturers were forced by the EU to adopt one connector early on. I’m so glad Apple’s proprietary connector is gone.
> I’m so glad Apple’s proprietary connector is gone.
Apple made Lightning when the rest of the world was still mucking about with Micro-USB, which I would argue is just about the worst connector ever in common use. The only type of cable where I routinely kept a half dozen on hand because they failed so damn often.
I do like USB-C, but despite being superior on paper, it's physically not as robust as Lightning; it's definitely more finicky. But it has more capability, which is important.
What I've read is that the Micro-USB plug is intentionally designed to fail before the connector inside the device is damaged.
I have a compulsion for fixing things, so I've seen a lot of gadgets where a connector has been broken away from a circuit board due to repetitive stress on a plug. The most common have been audio plugs -- headphone jacks in cellphones, and some connectors in musical instrument gear. I'd much prefer to replace a $5 cable than an expensive phone or gadget.
But of course it's arguable that they made it too delicate.
Now that I'm on my soap box... I've also seen a lot of damaged cables where the breakage is in the wire just as it exits one of the plugs. And a common cause is the habit of coiling your cables neatly by wrapping them as tightly as possible. Since I mentioned musical gear, I'm a working musician, and I cringe when I see how people -- even engineers -- treat cables. I always advise people to watch one or two of the ubiquitous videos where some burly roadie shows the proper way of coiling and handling a cable. I'm a bassist, and I have cables that have lasted 20+ years.
I've found the opposite, Lightning cables routinely failed for me and I haven't had a USB-C cable fail yet, and I've been using them for 7+ years.
Not sure if it's the connector or the build quality, but want to throw in the opposite experience.
How do people find Lightning cables robust? Every single one I got from Apple failed around the one year mark. So much so that I finally started buying cheap knockoffs that only lasted 6 months but cost a tenth of official ones. To compare, I haven't seen a single Micro-USB or USB-C cable fail on me whether expensive or cheap. Am I simply uniquely unlucky in the matters of Lightning cables?
USB-C is much better than micro or mini, but still lacks the robustness of A. I would far rather have something a few mm bigger but tough.
Lightning remains a better physical connection. So many USB-C connections I have are flimsy as hell.
I really have never had any issues with USB-C, lightning on the other hand was the complete opposite. Fascinating we have had the exact opposite experiences
Hmm. I don't see how. I'm poor so the quality of cables I can afford or buy is much worse than the average tech worker — I'm limited to either the cable that comes with e.g. my phone, or some 1.5m cables I bought from Amazon four years ago, and I've never had a flimsy or dodgy USB-C connection, even though those cables were put through hard work while I was homeless (and honestly I'm really, really surprised — they should be breaking by now).
Now, HDMI, on the other hand... yeesh
I disagree, lightning is more fragile as it has a single point of contact which can bend, they also become unusable if the exposed contacts get damaged or corroded.
Apart from that though it was proprietary, which is awful for lots of reasons; that’s the main reason I’m happy to see it gone.
Can't say why, but in my personal experience USBC is far less likely to stop working due to lint in the socket, which is fixable but annoying.
Lightning works great. It's a wonderful connector. Of all the Lightning-equipped devices I've ever owned (1), I've only ever had one single issue with it that required replacing a cable.
50% failure is an admirable and lofty bar that all electrical connectors should strive to meet.
Lightning is so awesome and universal that Apple has never even bothered fitting it to a pedestrian device like a computer, and has reserved it for only their most very-exclusive, high-tech devices (like the portable telephones and mice that were once available at astutely prestigious retail locations such as Wal-Mart).
Seriously, this Lightning connector is like the best Kool Aid ever. It's a shame that they stopped making it; it could have been everywhere, if only it had more time in a truly free market.
12 glorious years was clearly not enough time. It deserved so much more.
Except for how either Apple or the pinout forced it to be (excluding very rare situations) stuck at 480Mbps. USB-C can hit 20Gbps. Lightning also tops out at lower wattages.
And by the time you revise the pinout, you effectively have a different connector. Lightning was nice-ish to plug in, but the wear-component was on the expensive device, not the cheap cable, and pairing it with the shit data transfer rate makes it a terrible connector
A standard plug is great, but governments need to mandate labeling.
I'm stuck putting wire labels on every USB-C cable I own. I can't tell the difference between a 3A and a 5A cable otherwise; same for USB 2.0-only cables vs 3.1 vs 3.2 4x, whatever the fuck.
I wouldn't be against better labeling, but I've found that I don't have to worry about it too much, day to day.
USB-C has allowed me to grab one decent two-port charging brick, two solid 6ft cables, and charge just about everything I own just by keeping those in my backpack. If I think I'll need to move any data fast, etc., I just throw my one good USB4 cable in my bag, too.
I will admit, though, that I've had some crappy situations at work where it turned out my flaky monitor setup was due to the stupid work-provided docks coming with cables that only supported 10Gbps. Better labeling would've solved those ones.
Yeah, every cable should have a 3-digit number or something with a unique capacity lookup.
You can just throw away the low-spec cables BTW.
One of my pet peeves with USB-C is that many laptop manufacturers went "great, less space occupied, we can push the ports closer together to make space for something else", but many USB-C devices (particularly USB sticks ...) have inherited the dimensions of USB-A. So there is not enough space for a plug and cable, e.g. I can't use my YubiKey while my monitor is connected to the laptop.
A short USB-C extension cable won't do the trick?
Ref: https://www.amazon.com/Extension-Extender-0-65ft-Thunderbolt...
Those are out of spec, though, so they should be used with caution.
It would, but that's the Apple solution, a dongle.
Better to design it right the first time, which I think is the OP's point.
https://randsinrepose.com/guides/usb/usb-guide.html
This is 100% Claude-generated, and without citations I'd be very careful about trusting it. I wonder why whoever prompted this into existence would not include actual references and sources of information.
disclaimer: me -> everyday CC user, so trust me, this thing loves to spit nonsense.
https://randsinrepose.com/guides/usb/sources.html — every single fact is sourced.
Written by Claude, too. (well, "Grumbles", as the footnote says)
I don't particularly care if it's right or not but this is ...weird. Especially from Rands.
I can't parse what the idea is here, like, what's being communicated and why. The "minimal writing" version says too little, the "throw everything and the kitchen sink version" says too much. And enough of both is slop (meaning, unneeded) that it's hard to orient yourself and find a guidepost, if there is one.
And I love using AI, and my reading comprehension scores have never been below 99.9%. Idk why I'm even sharing that. It's just, it's not me, it's not some battle I'm fighting, it really is a real problem, not just "oh it's Claude", it's bad writing in an alien way from an author I've always loved.
EDIT: After my 11th minute and 4th read on this, it has become clear to me that the idea is, you don't want to use the cable that comes with your iPhone for general USB data transmission because it is slow. The noise in the short version is USB IF, 5gbps, MacBook Neo.
Yes, I gave up reading halfway and came here for the discussion, because the style of writing was so bad and it doesn’t really seem to have a point to make.
Not sure what value someone generating slop like this thinks they are adding, but I think it’ll become a strong social stigma to generate articles, and people will later be very embarrassed by all this slop.
USB-C is in fact completely fine in normal use, and cheap cables are about the only problem with it.
No need to overthink it. USB cables should just label themselves with their bandwidth - it's not rocket science. Lots of other kinds of cables have a similar requirement. And I guess their maximum watts too. Admittedly I'm not sure why so few USB cables do this.
I'd very much rather not have a new connector shape every time the technology improves and devices and cables gain new capabilities. The benefit of where USB-C is at, is the new stuff is backwards compatible with previous generations. The complaints in the early years - about one connector, unpredictable capabilities - were wrong. It took time for this benefit to accrue.
Also all the version numbers and brand names have been confusing, but the bandwidth is just a single number that goes up each generation and covers most of the issues now. There are just a few edge cases this doesn't cover these days.
Most USB C cables do have a label, but it's an electronic one. Desktop and mobile OSes could do a better job of surfacing this information for the user.
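For what it's worth, Linux already exposes some of this through the kernel's typec class in sysfs. Here's a minimal sketch that dumps whatever cable identity attributes are present, assuming a kernel with typec support and a PD-capable port; the exact paths and attribute names vary by kernel version and hardware, so treat them as assumptions:

```python
# Sketch: print what Linux's typec class reports about attached USB-C cables.
# Assumes CONFIG_TYPEC and a PD-capable controller; the sysfs layout
# (/sys/class/typec/port*-cable) differs across kernels and hardware.
from pathlib import Path

def dump_cables(root: str = "/sys/class/typec") -> None:
    for cable in sorted(Path(root).glob("port*-cable")):
        print(f"== {cable.name} ==")
        for attr in ("plug_type", "identity/id_header",
                     "identity/product", "identity/product_type_vdo1"):
            f = cable / attr
            if f.is_file():
                print(f"  {attr}: {f.read_text().strip()}")

if __name__ == "__main__":
    dump_cables()  # passive (non-e-Marked) cables show up with no identity
```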
Or they could simply be labelled.
In this way, I would be able to see (using the advanced, integrated bionic vision system that I've carried with me and used every day I've been alive) what it is that I have before me instead of plugging them in one at a time to some electronic oracle to try to discern the details of the invisible magic inside.
>No need to overthink it. USB cables should just label themselves with their bandwidth - it's not rocket science.
And yet, this requirement already misses the other thing it should state: its power rating. Because even two cables with the same bandwidth can have widely different power ratings, and thus powering capacity or charging speed for different devices.
I have no respect for a man who can't label a cable with more than one figure.
Don't take this comment too seriously, just a curiosity.
Powering capacity sometimes matters, but are there any devices out there where the charging speed would be meaningfully different? As in, they use significantly more than 60 watts to charge? (I looked up some of those super fast charging phones and they don't seem to be following the USB standards in the first place.)
In an alternate world, Ethernet took on the role of the universal serial bus, and we have laptops that charge via PoE, but only on one of their ports (the others are usable for peripherals --- with protocols running over Ethernet too, of course). But the same confusion regarding power and speed capabilities exists.
We'd have to invent a new connector first. It's too thick for modern laptops, not to speak of cell phones.
Also, RJ45 is terribly fragile if you keep plugging and unplugging it, eventually that latch will break. And copper can barely support 10G and is terribly power hungry when it does that. And the cables get thick and inflexible.
The 8-pin modular connector found on most Ethernet cables has several sins, but it has one huge redeeming feature, a feature I wish every cable had: it is easy to field-terminate. Have fun putting a new end on nearly any other cable.
Lenovo has re-invented this particular wheel to fit in laptops, some ThinkPads come with a proprietary Ethernet port which is around the size of USB-C, just with Ethernet signals. And you can get a passive breakout adapter to convert it to RJ45 (idk if it's included with the laptop).
https://www.lenovo.com/us/en/p/accessories-and-software/cabl...
> It's too thick for modern laptops
Nah, there's enough space for an RJ45 connector on the 0.48" thick E7270, so there's certainly enough space for one on the 0.61" Macbook Pro 14. The trick is putting the connector on the display hinge.
Laptops no longer come with ethernet ports because (a) wifi is good enough for most people, most of the time; (b) apple went USB-C-only in ~2018 and other 'premium laptops' copied it; and (c) by the time that trend reversed and laptops started re-adding hdmi and usb a ports, demand for ethernet connectors was lower than ever.
> copper can barely support 10G and is terribly power hungry when it does that.
AFAIK, Thunderbolt cables are also copper - so what trickery do they use for supporting USB4-80? I believe both connectors use differential pair wires for signalling.
The ix.industrial ethernet connector is a thing. I hate it, but it's a thing.
Even though both USB and Ethernet transport bits, the surrounding ecosystem is so different that it couldn't really be a replacement.
Devices plugged into an Ethernet network are true peers, but USB is master-slave by necessity. Ethernet devices have unique addresses, but USB devices can be anonymous, only identified based on the port they're plugged into. Ethernet is best-effort with buffering and packet dropping, but USB provides guaranteed delivery with tightly bounded latency. Ethernet signals must travel up to 100 meters but USB requires the host and device to be within a few meters. You could reuse the physical wires, maybe (we already do! USB runs on twisted-pair) but nothing else, from the connector to the topology, is usable.
I modded a laptop to charge over PoE in 2007. Before realizing that the places that had PoE, and the places I wanted to charge my laptop, had nearly zero overlap. It was virtually useless in practice, but I still love the idea.
I have not yet made a laptop to output PoE. Though it would be tremendously useful for provisioning IP cameras, there are dedicated thick-tablet-shaped devices for that, which do source PoE from their batteries.
Ethernet is different in part because it's for much longer cable runs (both average and maximum) than usb is. It's a lot easier to maintain signal integrity for 10 gbps or 40 gbps when you're dealing with a few meters maximum (0.8m max I guess for usb4?).
There are DAC cables for 10G and 40G (actually 4x10G) Ethernet which are only a few meters. The newer USB standards' physical layer and other characteristics actually resemble Ethernet in many ways.
People would still complain that you can pick up the wrong cable and it won't work for 10GbE, and that the ports look the same but some work at 10Mbit and others at 2.5GbE!
Some can even give and receive power and look the same as others that can't!
I wish they would have developed a data-over-powerline protocol specifically to link your monitor's USB to your PC's. Especially for computers 20+ years ago, almost everyone plugged their monitor into the same AC power jack as the PC. They could have had a USB 1.x connection between the monitor -> AC power -> power supply -> motherboard, so you could plug your slower USB devices into your monitor and skip the USB cable. Apple had this, I guess, with the iMac, and I know about powerline Ethernet devices, but I think they skipped the most obvious use case.
That's because anything-over-powerline is absolutely and objectively terrible.
You already have a high throughput data cable between your PC and monitor. Carry USB over displayport or whatever. At least then you can use more than one PC on an entire city block.
Per Dave Barry
"The plug on this device represents the latest thinking of the electrical industry's Plug Mutation Group, which, in a continuing effort to prevent consumers from causing hazardous electrical current to flow through their appliances, developed the Three-Pronged Plug, then the Plug Where One Prong is Bigger Than the Other. Your device is equiped with the revolutionary new Plug Whose Prongs Consist of Six Small Religious Figurines Made of Chocolate. DO NOT TRY TO PLUG IT IN! Lay it gently on the floor near an outlet, but out of direct sunlight, and clean it weekly with a damp handkerchief."
It would help if computers / phones had an easy way to just identify a cable when you plug it in. Is this hard to do or just something normal people never care about?
I guess you need control over both ends of the cable. You can buy dedicated cable testers like https://treedix.com/products/treedix-usb-cable-tester-usb-c-...
I have enjoyed my Treedix - now almost every cable I have has coloured labels for what it supports and what ends it has (handy when you're in a rush.)
On the downside, it has highlighted what a cowboy industry manufacturing USB-C cables is.
The cable can report what it "thinks" it is, and in fact, modern USB-C cables do this: they have "e-Marker chips" inside the plugs which communicate with whatever they're plugged into and enumerate their belief as to their capabilities. The thing is, manufacturers can set the e-Marker chips to spew lies, or a cable that used to support 80Gbps got slightly damaged after 6 months of use and now only reliably transmits 10Gbps.
Power capacity is relatively easy to measure ad-hoc via voltage drop from one end to the other...USB-PD controllers already do this and can even fine-tune the voltage to make sure that if the device receiving (sinking) power needs 20V they'll send 20.4V or 20.9V to compensate for voltage drop so that the charging device gets 20V on its end.
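The arithmetic behind that compensation is plain Ohm's law. A toy sketch, where the 80 mΩ round-trip cable resistance is an assumed illustrative value, not a spec figure:

```python
# Toy Ohm's-law model of IR-drop compensation in a USB-PD source.
# r_cable_ohm is an assumed, illustrative round-trip resistance.
def source_voltage(v_target: float, current_a: float,
                   r_cable_ohm: float = 0.080) -> float:
    """Voltage the source must output so the sink still sees v_target."""
    return v_target + current_a * r_cable_ohm

# A 20 V / 5 A (100 W) contract over an 80 mOhm cable drops 0.4 V,
# so the source sends ~20.4 V to land 20 V at the device.
print(f"{source_voltage(20.0, 5.0):.1f} V")  # -> 20.4 V
```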
But actual maximum data throughput is hard to know. The only way to really "know" how much data can flow through a cable is with an expensive oscilloscope or cable tester. Because 80Gbps cables run at ~13GHz, so at minimum you need a 26GHz scope (Nyquist–Shannon sampling theorem) or more practically a 52GHz scope. And it turns out it's really expensive to measure electrical signals 52 billion times per second. The necessary devices start at $15,000 (cable signal integrity tester) [0] on the very low end and only work for max 10Gbps USB 3.2 cables, or past $270,000 for 80Gbps USB4 cables (proper 60GHz oscilloscope) [1].
On the high end, each signal integrity test device can actually cost $1-2 million [2] where the base unit starts at $670,000 plus then spending additional money for hardware-accelerated analysis, specialized active probes, and the specific PAM-3 / USB4 compliance software packages.
0: https://www.totalphase.com/products/advanced-cable-tester-v2...
1: https://www.edn.com/12-bit-oscilloscope-operates-up-to-65-gh...
2: https://www.eevblog.com/forum/testgear/uxr1104a-infiniium-ux...
This is overthinking it a bit. You mostly only need that stuff to tell you why it isn't working. If you want to know if it's up to the job, you can just measure the error rate, which just means sending a lot of data across and counting the errors. There might be some faults which only occur when the cable is in a particular position, but you can at least detect it when it happens.
The interface IC almost certainly also estimates signal quality, but it's likely hard to get that information out of it.
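In that spirit, a crude end-to-end check needs nothing exotic: push a seeded pseudo-random stream through the link, read it back, compare, and time it. A minimal sketch, assuming the device shows up as a mounted drive (the mount path is hypothetical); note that link-level CRC/retry means a marginal cable usually manifests as collapsed throughput or disconnects rather than flipped bits, and OS caching can mask the read-back:

```python
# Crude cable stress test: write a reproducible pseudo-random stream,
# read it back, count mismatching chunks, and report overall throughput.
import random, time

CHUNK, CHUNKS = 1 << 20, 1024  # 1 MiB x 1024 = 1 GiB (assumed test size)

def stress(path: str, seed: int = 42) -> None:
    rng = random.Random(seed)
    t0 = time.time()
    with open(path, "wb") as f:
        for _ in range(CHUNKS):
            f.write(rng.randbytes(CHUNK))
    rng = random.Random(seed)  # regenerate the identical stream to verify
    bad = 0
    with open(path, "rb") as f:
        for _ in range(CHUNKS):
            if f.read(CHUNK) != rng.randbytes(CHUNK):
                bad += 1
    secs = time.time() - t0
    print(f"{bad} bad chunks, {CHUNKS / secs:.0f} MiB/s overall")

stress("/mnt/usb-drive/test.bin")  # hypothetical mount point
```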
> modern USB-C cables [...] have "e-Marker chips" inside the plugs
If only they all did. I have a significant percentage in my pile with no e-Marker chip. They'll be the first to be culled once I get around to that, mind.
I get that to properly test a cable, you need that level of accuracy, but for home use, couldn’t you get away with a source and a receiver that are far cheaper?
If a USB4 device can output a USB4 stream and the receiver can check that stream for errors, isn’t that sufficient?
Super helpful -- integrated this into the guide. Thank you.
> https://github.com/darrylmorley/whatcable
This was on Show HN only yesterday.
Probably can't tell you anything about the other end of the cable though.
> Is this hard to do or just something normal people never care about?
If I believed in conspiracies I'd say the USB consortium or mafia or whatever it's called is pressuring software developers to not display that info. Otherwise they'd have "normal people" with torches and pitchforks at their door.
It violates every product person's wish to be “simple”.
There’s a reason that Windows barely shows any errors until the system fully halts.
Windows will throw up warnings when the disk space is nearly empty, when it detects driver instability, when RAM is full and page files can't keep up, when a specific application is draining your battery, when your files aren't backing up right, and all other kinds.
The problem with most of those is that either users don't care until it's too late ("I need to get this done now, I'll delete files later"), third party applications are the cause and Windows can't/shouldn't interfere (did a program memory leak or is the user pushing the boundaries of what the system can handle?), or because there's not much the user can do about it ("your GPU driver crashed", well gee, my drivers are up to date, let me spend half a month's wages on a new GPU then, shall we?).
The only "too late" errors I've seen on Windows are when something very important has crashed and the system needs to shut down for data integrity (crss.exe crashing on school computers comes to mind, though I doubt that was the fault of Microsoft), or when something unpredictable went wrong, like a file ending up corrupt because of a failing hard drive or flipped bit in memory.
Microsoft actually created a dedicated screen to monitor errors and failures of all kinds (https://www.elevenforum.com/t/view-reliability-history-in-wi...) that's been around since Vista. It used to open up automatically if you clicked a popup after certain errors, but it appears Microsoft eventually stopped doing that. Going by how many "today I learned" posts I find when I look up the feature, I'm guessing nobody who actually understands what the screen does ever used the feature.
They now have the option to silently add this kind of detail to logs and have Clippy find answers to "why is my computer odd/slow" only when asked. For a long time I felt like companies leaving product decisions to the Occamist (or the closely related lazy programmer) was a superpower for competing against larger organizations that usually don't, but we may get a run for our money from emulated simplicity.
This is the right idea.
“The lie”, “The Gap”, “The Trap”…
Ugh.
I get the frustration over standards for high speed and high power applications. I note this:
For many/most applications, 5V/1A power + 480Mbps USB 2.0 data is supported on every or almost every USB cable and device, and exceeds requirements. USB-C being ubiquitous and capable of these makes it the most consolidated/universal power + data standard I have experienced in my life. It's also a small connector that's easy to plug in.
There are exceptions: charging your laptop or phone benefits from higher current. External drives or other mass data transfer benefit from high speed. But when I look at electronic devices (computer peripherals and otherwise), most are fine with USB-C for power and data, not coming close to the limits on either.
> Charging your laptop or phone benefits from higher current.
And voltage. Mostly voltage.
And on top of that, Apple has that thing where only some devices can charge from their adapters. I have a special adapter just for non-Apple things because the white bricks (despite the usb-c) sometimes just refuse to give power to things. So frustrating.
Mostly, that's non-compliant devices. Doesn't make it work any better, but I wouldn't assume Apple is doing it wrong here.
USB-C ports aren't allowed to provide power until after configuration, but a lot of USB-C chargers provide 5V regardless. This is wrong, but it does mean you can use a dumb C-to-micro cable which doesn't include the necessary electronics. (A pull-down resistor at least.)
And of course there's no way to tell by the looks of the cable.
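For the curious, the "necessary electronics" is mostly termination on the CC line: a sink advertises itself with an Rd pull-down, and a source advertises its current capability with one of three Rp pull-ups. A sketch of the resistor values as I recall them from the Type-C spec (Rp pulled up to 5 V); these are from memory, so verify against the spec before relying on them:

```python
# USB Type-C CC-line resistor signalling, values as I recall them from the
# spec (Rp pulled up to 5 V) -- treat as from-memory, verify before use.
# A legacy C-to-micro/A cable must embed Rd so a compliant source enables VBUS.
RD_OHMS = 5_100  # sink pull-down on CC

RP_TO_CURRENT = {
    56_000: "default USB power (500/900 mA)",
    22_000: "1.5 A at 5 V",
    10_000: "3.0 A at 5 V",
}

def advertised(rp_ohms: int) -> str:
    """Current a source advertises via its Rp pull-up."""
    return RP_TO_CURRENT.get(rp_ohms, "unknown Rp value")

print(advertised(10_000))  # -> "3.0 A at 5 V"
```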
Yeah, this is right. I bought a cheap wireless mouse with a USB-C port for charging. None of the USB-C chargers in my house would charge it, so after a while it inevitably went flat and I took it back to the shop - since it was faulty.
The guy in the shop plugged it in to a USB-A port via a cheap A-to-C cable, and the mouse immediately came to life. Of course. I felt like an idiot.
I didn't get a faulty unit. Whoever designed the mouse was treating the USB-C plug like a newer Micro-USB port. The mouse just expected 5V over the port. They clearly didn't bother testing it with a proper USB-C charger.
I returned it anyway and got a mouse that wasn't broken.
> This is wrong
I understand the technical reasons behind it, but in this case the actual expectation is to be able to use USB-C to charge other gadgets.
Not necessarily, Apple only implemented the latest and greatest USB charging spec in some of their devices (AVS). Their chargers speak the new protocols so their devices and their chargers will work, but a charger from a few years back can easily deliver 100W following the spec (PPS, other PD standards) but be unable to deliver high power charging on some Apple hardware.
Neither side is wrong per se, though it's quite annoying that Apple didn't implement PPS. Then again, if you're buying Apple, you should probably expect these kinds of shenanigans and be ready to buy dedicated peripherals.
Apple implements the USB-C/USB PD specs to a T and is unforgiving if you don't do the same.
At work, our quick test for if a device implements USB PD correctly is to plug it into an Apple power supply (optionally with a PD protocol sniffer in line). If it doesn't work (either no/intermittent VBUS or the wrong VBUS), it's always been the case that the device is doing something wrong.
It can be annoying, but strictly speaking it's their fault.
Whaaaaaaaaat?!
Apple, somewhat famously, build their power adapters incredibly well.
If they’re not charging something my default assumption will be: that thing doesn’t support PD.
https://youtu.be/SUlNKYI07SY?is=sJ2ICaXwxCsBJiXA
https://youtu.be/rwEh4jsVew0?is=NeRD7hAk-6KABAyc
I've run into problems with Apple chargers not charging my Lenovo laptop. (I used to be an Apple fanboy, but after a MacBook Pro that required 6 repairs, I switched to Lenovo).
I've been much happier since switching to Anker chargers, which work much better with my Lenovo and are drastically more portable than the Apple ones. They're better able to fit certain situations where the Apple brick won't fit into sockets that are close to the ground / desk, at least not without a bulky extension cable.
A bit of snark, but don't forget the Apple charger recall:
https://support.apple.com/ac-wallplug-adapter
(That said, I do think Apple's chargers were designed far better than most, and I loved that they put so much design thought into the world travel kit. Anker doesn't have the interchangeable heads, but it turns out their chargers are multi-region and a simple adapter head does the job just as well, in a smaller form factor than the Apple bricks. I still somewhat miss Magsafe as well, Magsafe 1 was excellent.)
Your blind trust in Apple is misplaced :)
It's even worse. The same USB-A-to-USB-C cable will either charge or not charge my iPhone, depending on where I plug in the USB-A part. But the port that won't charge my phone will happily charge my headset, using the very same cable. That kind of excludes the cable as the source of the suckage, and puts the blame on either the (supposed) power source or the phone. I've observed the same effect with other devices I wanted to charge, too. Some devices just won't accept certain USB power sources while others are more promiscuous.
USB-A gives 7.5W (1.5A at 5V) if advertised through BC1.2 or 2.5-4.5W otherwise, any protocols letting you draw more than that are either obsolete or proprietary.
Feels like the appropriate place to put this link: https://www.lttstore.com/products/ltt-truespec-cable-usb-typ...
IME, having the right cable is 100x more useful than having a fast one.
That said, the only weirdness I've experienced is a device that came with a USB C-to-A cable that would not take power from a C-to-C.
https://news.ycombinator.com/item?id=47984833 ?
> The USB situation.
> The lie.
> The gap.
> The names.
> The age.
> The trap.
> The buy.
> The truth.
> The chain.
> The lunacy.
> The cheat sheet.
Fucking LLMs have literally ruined the word "the" for me.
This person is apparently writing a book! I hope they put more care into it than this.
They've always written at least a little bit like this.
It read okay to me?
Also, I encourage people not to change their writing style just to avoid patterns that AI likes to use. I'm going to continue my em dashes.
This article is generated slop, and the reason to avoid it is that it’s terrible writing.
The discussion here is much more interesting IMO.
It's not ruined, it's corrupted.
Never forget Rands was in Jerkcity (now Bonequest) and had them retroactively replace his character with atandt:
https://web.archive.org/web/20170918052437/http://www.jerkci...
https://bonequest.com/715
I thought I recognized the name rands somewhere!
Wow he was totally replaced, how weird, here's another example and they totally changed the strip:
https://web.archive.org/web/20170918052444/http://www.jerkci... https://bonequest.com/712
@dang, can we please get a flag function for ai slop?
"The lie, the age, the gap, the trap, the names, the buy, the ..."
I really don't come to HN to read such stuff, and HN has been full of it for months. Please let us flag it and filter it out.
Seconding this. The lack of downvote button makes burying low quality posts like this impossible.
Yup. I have a work laptop that is meant to charge via USB ... But only one of the two ports will charge ... They are right beside each other! An evil trick at the office is to move someone's USB cable from one port to the other.
Not exactly the same situation but some older MacBooks had an issue where you had to charge from one side of the laptop and not the other. Technically, the wrong side would charge it just fine but it would also make the computer quickly overheat and throttle until it was unusable, frozen or until it crashed.
Mine works with both, but connecting the dock to a different port renames every monitor output.
Of course there's also the issue of whether your cable is suitable, and whether your charger is suitable too.
We appear to have taken a good idea and made it shit very quickly.
A USB cable suitable for all features is ten times the price of a normal cable. That's why many smartphones come with plain USB-C cables and not actually rated Thunderbolt cables.
If the USB forum enforced their specifications, everyone would be complaining that their cables are now ten times the price, and people would still buy knock-off cables.
Same goes with chargers: I bought a 100W charger that stops delivering 100W after it overheats about half an hour into a session. I could spend twice as much on a charger that sustains the charge, but I probably wouldn't have bought that charger at all for that price.
USB-C would either be branded a bullshit expensive standard (like Apple's Thunderbolt cables are generally regarded) or an incomplete standard that gives manufacturers too much leeway.
I, for one, am quite happy that I can just buy a USB-C charger now rather than spend 180 euros on an OEM replacement, even if I occasionally need to throw a cable into the "garbage that came with an accessory" bin.
> made it shit very quickly
What? The USB mafia has been at it since usb 1.1 or at best 2.0...
Nothing is stopping you from buying those $100+ USB4/Thunderbolt 5 cables that can do everything all at once.
I mean, it's dumb to charge a phone with it, since you don't need 80Gbps capability, but it'll fit your requirement of not being confusing :P
thinkpad?
Just switch to a different brand then.
Are other brands any better?
no
What in the slop is this.
Unfortunately, the USB label is trying to capture too many things and they really should've learned their lesson with USB 2.0 but they didn't.
So USB 1.1 was 12Mbps (theoretical). USB 2.0 was 480Mbps (theoretical)... kind of. It got complicated because a distinction was made between USB 2.0 Full Speed and USB 2.0 Hi-Speed. "Full" Speed was just USB 1.1 (12Mbps). USB 2.0 Hi-Speed was the 480Mbps. I assume they didn't want to confuse consumers who might wonder if they can plug USB 1.1 and 2.0 together, but they just created more confusion. Nikon famously started saying USB 2.0 for Full Speed, as just one example.
So the version number is useless to consumers and should never be used.
This got a whole lot worse with USB 3.0+ because more capabilities got added to the standard but not all cables supported them so you could look at a cable and have no idea what it could do. Capabilities include:
- Data. This started at 5Gbps for SuperSpeed but has gone higher with subsequent versions.
- Power (max wattage varied)
- USB Alt Mode (DP, HDMI or TB over USB-C)
So how do you capture at least 5 capabilities of a cable? You can't make a cable do everything. That's prohibitively expensive and also massively limits cable length.
Whatever the case, saying things like "USB 3.2 Gen 2" was not the answer.
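To make the renaming mess concrete, here it is as a lookup table, compiled from the public USB-IF naming as best I recall it (double-check before relying on it):

```python
# The USB rebranding mess, one signalling rate per row (from memory).
USB_NAMES = {
    "USB 1.1 Full Speed":               "12 Mbps",
    "USB 2.0 Hi-Speed":                 "480 Mbps",
    "USB 3.0 / 3.1 Gen 1 / 3.2 Gen 1":  "5 Gbps",
    "USB 3.1 Gen 2 / 3.2 Gen 2":        "10 Gbps",
    "USB 3.2 Gen 2x2":                  "20 Gbps",
    "USB4 Gen 2x2":                     "20 Gbps",
    "USB4 Gen 3x2":                     "40 Gbps",
    "USB4 v2 (Gen 4)":                  "80 Gbps",
}
for name, speed in USB_NAMES.items():
    print(f"{speed:>8}  {name}")
```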
Afaik alt mode is a data stream, so as long as your cable is not gimped (e.g. charging only) and supports USB3 data streams at sufficient speeds it ought to work?
Which just gives two properties to care about: data rate and power. I can’t remember a usb plug which didn’t have the space to add 2 numbers / 8 characters.
Yet another interesting article with grey text on a white background, making it very hard for me to read.
I wish people would realize doing this can lock out people with some eye issues.
Fixed, thanks for the feedback.