It cannot be emphasized enough how much the recent USB specifications are dropping the ball. I wouldn't mind paying a small premium just for a guarantee that things will Just Work when I connect them.
Instead, USB device manufacturers want to cost optimize everything, which probably forced the standard to become so Byzantine. Understandable from a manufacturer's point of view, but terrible from an end user perspective.
I lived through the pre-USB world, and things were even more "fun".
In all seriousness though, the current state of USB is ridiculous.
My wife innocently asked me why her USB-C phone isn't triggering Android Auto in the car, and straight away you know it's because she bought a USB-C power-only cable without knowing the difference.
Our second car, a VW, comes with its own branded cables for this very purpose. My guess is that they got so many customers calling them about problems that it made sense for them to throw it in.
They're pretty nice too; they're the only made-in-Germany USB-C/Lightning cables I have ever used.
Same with Audi. Funnily enough, we got an Audi A4 a while ago and decided to include Android Auto for obvious reasons. The car supports Android Auto only (no CarPlay), but came with a USB-to-Lightning cable. We were never asked which cable we wanted, so I assume that's standard.
TBH, all that's required for this label is that the final product is assembled at a plant in Germany. So what usually happens is "bulk ship from anywhere, have a plant for packaging the individual items in Germany."
Really? So people are supposed to plug their phones and laptops into public USB outlets with full USB cables? Given how many USB security vulns there are, I'm glad I can use a power-only cable when plugging into a third-party USB plug, even if the USB standard thinks these cables should not exist…
Power-only cables really are not a thing for USB-C. You need at least the configuration channel in order to negotiate charging voltage and maximum current.
Maybe it's possible to make one lacking both USB 2 and 3 data, though I haven't seen one yet.
Nor have I seen a public USB-C outlet, for that matter, and I probably wouldn't be plugging my laptop or phone into one anyway: My own charger protects me not only against bad intent, but also against cheap charging circuits that might or might not accidentally expose 220 volts to my laptop's mainboard in case of faults.
> Power-only cables really are not a thing for USB-C. You need at least the configuration channel in order to negotiate charging voltage and maximum current.
Ouch. USB-C is even more fucked up than I thought… How does that even work with USB wall chargers?
> Nor have I seen a public USB-C outlet, for that matter
There are plenty of USB-A outlets everywhere (airports, trains, hotels, etc.), and most recent Android phones have only a USB-C port…
> Ouch. USB-C is even more fucked up than I thought… How does that even work with USB wall chargers?
If they want to supply more than 5V/3A, they need to support the power delivery protocol too.
> There are plenty of USB-A outlets everywhere (airports, trains, hotels, etc.), and most recent Android phones have only a USB-C port…
Exactly: These are USB-A outlets. These are possible to implement using only a resistor network to announce the maximum charging current and always supply 5V, which is much easier to implement than variable voltage and the power delivery protocol. USB-C can do this too, but only up to 5V/3A.
USB-C devices are usually backwards compatible with all three of these when using an A-to-C cable or adapter: Legacy USB-A current identification via the D+/D- pins, USB-C resistor-based current identification via the configuration pins of USB-C and USB power delivery.
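To put rough numbers on the resistor-based identification: the source advertises its current capability via a pull-up (Rp) on the CC pin, the sink has a 5.1 kΩ pull-down (Rd), and the sink classifies the resulting CC voltage. A minimal sketch, with approximate detection thresholds taken from the Type-C spec (exact values and margins vary by spec revision):

```python
def advertised_current(cc_volts: float) -> str:
    """Classify a source's current advertisement from the voltage the
    sink sees on the CC pin (assuming the standard 5.1 kOhm Rd pull-down
    on the device side). Thresholds are approximate."""
    if cc_volts < 0.2:
        return "no source attached"
    elif cc_volts < 0.66:
        return "default USB power (500 mA USB 2.0 / 900 mA USB 3.x)"
    elif cc_volts < 1.23:
        return "1.5 A at 5 V"
    else:
        return "3.0 A at 5 V"
```

The point is that a dumb charger only needs the right resistor; the sink does all the measuring.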
You need an active USB condom (one that man-in-the-middles the power negotiation for good, but passes no other data). I have one because a friend made a few; I've never seen any for sale.
They are, and frankly the fact that they need to exist is the single most egregious example of why USB isn't what I need it to be. What I'm surprised I haven't seen is a male-to-female adapter with a switch on it that physically kills the data lines.
As far as I understand, that would make the cables near-unusable for actual charging as without the ability to negotiate power settings (over the data lines) they have to default to an extremely low charging rate. So instead of a physical switch it would have to be a smart device with logic that allows for the charging "handshake" but kills other data transfer.
It's just a choice. A smart charger can detect if a battery is present, or at least some load that is safe to dump power into.
You'll note that many chargers started dropping handshaking because it was inconvenient. To be compatible with everything you need it, but there are many that don't ask, though they may only work with the equipment they were sold with (e.g. defaulting to 5 V 2 A charging).
It's not a question of chargers: It's a question of laptops in standby.
The historical design intent of USB is that, even when your laptop is in standby, it still powers devices like your keyboard (so you can press keys to wake it up), and in exchange, devices promise to draw less than 2.5 milliamps (12.5 milliwatts at 5 volts) until they've negotiated permission to draw more from the host. After all, it'd suck if your laptop battery was going down noticeably even when the laptop was in standby.
Of course, loads of vendors of cheap devices ignore this - the makers of a $5 rechargeable bike light or USB fan aren't going to put in the components needed to negotiate charging speed. But in principle if you made a USB cable with only the power pins connected, compliant devices should only charge exceptionally slowly, if at all.
(Wall chargers, instead of enumerating as a USB host to negotiate power, used to put a certain value of resistor across the data lines, to signal what current the device could draw - this hasn't always been standardised, which is why phones and USB chargers can be incompatible)
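To illustrate the non-standardised part: a device probing a legacy charger might classify it from the D+/D- voltages. The BC 1.2 short is from the spec; the Apple and Samsung values below are commonly reported vendor conventions, and the exact voltages and current figures here are illustrative, not official:

```python
def classify_charger(d_plus: float, d_minus: float, shorted: bool) -> str:
    """Hypothetical legacy-charger classifier from D+/D- signalling."""
    if shorted:  # BC 1.2 dedicated charging port: D+ shorted to D- (< 200 ohm)
        return "BC 1.2 dedicated charging port (up to 1.5 A)"
    # Proprietary voltage dividers on D+/D- (commonly reported values):
    if abs(d_plus - 2.7) < 0.3 and abs(d_minus - 2.0) < 0.3:
        return "Apple-style 2.1 A"
    if abs(d_plus - 2.0) < 0.3 and abs(d_minus - 2.7) < 0.3:
        return "Apple-style 1 A"
    if abs(d_plus - 1.2) < 0.3 and abs(d_minus - 1.2) < 0.3:
        return "Samsung-style 2 A"
    return "unknown - fall back to 500 mA"
```

With several incompatible conventions on the same two pins, a phone and a charger can easily disagree about how fast it's safe to charge.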
Sounds like historical revisionism. Power limiting is possible without negotiation. It's easiest, for me, to find references to the safety aspect more than anything else.
Note also a fair number of laptops (read: every one I have used and checked) use PS/2 internally because it is interrupt driven and even lower power than USB. There are also plenty of laptops that advertise high current phone charging while off, some of which do negotiation, some of which don't.
Yeah, USB is seldom used inside laptops, but you might be interested to know that it is becoming increasingly common to see HID over I2C [0] used inside of laptops instead of PS/2.
I've definitely seen it. Reusing HID is kind of clever, but the protocol is extremely bloated and hard to parse.
Ignoring that, I'm not sure it's an improvement. Most of the power savings comes from having the keyboard raise an interrupt to wake the computer. Transporting HID the way USB does means the keyboard can't wake the computer; the host needs to remain on to poll its USB interrupt endpoints.
> A smart charger can detect if a battery is present, or at least some load that is safe to dump power into.
> defaults to 5V 2A charging
i don't think you understand how power transfer works.
voltage is applied (by the charger), and current is drawn (by the device that wants to charge). a charger cannot "dump power into" anything.
when chargers and devices attached to them engage in some sort of negotiation, that's not the device telling the charger what to do, that's the charger telling the device what its limitations are.
if you attempt to draw 3A from a device that can only do 2A, the voltage will drop outside of the specified range. to the extent that a charger can limit the charging current, it does so by dropping/cutting off the voltage until the current goes down. which isn't ideal for devices.
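to put numbers on it (a purely illustrative resistive-load sketch; real charging loads are switching converters, not fixed resistors):

```python
# the charger fixes the voltage; the load determines the current it
# draws. a supply rated 5 V / 2 A cannot push 3 A into a load - the
# load would have to present a lower resistance, at which point the
# supply's voltage sags or its protection trips.
V = 5.0            # volts, fixed by the charger
load_ohms = 2.5    # a phone charging hard looks like roughly 2.5 ohm here
I = V / load_ohms  # current drawn by the load: 2.0 A
P = V * I          # power delivered: 10.0 W
```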
(perhaps you're confused by thinking that USB "chargers" are like battery chargers, and arbitrary USB devices somehow act like batteries. that's all wildly wrong.)
I'm aware. The potential difference "dumps" current into the load in proportion to its voltage. (A constant-current supply would raise the voltage to achieve the requested current, but the USB bus is not a constant-current supply. There are multiple ways your yes-but is annoying and unhelpful.)
The main concern behind negotiation seems to be safety of the user and safety of the power supply. Detecting a safe load means detecting a not-short.
The basis for the 5V power negotiation on USB is extremely silly. The power supply is already current limited, protecting you from shorts, and the supply voltage is ~5V, quite far from anything dangerous.
> The basis for the 5V power negotiation on USB is extremely silly.
This may be true for wall chargers but not devices capable of supplying power to peripherals while on battery power themselves.
For example, an external hard drive should not have to figure out that a large tablet can power it, while a phone won't, through trial and error, by attempting to spin up the disks multiple times.
True, but opt-in seems to do what users expect most of the time. Enough devices expose their full power without negotiation that there are workarounds like sleeping a USB port instead of relying on power negotiation.
Devices are going to misbehave anyway, it's probably more important to default to a reasonable level of mostly works.
I've got a (USB2) power monitor that includes a chip (just an STM8L051F3 microcontroller) to manage power negotiation and keep data disconnected. So it negotiates power with host and device separately, without ever needing to connect the data lines from host to device directly. I expect there are or will be versions for USB 3.<whatever I've lost track>.
For USB 2, widely-available battery packs compatible with whatever current negotiation standards your devices require work just as well, and have the added benefit of storing power.
Less generally, these batteries can also break ground loops, which comes in handy when connecting a power supply to both a USB device and an attached analog audio device (e.g., iPhone and wired noise-canceling earbuds).
It would be more convenient. Given that this is supposedly a security feature - no. Nope. Not at all: what is the state of the data line(s), and how do you know that it matches whatever indicator the cable uses? (There's no way to tell, and you don't - if software is involved, securing stuff becomes HARD, and proving security even harder.)
OTOH, a USB Y-cable physically has no data lines connected on the extra USB-A male connector; it thus provides a far stronger guarantee that there is no data travelling across them. (Not foolproof, but safer than a software-controlled switch.)
That's my point. If I can't tell whether this charging cable is secretly a USB keyboard, isn't it better to explicitly tell my cellphone to treat it as charge-only instead of relying on physical solutions?
My use case is I don't necessarily trust one end of the connection to behave, so I want the ability to just give/take power and involve exactly 0 software. I understand that's not the usual consumer story, and the best argument I can come up with for the general case is it could make any USB compatible wall plug safe to use to charge up.
I'm talking about using the phone's software to block the data connection.
If you don't trust that the cellphone won't transfer your private data via USB, I don't understand why you trust it not to do so via any wireless connection.
Spec-compliant USB-C cables are still required to have the USB 2 data lines hooked up, even "power-only" ones, and Android Auto / CarPlay don't need anything beyond USB 2.
>USB device manufacturers want to cost optimize everything, which probably forced the standard to become so Byzantine
As Intel loses its raw CPU performance advantage, its new marketing materials promote "features" and "convenience". Practically speaking, USB is defined by Intel, and it is making a visible choice to carve out a market position for Thunderbolt 4: it makes some USB4 features optional so they can be included in TB 4, adds some small minimum performance guarantees, and hopes that is enough to sell TB 4 chips to high-end AMD motherboard makers, which in turn makes its own CPUs (with built-in Thunderbolt 4) more price competitive. It is a sound business plan, but not in the best interest of consumers.
> It is a sound business plan, but not in the best interest of consumers.
The fate of a universal lingua franca connector is too important to be left to just one company. Why is it still being defined by Intel rather than a panel?
I blame a combination of poor enforcement of trademarks and a desire to let people support the latest specifications with the cheapest possible build cost.
By poor enforcement of trademarks, I mean that there should not be a plethora of things claiming to be USB which are not spec compliant, nor should people be juggling names like "USB 3.2 Gen 1x1" vs the marketing name "SuperSpeed USB".
You also see this poor enforcement with other organizations, such as DisplayPort vs DisplayLink.
This spills over to problems with cabling, such as how much wattage is supported for power delivery or what speed data transfer is available.
Oddly Intel is the one stepping up here with Thunderbolt 4, because it is now more of a certification than a specification.
You can't blame the lack of "enforcement" for the ills created by bad standards. I'm fairly certain that two officially certified USB-3 cables can brick devices even though the physical form factor is the same.
Relying on cable internals for function is just as stupid as can be.
I think USB-C would have been so much better if the CC system were saner. For example, we could have had a shift-register-based, UART-like API:
- The host queries some property, and each cable along the way appends its value to that message and passes it along.
- The endpoints either loop the message back if they support the new standard or pull down the pins if they're USB 2 only.
- It follows that if no message makes it back, we're limited to USB 2.0, and if a message does make it back, we can query/configure all elements along the path of connections. Right now this is impossible, which means there's no way to detect a non-50V-tolerant hub between two hubs.
- The programming/hardware implementation for PD and alternate modes would also be non-insane in this model. Currently PD requires stupidly complex state machines, op amps, resistor banks, or some autonomous IC to pull off, and there are so many screwups because it's such an obtuse standard.
- Supporting alternate modes, and providing more diagnostic information would be much easier.
There should have only been two passive cable types - 2.0+PD and USB 3 Gen 1 - and everything else should require a smart tag chip in the cable.
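As a toy model of the daisy-chain query described above (entirely hypothetical: the names and message format are made up to illustrate the proposal and bear no relation to the real CC/PD protocol):

```python
def query_path(elements, endpoint_is_new_standard=True):
    """Simulate the proposed query: each element in the path appends its
    capabilities; a new-standard endpoint loops the message back."""
    message = []                        # the in-flight query message
    for element in elements:            # each cable/hub appends its info
        message.append(element)
    if not endpoint_is_new_standard:    # legacy endpoint pulls the pins down:
        return None                     # no loopback -> assume USB 2.0 only
    return message                      # host sees the whole path

path = [{"type": "cable", "max_volts": 20},
        {"type": "hub",   "max_volts": 5}]   # a hub not tolerant of high bus voltage
caps = query_path(path)
if caps is None:
    print("limited to USB 2.0")
else:
    # the host can now cap the bus voltage at the weakest element: 5 V here
    print("safe bus voltage:", min(e["max_volts"] for e in caps))
```

The "detect the weakest element in the path" step is exactly what the current CC design can't do.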
This so much. The current USB @&$#-up seems to be the result of "EEs try to expand line-level encoding to create an API."
This should all be solved at the negotiation layer, even if that needs to be made more complex, so that the remaining components can be simpler and reasonably-behaved.
Instead, we got something that allows each device to be a bit more electrically simple, at the cost of ballooning complexity for the ultimate use case.
USB-IF took their eye off the ball, and wrote a spec for manufacturers, without thinking about the consequences to consumers.
At some point, it's a value trade-off between {working product for use cases} and {+$2 on BoM}.
USB-C cables have an added pair of pins (a single wire in the cable) that allow the devices to a) detect the orientation of the plug and b) do extended power negotiation for the increased power capabilities of USB-C.
If you think USB is bad, try looking at all the various things that can be connected together with M.2 slots vs. the pairs of devices that will actually work together despite having the same slot. I have a degree in electrical engineering and after years I only just realized that there are actually PCIe M.2 storage devices that only support AHCI mode not NVMe.
> I have a degree in electrical engineering and after years I only just realized that there are actually PCIe M.2 storage devices that only support AHCI mode not NVMe.
To be fair, there were only a handful of those really early on when M.2 was just getting started and OS/BIOS support for NVMe wasn't universal yet. And hardly any of them were released as retail products; they were mostly OEM-only drives. All of those drives went out of production 3-4 years before the first host devices that require NVMe and can't work with AHCI started to show up: USB to NVMe bridge chips for external enclosures. So if you haven't encountered an AHCI M.2 SSD yet, you probably never will and knowing about them is just an obscure historical curiosity.
Consumers want one cable and connector for all devices regardless of the ridiculous notion that all devices should have similar enough electrical requirements that they could use one cable. The result is USB.
It's no accident that the USB group is called the "USBIF" -- "USB Implementor's Forum". This is different from, say, the C++ standards committee which is a combo of compiler & std library developers (i.e. implementors!), enthusiasts, and some who are best referred to as "users".
Not a criticism of the USBIF* just pointing out the motivations to underline your point.
* BTW I have plenty of criticism, this just happens not to be an example.
> It cannot be emphasized enough how much the recent USB specifications are dropping the ball.
This outcome was pretty obvious; it's literally in the name of the damn thing: UNIVERSAL Serial Bus. The only way to achieve that is to lower yourself to the lowest common denominator.
But that's the point, right? For a manufacturer "universal" means the lowest common denominator whereas for an end user "universal" means...well, universal.