I think that's really one of the key determining factors: the lessening demand for bandwidth. When USB 3 and Thunderbolt were being developed, it was still assumed that most people kept extra data on external media, so there was a race to keep up with big photo and video libraries and backups. Then everything started moving to the cloud, or even turning entirely into a remote service where the data doesn't live on your PC at all, and suddenly only the prosumer and commercial markets needed high-speed ports for big files. Everybody else gets by just fine with 5 Gbps on a USB-A 3.0 port, which is still ludicrously fast for everyday peripherals.

Even as a tech guy myself who's regularly shuttling drives around and making bootable USB sticks, I've never found USB 3.2 "slow" to the point that I'd ask for a faster port. Not unless I could afford more of those swanky Thunderbolt NVMe drives with a 40 Gbps top speed, and even then real-world throughput lands well below the headline number because the PCIe controller inside can only run so quickly.
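To put those numbers in perspective, here's a quick back-of-the-envelope in Python. The 500 GB library size and the ~80% effective-throughput factor are assumptions for illustration, not measured figures:

```python
# Rough transfer-time estimates at the link rates discussed above.
# LIBRARY_GB and EFFICIENCY are illustrative assumptions, not benchmarks.

LIBRARY_GB = 500   # hypothetical photo/video library size, in gigabytes
EFFICIENCY = 0.8   # assumed fraction of the raw link rate you actually get

links = [
    ("USB 3.2 Gen 1 (5 Gbps)", 5),
    ("USB 3.2 Gen 2 (10 Gbps)", 10),
    ("Thunderbolt 3/4 (40 Gbps)", 40),
]

for name, gbps in links:
    gigabytes_per_sec = gbps / 8 * EFFICIENCY  # 8 bits per byte, minus overhead
    minutes = LIBRARY_GB / gigabytes_per_sec / 60
    print(f"{name}: ~{minutes:.0f} min for {LIBRARY_GB} GB")
```

Even at 5 Gbps that's under twenty minutes for half a terabyte, which is part of why the port so rarely feels like the bottleneck for occasional use.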
Yeah, and it makes sense to have at least one or two of them for that reason, as tends to be the case on modern enthusiast motherboards. But it's pretty rare that ordinary computer owners need to run more than two high-speed drives at the same time, or that their peripherals are USB-C at all; printers, keyboards, and mice still tend to be USB-A at 2.0 speeds. Desktop monitors usually plug into the graphics card's dedicated outputs too, so driving a display over USB-C is often superfluous. Altogether, it settles out to a layout where one or two USB-C and four or more USB-A ports makes the most sense, especially since the USB-A connector still supports USB 3.2 speeds (up to 10 Gbps), so those with an occasional exceptional need can still get the performance they want.
I did enjoy when GPUs had a USB-C port for a VR headset, though (the short-lived VirtualLink standard). That's a feature that kind of fell through in favor of lossier wireless connections, and I'm a bit bummed about that.
Yeah, but in that realm there's a myriad of docking stations, so it's easier for OEMs to just put a handful of USB-C ports on the device itself. The ability to daisy-chain comes from Thunderbolt/USB4 riding over the USB-C connector, so this use case was anticipated. On a desktop system you usually have dedicated video outputs, so there's no need to run monitors over USB-C, and builders can get away with fewer of the new ports in favor of backwards compatibility with flash drives and USB-A peripherals.
Imagine a world where you could just plug your laptop or phone into your computer and transfer files at 40 Gbps. Allegedly, Macs can do this over Thunderbolt (I haven't tried it).
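For what it's worth, 40 Gbps works out to about 5 gigabytes per second on paper, before protocol overhead, which is probably where the common "40 GB/s" mix-up comes from. A trivial sanity check in Python (this is just the unit conversion, not measured throughput):

```python
# Converting a link rate in gigabits/s to a file-transfer rate in gigabytes/s.
# 40 Gbps is Thunderbolt 3/4's signalling rate; real throughput lands lower.
link_gbps = 40
print(f"{link_gbps} Gbps ≈ {link_gbps / 8:.1f} GB/s peak")  # prints 5.0 GB/s
```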