Comment by tuetuopay

9 hours ago

Networking only uses bits/s. Nobody in the networking world talks in bytes/s, and pretty much nobody in the data transfer world does.

The only industry that talks in bytes/s is part of the storage space, because it deals with files, which are measured in bytes. And even they use both: the data link is in bits/s (e.g. SATA 6 Gb/s is 6 Gbps, NVMe uses the same bits/s as PCIe (1)) while the drive itself is usually rated in bytes/s (µSD cards, NVMe SSDs, etc.).
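
To make that concrete, here's the usual back-of-the-envelope math (a sketch, assuming SATA III's 8b/10b line encoding and ignoring protocol overhead) showing how a 6 Gbit/s link turns into the familiar ~600 MB/s drive ceiling:

```python
# Illustrative arithmetic only: why a 6 Gbit/s SATA link tops out around
# 600 MB/s of usable throughput (8b/10b encoding puts 10 bits on the wire
# for every 8 bits of data).

LINK_RATE_BITS_PER_S = 6_000_000_000   # SATA 6 Gbit/s raw line rate
ENCODING_EFFICIENCY = 8 / 10            # 8b/10b encoding

payload_bits_per_s = LINK_RATE_BITS_PER_S * ENCODING_EFFICIENCY
payload_bytes_per_s = payload_bits_per_s / 8

print(f"{payload_bytes_per_s / 1e6:.0f} MB/s")  # -> 600 MB/s
```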

When you look at the industry at large, throughput is virtually always measured in bits/s. HDMI is in bits/s. Video codecs measure bitrates in bits/s. Audio codecs measure bitrates in bits/s. PCIe is in bits/s (1). Ethernet is measured in bits/s. Wi-Fi is measured in bits/s. You get the picture.

The good thing about keeping it consistent is that the values are relatable. Streaming services naturally quote bitrates for video quality, and your ISP also talks in bits/s, so you can compare the two numbers directly. Bytes/s is only really useful for one-off jobs, like transferring photos from an SD card to your computer. Otherwise, it's just a unit.
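
A quick illustration of that comparison (the numbers here are assumed for the example: a 25 Mbit/s 4K stream and a 100 Mbit/s ISP plan). Keeping everything in bits/s makes the comparison trivial, and bytes only show up for the one-off copy job:

```python
# Hypothetical numbers: a 25 Mbit/s 4K stream vs. a 100 Mbit/s ISP plan.
# Both are in bits/s, so they compare directly with no conversion.

STREAM_BITRATE = 25_000_000     # bits/s (assumed 4K stream bitrate)
ISP_PLAN = 100_000_000          # bits/s (assumed ISP plan)

print(f"concurrent streams: {ISP_PLAN // STREAM_BITRATE}")  # -> 4

# Bytes only matter for the one-off job, e.g. copying a 32 GB SD card
# over that same 100 Mbit/s link:
CARD_BYTES = 32 * 10**9
seconds = CARD_BYTES * 8 / ISP_PLAN
print(f"SD card copy: {seconds / 60:.0f} min")  # -> ~43 min
```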

(1): ackhstually PCIe measures speeds in transfers/s because the raw rate includes the 8b/10b (or 128b/130b on gen 3+) encoding overhead and TLP overhead, but I digress.
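
For the curious, a rough sketch of what that footnote means in practice, accounting only for line encoding (TLP/DLLP packet overhead would shave a bit more off, so real-world numbers come in lower):

```python
# Rough per-lane payload bandwidth derived from the raw transfer rate.

GENS = {
    # gen: (transfers/s per lane, encoding efficiency)
    1: (2.5e9, 8 / 10),     # 8b/10b
    2: (5.0e9, 8 / 10),     # 8b/10b
    3: (8.0e9, 128 / 130),  # 128b/130b
    4: (16.0e9, 128 / 130), # 128b/130b
}

for gen, (rate, eff) in GENS.items():
    mb_per_s = rate * eff / 8 / 1e6
    print(f"PCIe {gen}.0 x1: {mb_per_s:.0f} MB/s")
# -> 250, 500, ~985, ~1969 MB/s per lane
```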