Comment by radicality
12 hours ago
I don’t remember the exact issues, but years ago I noticed my old Intel MacBook had noticeably higher CPU usage when connected to and using a Plugable dock that had a Realtek Ethernet chipset. Switching to WiFi reduced CPU usage. AFAIK it had something to do with poor or missing hardware offload in the Realtek chipset, so the work had to be done on the CPU.
Now I never trust anything with Realtek in it, and if I’m buying anything with an Ethernet port, I try to make sure it’s not Realtek. Is this still a valid concern, or is Realtek better now?
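For what it’s worth, one way to sanity-check this kind of thing on Linux is to ask the NIC which hardware offloads it actually advertises. A minimal sketch, assuming a Linux host with ethtool installed and a hypothetical interface name eth0 (not taken from the thread above):

    import subprocess
    import sys

    def offload_features(iface):
        """Parse 'ethtool -k <iface>' output into {feature_name: enabled}."""
        out = subprocess.run(["ethtool", "-k", iface],
                             capture_output=True, text=True, check=True).stdout
        feats = {}
        for line in out.splitlines()[1:]:  # skip the "Features for ..." header line
            if ":" in line:
                name, value = line.split(":", 1)
                feats[name.strip()] = value.strip().startswith("on")
        return feats

    if __name__ == "__main__":
        iface = sys.argv[1] if len(sys.argv) > 1 else "eth0"
        feats = offload_features(iface)
        # Checksum and segmentation offload are what keep that work off the CPU.
        for key in ("rx-checksumming", "tx-checksumming",
                    "tcp-segmentation-offload", "generic-receive-offload"):
            print(key, "on" if feats.get(key) else "off/unsupported")

If those show up as off or unsupported for a given adapter, high CPU usage during transfers is the expected result rather than a driver bug.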
I remember in the Intel days, the Apple Thunderbolt 1 GbE adapter would have high CPU usage when you were transferring at the full 1 Gbps.
I've had good luck with the Realtek 2.5 GbE adapters, no CPU usage issues.
And these days, even with a 10 GbE Thunderbolt adapter, CPU use is negligible, so I think things have improved across the board.
I've used tons of Realtek stuff since the early 2000s and have had only a single device misbehave: the infamous RTL8139 Fast Ethernet, which had many bad batches unleashed onto the world. I have both bad and good versions of this chip. It burned a lot of people back then, many of whom to this day stubbornly refuse to move past that trauma and keep saying that everything Realtek is bad and can never be trusted.
It’s actually kinda funny when people say they’d only use Intel NICs (because of their good experience with e1000e), but then you look at Intel’s NBASE-T (2.5/5 Gbit/s) trash fire or the X710 issues, and Intel just hasn’t been good for post-gigabit consumer-ish stuff. Granted, maybe the 19th stepping of the i225 finally fixed something, I dunno.