Comment by fragmede
3 years ago
Most clients that OP deals with, anyway. If your code runs exclusively in a data center, like the kind I suspect Google has, then the situation is probably reversed.
Consider the rise of mobile devices. Devices without a good internet connection are probably everywhere now.
It's no longer like 10 years ago, when you either had good internet or no internet at all. The number of devices on shitty networks has grown a lot compared to the past.
Almost every application I've written atop a TCP socket batches up writes into a buffer and then flushes out the buffer. I'd be curious to see how often this doesn't happen.
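The batching pattern described above can be sketched in a few lines. This is a minimal illustration, not any particular library's API: small writes accumulate in a user-space buffer and only hit the socket on flush, so the kernel sees one large write instead of many tiny ones (the class and threshold names here are made up for the example).

```python
import socket

class BufferedWriter:
    """Illustrative sketch: batch small writes, flush to the socket in bulk."""

    def __init__(self, sock: socket.socket, flush_threshold: int = 4096):
        self.sock = sock
        self.buf = bytearray()
        self.flush_threshold = flush_threshold

    def write(self, data: bytes) -> None:
        # Accumulate in user space; only touch the socket when the
        # buffer crosses the threshold.
        self.buf += data
        if len(self.buf) >= self.flush_threshold:
            self.flush()

    def flush(self) -> None:
        if self.buf:
            self.sock.sendall(bytes(self.buf))
            self.buf.clear()
```

Real stacks often get the same effect from stdio-style buffering or `TCP_CORK`/`TCP_NODELAY` tuning; the point is just that the application, not the network, decides the write granularity.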
Are you replying to the right person? I don't think I ever mentioned how you should write a program. I only said that assuming users have a good internet connection is a naive idea nowadays. (GTA 5 is the worst example in my opinion: lose a few UDP packets and your whole game exits to the main menu. How the f**k did the devs assume UDP packets never get lost?)
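The failure mode complained about above — treating one lost UDP packet as fatal — is avoidable with even a trivial retry layer. A hypothetical sketch (not GTA's actual netcode; `send` here stands in for "transmit and wait briefly for an ack"):

```python
def send_with_retry(send, max_attempts: int = 5) -> bool:
    """Retry a lossy send instead of treating a single loss as fatal.

    `send` is assumed to return True if an ack arrived, False if the
    packet (or its ack) was dropped.
    """
    for _ in range(max_attempts):
        if send():
            return True
    # Degrade gracefully (e.g. show "reconnecting...") rather than
    # kicking the player back to the main menu.
    return False
```

The point is that UDP's contract is "best effort"; any application using it has to supply its own loss handling, however simple.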
If you run all of your code in one datacenter, and it never talks to the outside world, sure. That is a fairly rare usage pattern for production systems at Google, though.
Just like anyone else, we have packet drops and congestion within our backbone. We like to tell ourselves that the above is less frequent in our network than the wider internet, but it still exists.
If your DC-DC links are regularly as noisy as shitty apartment WiFi routers competing for air time on a narrow band, fix your DC links.