Comment by manquer
7 hours ago
Workloads emerge with higher capacity, not the other way around. Everything from lossless media to virtual reality applications scales better with more available bandwidth.
An average AAA game is 100-200 GB today. That is not by accident: on the best residential internet, a dedicated 1 Gbps line, that is still roughly a 30-minute download, and for the average buyer it is easily a few hours.
A 2 TB game today would be a roughly 5-hour download on a 1 Gbps connection, and days for the median buyer. Game developers cannot even contemplate a 2 TB game unless storage capacity, I/O performance, and bandwidth all support it.
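The back-of-the-envelope math here is just size over link rate (ignoring protocol overhead and real-world throughput variance); a minimal sketch:

```python
def download_time_hours(size_gb: float, bandwidth_gbps: float) -> float:
    """Ideal transfer time for size_gb gigabytes over a bandwidth_gbps link.

    Multiply by 8 to convert gigabytes to gigabits, divide by the link
    rate in gigabits/second, then convert seconds to hours.
    """
    return size_gb * 8 / bandwidth_gbps / 3600

# 200 GB AAA game on a dedicated 1 Gbps line: ~0.44 h (~27 minutes)
print(f"{download_time_hours(200, 1.0):.2f} h")

# 2 TB (2000 GB) game: ~4.4 h on 1 Gbps, ~44 h on a 100 Mbps line
print(f"{download_time_hours(2000, 1.0):.1f} h")
print(f"{download_time_hours(2000, 0.1):.1f} h")
```

These are best-case figures; real downloads add TCP/TLS overhead and CDN throttling on top.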
Hypothetically, if I could ship a 200 TB game, I would probably pre-render most of the graphics at much higher resolutions/frame rates rather than compute them poorly on the GPU on the fly.
More fundamentally, we would lean toward less compute on the client and a more precomputed-assets-driven approach for applications. A good example from the tech world in the last decade is how we switched to distributing docker/container layers instead of just source files or built packages. A typical docker image in the corporate world exceeds 1 GB, while the source files actually being shipped are probably less than 10 MB of that. We are trading size for better control; pre-built packages instead of source were the same trade-off in the 90s.
You optimize for whatever is more scarce. Single-threaded and even multi-threaded compute growth has been slowing down. Consumer internet bandwidth faces no physics limit the way processors do, so it is not a bad idea to optimize for delivering pre-computed assets rather than relying on client-side compute.
And even at 1Gbps when I had it, the game servers couldn’t keep up.
I'll assume by "game servers" you mean "video game binary and asset distribution servers that support game stores like Steam and Epic and others".
When I paid Comcast for 1.5 Gbit/s down, Steam would saturate that downlink with most games. I now pay for service that's no less than 100 Mbit symmetric, but is almost always something like 300-600 Mbit. Steam can -obviously- saturate that. Amusingly, the Epic Games Store (EGS) client cannot. Why?
Well, as far as I can tell, the problem is that -unlike the Steam client- the EGS client single-threads its downloads and does a lot of CPU-heavy work as part of those downloads. Back when I was running Windows, EGS game downloads absolutely pegged one of my 32 logical CPUs and left a ton of download bandwidth unused. In contrast, Steam puts like eight or sixteen of my logical CPUs at roughly half utilization and absolutely saturates my download bandwidth. So, yeah... if you're talking about downloads from video game stores, it might be that whatever client your video game store uses sucks shit.
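The multi-connection pattern described above can be sketched roughly as follows. This is a generic chunked-download approach using HTTP Range requests across a thread pool, not Steam's actual protocol; the URL and chunk logic are illustrative assumptions:

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request

def split_ranges(total_size: int, n_chunks: int) -> list[tuple[int, int]]:
    """Split [0, total_size) into contiguous (start, end-inclusive) byte
    ranges suitable for HTTP Range request headers."""
    chunk = -(-total_size // n_chunks)  # ceiling division
    return [(i, min(i + chunk, total_size) - 1)
            for i in range(0, total_size, chunk)]

def fetch_chunk(url: str, start: int, end: int) -> bytes:
    # One Range request per chunk; CPU-heavy per-chunk work (decompression,
    # hash verification) in one worker never stalls the other connections.
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def parallel_download(url: str, total_size: int, workers: int = 8) -> bytes:
    ranges = split_ranges(total_size, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda r: fetch_chunk(url, *r), ranges)
    return b"".join(parts)
```

A single-threaded client that downloads, decompresses, and verifies serially is bottlenecked by one core; spreading chunks across workers keeps the link saturated while per-chunk CPU work proceeds in parallel.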
OTOH, if you're talking about video game servers where people play games they've already installed with each other, unless those servers are squirting mods and other such custom resources at clients on initial connect, game servers usually need like hundreds of kbps at most. They're also often provisioned to trickle those distributed-on-initial-connect custom resources in an often-misguided attempt to not disturb the gameplay of currently-connected clients.
I am talking about console downloads