The large file sizes are not because of bloat per se...
It's a technique that supposedly helped reduce loading times at one point, Helldivers being the most notable example of a game removing this "optimization".
However, this is by design - specifically as an optimization. Can't really call that bloat in the parent's context of inefficient resource usage.
This was the reason in Helldivers; other games have different reasons - like uncompressed audio (which IIRC was the reason for the CoD install-size drama a couple of years back) - but the underlying reason is always the same: the dev team not caring about asset size (or, more likely, they would like to take care of it but are drowned in higher-priority tasks).
We aren't talking about the initial downloads though. We are talking about updates. I am like 80% sure you should be able to send what changed without sending the whole game as if you were downloading it for the first time.
The Helldivers engine does have that capability, where bundle patches only include modified files and markers for deleted files. However, the problem with that, and likely the reason Arrowhead doesn't use it, is the lack of a process on the target device to stitch them back together. Instead, patch files just sit next to the original file. So the trade-off for smaller downloads is a continuously growing size on disk.
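Conceptually, that "patch next to the original" layout behaves like a layered lookup. Here's a minimal Python sketch of the idea, with entirely made-up file naming (not the engine's actual format):

```python
import os

def resolve_asset(bundle_dir: str, name: str, num_patches: int):
    """Resolve an asset through patch layers: newest patch bundle first,
    falling back to the base bundle. Deletions are represented here by
    hypothetical tombstone marker files."""
    for i in range(num_patches, 0, -1):
        layer = os.path.join(bundle_dir, f"{name}.patch_{i}")
        if os.path.exists(layer + ".deleted"):
            return None               # asset removed by this patch layer
        if os.path.exists(layer):
            return layer              # newest patched version wins
    base = os.path.join(bundle_dir, name)
    return base if os.path.exists(base) else None
```

Downloads shrink because only the top layer ships, but every layer stays on disk until something merges them back into the base bundle - hence the growing install.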
From my understanding of the technique, you're wrong despite being 80% sure ;)
Any change to the code or textures needs the same preprocessing done again. A large patch is basically 1% actual changes + 99% re-shipped preprocessed data for this optimization.
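A toy model of that arithmetic (bundle contents and sizes are invented for illustration): one small asset edit drags every preprocessed bundle that contains it into the patch.

```python
def patch_size(bundles, changed_assets):
    """If any asset inside a preprocessed bundle changed, the whole bundle
    is rebuilt and re-shipped; untouched bundles are skipped."""
    return sum(size for assets, size in bundles if assets & changed_assets)

# Hypothetical per-level bundles, each duplicating shared assets.
bundles = [
    ({"ui.png", "tree.mesh", "rock.mesh"},    4_000_000_000),  # level 1, ~4 GB
    ({"ui.png", "rock.mesh", "water.shader"}, 5_000_000_000),  # level 2, ~5 GB
]

# A few-megabyte change to one mesh pulls ~9 GB of rebuilt bundles into the patch.
print(patch_size(bundles, {"rock.mesh"}))  # 9000000000
```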
How about incorporating postprocessing into the update procedure instead of preprocessing?
Generally "small patches" and "well-compressed assets" are on either end of a trade-off spectrum.
More compression means large change amplification and less delta-friendly changes.
More delta-friendly asset storage means storing assets in smaller units with less compression potential.
In theory, you could have the devs ship unpacked assets, then make the Steam client be responsible for packing after install, unpacking pre-patch, and then repacking game assets post-patch, but this basically gets you the worst of all worlds in terms of actual wall clock time to patch, and it'd be heavily constraining for developers.
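A quick way to see the change-amplification end of that spectrum, using toy data and a naive byte-level differ standing in for a real delta tool:

```python
import random
import zlib

random.seed(0)
# Hypothetical "asset": 1 MiB of low-entropy data so it actually compresses.
asset = bytes(random.getrandbits(2) for _ in range(1 << 20))

# A tiny localized edit: overwrite 64 bytes near the start.
patched = bytearray(asset)
patched[1000:1064] = bytes(random.getrandbits(2) for _ in range(64))
patched = bytes(patched)

def naive_delta(a: bytes, b: bytes) -> int:
    """Bytes a dumb byte-level differ would have to re-send."""
    return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))

# On the raw assets the delta is a few dozen bytes.
print("raw delta:       ", naive_delta(asset, patched))

# After solid compression the edit typically perturbs a large share of the
# compressed stream, so far more than a few dozen bytes have to be re-sent.
ca, cb = zlib.compress(asset, 9), zlib.compress(patched, 9)
print("compressed sizes:", len(ca), len(cb))
print("compressed delta:", naive_delta(ca, cb))
```

Real patchers are smarter than this byte-level comparison, but the underlying problem stands: the more aggressively assets are packed, the less a small source change looks small on the wire.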
Interesting, today I learned!
Do you have some resource for people outside this field to understand what it's about?
It goes all the way back to tapes, was still important for CDs, and is still thought relevant for HDDs.
Basically you can get much better read performance if you read everything sequentially, and you want to avoid random access at all costs. So you "hydrate" the loading patterns for each state, storing the bytes on disk in the order they're loaded by the game (see the sketch below). The only time this makes things slower is once, at download/install.
Of course the whole exercise is pointless if the game only ends up on an HDD because of its bigger size and would otherwise be on an NVMe SSD... And with 2 TB NVMe drives still being affordable, it doesn't make as much sense anymore.
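A rough sketch of the "hydration" idea described above, assuming hypothetical tooling that records the asset load order from an instrumented play-through:

```python
from pathlib import Path

def build_level_bundle(bundle_path, asset_dir, load_order):
    """Pack assets into one flat bundle in the exact order the game loads
    them, so loading a level becomes a single sequential read with no
    seeking. Assets shared between levels get duplicated into each
    level's bundle - that duplication is where the install size balloons."""
    index = {}                                   # asset name -> (offset, size)
    with open(bundle_path, "wb") as bundle:
        for name in load_order:
            data = Path(asset_dir, name).read_bytes()
            index[name] = (bundle.tell(), len(data))
            bundle.write(data)                   # append in recorded load order
    return index

# e.g. one bundle per level/state, each repeating the shared assets:
# build_level_bundle("level_01.bundle", "assets", ["ui.png", "tree.mesh", "rock.mesh"])
# build_level_bundle("level_02.bundle", "assets", ["ui.png", "rock.mesh", "water.shader"])
```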
So this basically leads to duplicating data for each state it's needed in? If that's the case, I wonder why this isn't solvable by compressing the update download data (potentially with knowledge of the data already installed, in case the update really only reshuffles it around).
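That is roughly what delta/patch compression does. A minimal sketch of "compress with knowledge of the installed data" using zlib's preset-dictionary feature (real patchers use proper binary-delta tools like bsdiff/xdelta or content-defined chunking; this only shows the idea):

```python
import zlib

WINDOW = 32 * 1024  # zlib preset dictionaries are capped at the 32 KiB DEFLATE window

def make_patch(installed: bytes, updated: bytes) -> bytes:
    """Compress the updated bundle against (a window of) the installed one,
    so data the client already has doesn't need to be re-sent in full."""
    c = zlib.compressobj(level=9, zdict=installed[-WINDOW:])
    return c.compress(updated) + c.flush()

def apply_patch(installed: bytes, patch: bytes) -> bytes:
    d = zlib.decompressobj(zdict=installed[-WINDOW:])
    return d.decompress(patch) + d.flush()

# patch = make_patch(installed_bytes, updated_bytes)      # server ships `patch`
# assert apply_patch(installed_bytes, patch) == updated_bytes
```

The catch is the one described upthread: the more aggressively the installed bundles are packed, the less recognisable overlap a delta step like this can exploit.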
It's also a valid consideration in the context of streaming games -- making sure that all resources for the first scene/chapter are downloaded first allows the player to begin playing while the rest of the resources are still downloading.