Comment by Xelbair

16 days ago

>These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

>We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.

>they did absolutely zero benchmarking beforehand, just went with industry haresay, and decided to double it just in case.

Nowhere in that does it say “we did zero benchmarking and just went with hearsay”. Basing things on industry data is solid - looking at the Steam hardware surveys is a good way to figure out the variety of hardware in use without commissioning your own reports. Tech choices are no different.

Do you benchmark every single decision you make, on every system, on every project you work on? Do you check that a Redis operation is actually O(1), or do you rely on hearsay? Do you benchmark every single SQL query, every DTO, the overhead of the DI framework, the connection pooler, the JSON serializer, the log formatter? Do you ever rely on your own knowledge without verifying the assumptions? Of course you do - you're human, and we have to make some baseline assumptions, and sometimes they're wrong.

They made a decision based on existing data. That isn't as unreasonable as you're pretending, especially as PC hardware can be quite diverse.

You would be surprised what some people are playing games on. For example, I know people who still use Windows 7 on an AMD Bulldozer rig. Atypical for sure, but not unheard of.

  • I believe it. Hell, I've been in F500 companies and virtually all of them had some legacy XP / Server 2000 / ancient Solaris box in there.

    Old stuff is common, and doubly so for a lot of the world, which ain't rich and ain't rockin' new hardware.

    • My PC is now 6 years old and I have no intention of upgrading it soon. My laptop is like 8 years old and it is fine for what I use it for. My monitors are like 10-12 years old (they are early 4K monitors) and they are still good enough. I am primarily using Linux now and the machine will probably last me to 2030 if not longer.

      Pretending this was an outrageous decision ignores that the data and the commonly assumed wisdom said there were still a lot of people using HDDs.

      They've since rectified this particular issue, and yet there seems to be even more criticism of the company after fixing it.

>they did absolutely zero benchmarking beforehand, just went with industry haresay, a

https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence

It was a real issue in the past with hard drives and small media assets. It's still a real issue even with SSDs. HDD/SSD IOPS are still way slower than contiguous reads when you're dealing with a massive number of files.

At the end of the day it requires testing, which requires time, at a point when you don't have a lot of it.

  • This is not a good invocation of Chesterton's Fence.

    The Fence is a parable about understanding something that already exists before asking to remove it. If you cannot explain why it exists, you shouldn't ask to remove it.

    In this case, it wasn't something that already existed in their game. It was something that they read, then followed (without truly understanding whether it applied to their game), and upon re-testing some time later, realized it wasn't needed and caused detrimental side-effects. So it's not Chesterton's Fence.

    You could argue they followed a videogame industry practice to make a new product, which is reasonable. They just didn't question or test their assumptions that they were within the parameters of said industry practice.

    I don't think it's a terrible sin, mind you. We all take shortcuts sometimes.

  • It's not an issue with asynchronous filesystem IO. Again, async file IO should be the default for game engines. It doesn't take a genius to gather a list of assets to load and then wait for the whole list to finish rather than blocking on every tiny file.
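
    A minimal sketch of that gather-then-wait pattern, in Python with a standard thread pool standing in for a game engine's IO scheduler (the asset directory and file extension here are hypothetical):

    ```python
    import concurrent.futures
    from pathlib import Path

    def read_asset(path: Path) -> bytes:
        # One small blocking read; issued concurrently below instead of serially.
        return path.read_bytes()

    def load_assets(paths: list[Path]) -> dict[Path, bytes]:
        # Submit every read up front, then wait once for the whole batch,
        # instead of blocking the loading thread on each tiny file in turn.
        with concurrent.futures.ThreadPoolExecutor(max_workers=32) as pool:
            futures = {pool.submit(read_asset, p): p for p in paths}
            return {futures[f]: f.result()
                    for f in concurrent.futures.as_completed(futures)}

    # Hypothetical manifest; a real engine would know its asset list up front.
    assets = load_assets(sorted(Path("assets").rglob("*.dds")))
    ```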

    • There are two different things going on here: application behavior versus disk behavior.

      >wait for the whole list to finish rather than blocking on every tiny file.

      And this is the point. I can make a test that shows exactly what's going on here. Make a random file generator that generates 100,000 4 KB files. Now write them to a hard drive with other data and activity going on at the same time. Now, in another run of the program, have it generate 100,000 4 KB files and put them in a zip.

      Now, read the set of 100k files from disk and at the same time read the 100k files in a zip....

      One finishes in less than a second and one takes anywhere from a few seconds to a few minutes depending on your disk speeds.
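
      A rough sketch of that experiment in Python (the file count is scaled down here, and a warm OS page cache or a fast SSD will shrink the gap compared to a cold mechanical HDD):

      ```python
      import os, time, zipfile
      from pathlib import Path

      N, SIZE = 10_000, 4096                      # scaled down from 100,000 files
      payload = os.urandom(SIZE)

      loose_dir = Path("loose"); loose_dir.mkdir(exist_ok=True)
      for i in range(N):                          # many tiny files scattered on disk
          (loose_dir / f"{i}.bin").write_bytes(payload)

      with zipfile.ZipFile("packed.zip", "w", zipfile.ZIP_STORED) as zf:
          for i in range(N):                      # same data, one contiguous archive
              zf.writestr(f"{i}.bin", payload)

      t0 = time.perf_counter()
      for i in range(N):                          # one open/seek/read per tiny file
          (loose_dir / f"{i}.bin").read_bytes()
      t_loose = time.perf_counter() - t0

      t0 = time.perf_counter()
      with zipfile.ZipFile("packed.zip") as zf:   # sequential reads from one file
          for name in zf.namelist():
              zf.read(name)
      t_zip = time.perf_counter() - t0

      print(f"loose files: {t_loose:.2f}s   zip archive: {t_zip:.2f}s")
      ```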

"Industry hearsay" in this case was probably Sony telling game devs how awesome the PS5's custom SSD was gonna be, and nobody bothered to check their claims.

  • What are you talking about?

    This has nothing to do with consoles, and only affects PC builds of the game.

    • HD2 started as a PlayStation exclusive, and was retargeted mid-development for a simultaneous release.

      So the PS5's SSD architecture was what developers were familiar with when they tried to figure out what changes would be needed to make the game work on PC.
