Comment by snet0

24 days ago

> With their latest data measurements specific to the game, the developers have confirmed the small number of players (11% last week) using mechanical hard drives will witness mission load times increase by only a few seconds in worst cases. Additionally, the post reads, “the majority of the loading time in Helldivers 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time.”
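If the quoted explanation holds, the effect is easy to see with a toy model: when asset streaming and level generation run in parallel, the total load time is roughly the longer of the two, so even a large I/O slowdown only shows up once it exceeds the generation time. A minimal sketch with made-up timings (illustrative assumptions, not the game's real figures):

```python
import concurrent.futures
import time

def load_assets(seconds):
    """Stand-in for streaming mission assets from disk."""
    time.sleep(seconds)

def generate_level(seconds):
    """Stand-in for procedural level generation on the CPU."""
    time.sleep(seconds)

def total_load_time(disk_s, gen_s):
    """Both jobs run in parallel, so the total is roughly max(disk_s, gen_s)."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(load_assets, disk_s), pool.submit(generate_level, gen_s)]
        for f in futures:
            f.result()
    return time.perf_counter() - start

# Made-up timings: an SSD-ish disk vs. a 5x slower HDD-ish disk, with generation dominating.
print(f"SSD-ish: {total_load_time(disk_s=2, gen_s=12):.1f} s")   # ~12 s
print(f"HDD-ish: {total_load_time(disk_s=10, gen_s=12):.1f} s")  # ~12 s: the 5x I/O hit is hidden
```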

It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!

I expect it's a story that'll never get told in enough detail to satisfy curiosity, but it certainly seems strange from the outside for this optimisation to be both possible and acceptable.

> It seems bizarre to me that they'd have accepted such a high cost

They’re not the ones bearing the cost. Customers are. And I’d wager very few check the hard disk requirements for a game before buying it. So the effect on their bottom line is negligible while the dev effort to fix it has a cost… so it remains unfixed until someone with pride in their work finally carves out the time to do it.

If they were on the hook for 150GB of cloud storage per player this would have been solved immediately.
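For a sense of scale on that hypothetical, here's a back-of-envelope sketch; the player count and the ~$0.02/GB-month rate are assumptions picked purely for illustration, not real figures:

```python
# All numbers here are assumptions for illustration (player count, $/GB-month rate),
# not Arrowhead's or any cloud provider's actual figures.
def monthly_storage_cost(install_gb, players, usd_per_gb_month=0.02):
    """What it would cost if the developer paid for every player's install as cloud storage."""
    return install_gb * players * usd_per_gb_month

for size_gb in (154, 23):
    cost = monthly_storage_cost(size_gb, players=1_000_000)
    print(f"{size_gb:>3} GB x 1M players: ~${cost:,.0f}/month")
# 154 GB -> ~$3,080,000/month; 23 GB -> ~$460,000/month at the assumed rate.
```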

  • The problem they fixed is that they removed a common optimization to get 5x faster loading speeds on HDDs.

    That's why they did the performance analysis and referred to their telemetry before pushing the fix. The impact is minimal because their game is already spending an equivalent time doing other loading work, and the 5x I/O slowdown only affects 11% of players (perhaps less now that the game fits on a cheap consumer SSD).

    If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.

    • > The problem they fixed is that they removed a common optimization to get 5x faster loading speeds on HDDs.

Not what happened. They removed an optimization that in *some other games*, not their own, gave a 5x speed boost.

And they are changing it now because it turned out all of that was bogus: the speed boost wasn't as high for loading the data itself, and a good part of the level load wasn't even waiting on the disk, but on terrain generation.

      1 reply →

    • > If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.

And what about the others, who wish a single game didn't waste 130GB of their disk space? Is it fine to ignore their opinions?

      They used up a ton more disk space to apply an ill-advised optimization that didn't have much effect. I don't really understand why you'd consider that a positive thing.

      10 replies →

• 23 GiB can be cached entirely in RAM on higher-end gaming rigs these days. 154 GiB probably does not fit into many players' RAM when you still want something left for the OS and the game. Reducing how much needs to be loaded from slow storage is itself an I/O speedup, and HDDs are not so bad at seeking that you need to go to extreme lengths to avoid it entirely. The only place where such duplication to ensure linear reads may be warranted is optical media.

      15 replies →

    • According to the post, "the change in the file size will result in minimal changes to load times - seconds at most."

      It didn't help their game load noticeably faster. They just hadn't checked if the optimization actually helped.

      9 replies →

• If this is a common issue in the industry, why don't game devs make a user-visible slider to control dedup?

      I have friends who play one or two games and want them to load fast. Others have dozens and want storage space.

      3 replies →

    • > The problem they fixed is that they removed a common optimization to get 5x faster loading speeds on HDDs.

      Maybe, kinda, sorta, on some games, on some spinning rust hard disks, if you held your hand just right and the Moon was real close to the cusp.

      If you're still using spinning rust in a PC that you attempt to run modern software on, please drop me a message. I'll send you a tenner so you can buy yourself an SSD.

      3 replies →

    • If 5x faster refers to a difference of "a few seconds" as the article says, then perhaps 5x (relative improvement) is the wrong optimization metric versus "a few seconds" (absolute improvement).

    • I think we should remember the context here.

They're using the outdated Stingray engine, which was designed for the days of single- or dual-core computers with spinning disks. They developed their game with this target in mind.

      Mind you, spinning disks are not only a lot more rare today but also much faster than when Stingray 1.0 was released. Something like 3-4x faster.

      The game was never a loading hog and I imagine by the time they launched and realized how big this install would be, the technical debt was too much. The monetary cost of labor hours to undo this must have been significant, so they took the financial decision of "We'll keep getting away with it until we can't."

The community finally got fed up. The SteamDB chart keeps inching lower and lower, and I think they finally got worried enough about permanently losing players that they conceded and did this, hoping to get those players back and to avoid a further exodus.

And let's say this game is now much worse on spinning disks. At the end of the day, AH will choose profit. If they lose the 10% of spinning-disk players who won't tolerate a few extra seconds, the change still pleases the other players, thus making sure the game lives on.

Lastly, this is how it's delivered on consoles, many of which use spinning media. So it's hard to see this as problematic. I'm guessing that for consoles MS and Sony said no to a 150GB install, so AH was invested in keeping it small. They were forced to build the console version without this extra data. The time and money there was worth it for them. For PC, there's no one to say no, so they did the cheapest thing they could until they no longer could.

This is one of the downsides of open platforms. There's no 'parent' to yell at you, so you do what you want. It's the classic walled garden vs. open bazaar type of thing.

    • Eh? Hard drives for gaming and high-end workstations are thoroughly obsolete. SSDs are not optional when playing any triple-A game. It's kinda wild to see people still complaining about this.

• It is a trade-off. The game was developed on a discontinued engine; it has had numerous problems with balance and performance, and IMO there were generally far more important bugs. Super Helldive difficulty wasn't available because of performance issues.

I've racked up 700 hours in the game, and I didn't care about the storage requirements.

  • > They’re not the ones bearing the cost.

    I'm not sure that's necessarily true... Customers have limited space for games; it's a lot easier to justify keeping a 23GB game around for occasional play than it is for a 154GB game, so they likely lost some small fraction of their playerbase they could have retained.

  • > I’d wager very few check the hard disk requirements

I have to check. Your assumption is correct. I am one of the very few.

I don't know the numbers (I'm going to check in a second), but I'm wondering whether the suppliers (publishers, or whoever is pinning the price) haven't screwed up big time by driving prices and requirements up without thinking about the potential customers they are going to scare away for good. In theory, I have to assume their sales teams account for this, but I've seen so much dumb shit in practice over the past 10 years that I have serious doubts most of these suits are worth anything at all, given that grown-up working-class kids (with 400+ hours of overtime per year, 1.3 kids on average and roughly -0.5 books and news articles read per unit of time) can come up with the same big-tech, big-media, economic and political agendas that have been in practice in both parts of the world for the better part of our lives, if you play "game master" for half a weekend and become best friends with all the kiosks in your proximity.

    > the effect on their bottom line is negligible

    Is it, though? My bold, exaggerated assumption is that they would have had 10% more sales AND players.

And the thing is that at any point when I, and a few people I know, had the time and desire to play, we would have had to either clean up our drives or invest the game price plus an SSD price for about 100 hours of fun spread over months. We would gladly have gotten hooked, but no industry promises can justify even more effort on top of what enough of us already put in at work. As a result, at least 5 buyers and players lost, and at work and elsewhere you hear, "yeah, I would, if I had some guys to play with" ...

    • I do not think the initial decision-making process was "hey, screw working-class people... let's have a 120GB install size on PC."

      My best recollection is that the PC install size was a lot more reasonable at launch. It just crept up over time as they added more content over the last ~2 years.

      Should they have addressed this problem sooner? Yes.

  • Gamers are quite vocal about such things, people end up hearing about it even if they don’t check directly.

    And this being primarily a live-service game drawing revenues from micro-transactions, especially a while after launch, and the fact that base console drives are still quite small to encourage an upgrade (does this change apply to consoles too?), there’s probably quite an incentive to make it easy for users to keep the game installed.

• Studios store a lot of builds for a lot of different reasons. And generally speaking, in AAA I see PlayStation being the biggest pig, so I would wager their PS builds are at least the same size if not larger. People knew and probably raised alarms that fell by the wayside, because it's easier/cheaper to throw money at storage solutions than at engineering.

    • I only skimmed through this; I have no real information on the particular game, but I think the console versions could be much smaller as less duplication is necessary when the hardware is uniform.

• I mean, it's not really a cost to anyone. Bandwidth is paid for by Valve, games can be deleted locally, etc.

• Taking up 500% of the necessary space is a cost to me. I pay for my storage; why would I want it wasted by developer apathy?

      I'm already disillusioned and basically done with these devs anyways. They've consistently gone the wrong direction over time. The game's golden age is far behind us, as far as I'm concerned.

      6 replies →

• Which goes to show that they don't care about the user, only about the user's money.

    • No - because most users also don't check install size on games, and unlike renting overpriced storage from a cloud provider, users paid a fixed price for storage up front and aren't getting price gouged nearly as badly. So it's a trade that makes sense.

      Both entrants in the market are telling you that "install size isn't that important".

      If you asked the player base of this game whether they'd prefer a smaller size, or more content - the vast majority would vote content.

If anything, I'd wager this decision was still driven by internal goals for the company, because producing a 154GB artifact and storing it for things like CI/CD are still quite expensive if you have a decent number of builds/engineers. Both in time and money.

      4 replies →

I started my career as a software performance engineer. We measured everything across different code implementations, multiple OSes and hardware systems, and various network configurations.

It was amazing how often people wanted to optimize stuff that wasn't a bottleneck in overall performance. Real bottlenecks were often easy to see when you measured and usually simple to fix.

But it was also tough work in the org. It was tedious, time-consuming, and involved a lot of experimental comp sci work. Plus, it was a cost center (teams had to give up some of their budget for perf engineering support) and even though we had racks and racks of gear for building and testing end-to-end systems, what most dev teams wanted from us was to give them all our scripts and measurement tools to "do it themselves" so they didn't have to give up the budget.

• That sounds like fascinating work, but also kind of a case study in why a manager's role is to "clear the road" and handle the lion's share of that internal advocacy and politicking so that ICs don't have to deal with it.

• It's because patting yourself on the back for getting a 5x performance increase in a microbenchmark feels good and looks good on the yearly review.

    > But it was also tough work in the org. It was tedious, time-consuming, and involved a lot of experimental comp sci work. Plus, it was a cost center (teams had to give up some of their budget for perf engineering support) and even though we had racks and racks of gear for building and testing end-to-end systems, what most dev teams wanted from us was to give them all our scripts and measurement tools to "do it themselves" so they didn't have to give up the budget.

Misaligned budgeting and goals are the bane of good engineering. I've seen some absolutely stupid stuff, like a client outsourcing the hosting of a simple site to us, because they would rather hire a third party to buy a domain and put a simple site there (some advertising) than deal with their own security guys and host it on their own infrastructure.

    "It's a cost center" "So is fucking HR, why you don't fire them ?" "Uh, I'll ignore that, pls just invoice anything you do to other teams" ... "Hey, they bought cloud solution that doesn't work/they can't figure it out, can you help them" "But we HAVE stuff doing that cheaper and easier, why they didn't come to us" "Oh they thought cloud will be cheaper and just work after 5 min setup"

  • In an online services company, a perf team can be net profitable rather than a "cost center." The one at my work routinely finds quantifiable savings that more than justify their cost.

    There will be huge mistakes occasionally, but mostly it is death by a thousand cuts -- it's easy to commit a 0.1% regression here or there, and there are hundreds of other engineers per performance engineer. Clawing back those 0.1% losses a couple times per week over a large deployed fleet is worthwhile.

11% still play HD2 with a spinning drive? I would've never guessed that. There's probably some vicious circle thing going on: because the install size is so big, people need to install it on their secondary, spinning drive...

  • Even though I have two SSDs in my main machine I still use a hard drive as an overflow for games that I judge are not SSD worthy.

Because it's a recent 20TB HDD, read speeds approach 250MB/s, and I've also specifically partitioned the beginning of the disk just for games so that it can sustain full transfer speeds without files falling onto the slower tracks. The rest of the disk is partitioned for media files that won't care much about the speed loss. It's honestly fine for the vast majority of games.

• It is no surprise to me that people still have to use HDDs for storage. SSDs stopped getting bigger a decade-plus ago.

    SSD sizes are still only equal to the HDD sizes available and common in 2010 (a couple of TB). SSD size increases (availability and price decreases) for consumer form factors have entirely stopped. There is no more progress for SSDs because quad-level cells are as far as the charge-trap tech can be pushed, and most people no longer own computers. They have tablets or phones, or if they have a laptop it has 256GB of storage and everything is done in the cloud or with an octopus of (small) externals.

    • SSDs did not "stop getting bigger a decade plus ago." The largest SSD announced in 2015 was 16TB. You can get 128-256TB SSDs today.

      You can buy 16-32TB consumer SSDs on NewEgg today. Or 8TB in M.2 form factor. In 2015, the largest M.2 SSDs were like 1TB. That's merely a decade. At a decade "plus," SSDs were tiny as recently as 15 years ago.

      4 replies →

• I bought 4x the storage (1TB -> 4TB) for half the price after my SSD died after 5 years (thanks Samsung); what do you mean they 'stopped getting bigger'?

Sure, there are some limitations of the format, you can only shove so many chips onto an M.2 stick, but you can get U.2 ones that are bigger than the biggest HDD (though the price is pretty eye-watering).

      2 replies →

• I read that SSDs aren't actually guaranteed to keep your data if powered off for an extended period of time, so I still do my backups on HDDs. Someone please correct me if this is wrong.

      3 replies →

I don't find it surprising at all. A ton of developers do optimizations based on vibes and very rarely check if they're actually getting a real benefit from it.

  • This is the moral behind "premature optimization is the root of all evil" - you could say preconceived just as easily.

    • > you could say preconceived just as easily

      Would have saved us from all the people who reject any sort of optimization work because for them it is always "too early" since some product team wanted their new feature in production yesterday, and users waiting 5 seconds for a page load isn't considered bad enough just yet.

      5 replies →

• Counterpoint: data-driven development often leads to optimizations like this not being made, because the developers aren't the ones affected; their customers are. And the software market is weird this way: few barriers to entry, yet almost nothing is a commodity, so there's no competitive pressure to help here either.

• Honestly, looking back over time, I think that phrase did more harm than good.

Yes, of course you shouldn't optimize before you get your critical path stable and benchmark which parts take too long.

But many, many times it is used as an excuse to delay optimisation so far that it becomes hard to do, because it would require rewriting parts that "work just fine", or it is skipped because the slowness is just at a tolerable level.

I have a feeling that spending 10-20% more time on a piece of code, to at least glance at whether it couldn't be more optimal, would pay for itself very quickly compared to a bigger rewrite months after the code was written.

> I expect it's a story that'll never get told in enough detail to satisfy curiosity, but it certainly seems strange from the outside for this optimisation to be both possible and acceptable.

From a technical perspective, the key thing to know is that the console install size for HD2 was always that small -- their build process assumed SSD on console so it didn't duplicate stuff.

154GB was the product of massive asset duplication, as opposed to 23GB being the product of an optimization miracle. :)
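To make the duplication point concrete, here's a toy model (the asset names, sizes, and level count are all invented): if every mission bundle carries its own copy of the shared assets so an HDD can read each bundle sequentially, the install size scales with the number of bundles rather than with the amount of unique content.

```python
# Toy model of the trade-off; asset names, sizes, and level count are invented.
# Duplicating shared assets into every mission bundle keeps HDD reads sequential,
# but the install size then scales with the number of bundles, not with unique content.
shared_assets_gb = {"terrain_textures": 2.0, "common_audio": 1.5, "enemy_models": 3.0}
unique_per_level_gb = 0.5
levels = 20

duplicated = levels * (sum(shared_assets_gb.values()) + unique_per_level_gb)
deduplicated = sum(shared_assets_gb.values()) + levels * unique_per_level_gb

print(f"per-level bundles (shared assets duplicated): {duplicated:.0f} GB")    # 140 GB
print(f"shared asset pool (deduplicated):             {deduplicated:.1f} GB")  # 16.5 GB
```

The numbers are chosen only to land in the same ballpark as 154GB vs 23GB; the real split between shared and unique data isn't public.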

How did it get so bad on PC?

Well, it wasn't always so crazy. I remember it being reasonable closer to launch (almost 2 years ago) and more like ~40-60GB. Since then, the devs have been busy. There has been a LOT of reworking and a lot of new content, and the PC install size grew gradually rather than suddenly.

This was probably impacted to some extent by the discontinued game engine they're using. Bitsquid/Stingray was discontinued partway through HD2 development and they continued on with it rather than restarting production entirely.

https://en.wikipedia.org/wiki/Bitsquid

> It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!

You should look at COD install sizes and their almost-weekly, ridiculously huge "updates". 150GB for a first install is almost generous compared to most AAA games.

Game companies these days barely optimize engine graphical performance before release, never mind the package size or patching speed. They just stamp higher minimum system requirements on the package.

From a business perspective the disk footprint is only a high cost if it results in fewer sales, which I doubt it does to any significant degree. It is wasteful, but I can see why optimization efforts would get focused elsewhere.

• I think certain games don't even bother to optimize the install size so that you can't fit other games on the hard drive. I think COD games are regularly hundreds of gigs.

    • Having a humongous game might be a competitive advantage in the era of live-service games.

      Users might be more hesitant to switch to another game if it means uninstalling yours and reinstalling is a big pain in the backside due to long download times.

      2 replies →

• I've often seen people mention that one reason for games like Call of Duty being so enormous is optimising for performance over storage. You'd rather decompress textures/audio files at install time than at run time, because you download/install so infrequently.

      1 reply →

    • > I think COD games are regularly hundreds of gigs

      I looked up the size of the latest one, and Sony's investment in RAD Kraken seems to be paying dividends:

      Xbox: 214 GB

      PC: 162 GB

      PS5: 96 GB

      1 reply →

  • Also the cost is often offloaded to the "greedy" Valve... So there is less pressure to optimize their own CDN use.

    • Yeah, I don't think any of the stores charge developers in proportion to how much bandwidth they use. If that changed then the priorities could shift pretty quickly.

      Publishers do have to care somewhat on the Switch since Nintendo does charge them more for higher capacity physical carts, but a lot of the time they just sidestep that by only putting part (or none) of the game on the cart and requiring the player to download the rest.

      1 reply →

  • It might but they have no way of measuring it so they won't take it into account.

  • Given how many Steam games are bought but never even installed, it would seem not terribly worth optimizing for.

On phones, I bet you see more effort.

• Both things are sort of true. It's not sales where size can hurt you but retention, which is why it tended to matter more on phones. When you need space on your device, the apps are listed from largest to smallest.

On both phones and PCs storage has just grown, so it's less of an issue. The one thing I have noticed is that Apple does its price tiering around storage, so you pay an absurd amount for an extra 128GB. The ultra-competitive Chinese phone market crams high-end phones with a ton of storage and battery. So some popular Chinese phone games are huge compared to ones made for the iPhone.

I'd bet any amount of money that a demo ran slow on one stakeholder's computer, who happened to have a mechanical hard drive; they attributed the slowness to the hard drive without a real investigation, and optimizing for mechanical hard drive performance became standard practice. The demo may not have even been for this game; just a case of once bitten, twice shy.

IIRC this has been the “done thing” forever. I’m not in game development, but I think I recall hearing about it in the Xbox 360 era. Conventional options are picked by default, benchmarks are needed to overturn that. Looking at my hard drive, massive game installations are still very much the industry standard…

I have heard that in many scenarios it is faster to load uncompressed assets directly than to load and decompress them. Load time is prioritized over hard drive space, so you end up with the current situation.
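Whether that actually holds depends on the hardware: on a fast NVMe drive the extra bytes can cost more than the decompression would have, while on a slow HDD the smaller read can win. A rough sketch of how you might sanity-check it (zlib is just a stand-in codec here, and a fair test needs cold OS caches and real asset data):

```python
# Rough sketch, not a rigorous benchmark: whether "read less, then decompress" beats
# "read more, already uncompressed" depends on disk speed, compression ratio, and
# decompressor throughput. zlib is a stand-in codec (real engines tend to use faster
# ones like LZ4 or Oodle), and a fair comparison needs cold OS caches and real assets.
import os
import tempfile
import time
import zlib

payload = (b"fairly repetitive placeholder asset data " * 4096) * 60  # ~10 MB, compresses well
raw_path = os.path.join(tempfile.gettempdir(), "asset_raw.bin")
zip_path = os.path.join(tempfile.gettempdir(), "asset_zlib.bin")
with open(raw_path, "wb") as f:
    f.write(payload)
with open(zip_path, "wb") as f:
    f.write(zlib.compress(payload, 6))

def timed_ms(fn):
    start = time.perf_counter()
    fn()
    return (time.perf_counter() - start) * 1000

t_raw = timed_ms(lambda: open(raw_path, "rb").read())
t_zip = timed_ms(lambda: zlib.decompress(open(zip_path, "rb").read()))
print(f"uncompressed read: {t_raw:.1f} ms, compressed read + decompress: {t_zip:.1f} ms")
```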

• You need very fast decompression for that to work these days, when I/O speeds are so high, and decompression takes compute that is being used for game logic.

    Very fast decompression often means low compression or multicore. I have seen libjpeg-turbo vastly outperform raw disk reads, though.

  • There have been plenty of times where the opposite is true: Storing highly compressed data and decompressing it in RAM is much faster than loading uncompressed assets.

    Which is the primary problem: Computers are so varied and so ever changing that if you are optimizing without hard data from your target hardware, you aren't optimizing, you are just doing random shit.

Add to that, game devs are sometimes just dumb. Titanfall 1 came with tens of gigabytes of uncompressed audio, for "performance", which is horse shit. Also, it turns out they might have been lying entirely. Titanfall 1 was made on the Source engine, which does not support the OGG audio format their audio files were in. So they decompressed them at install time. They could have just used a different audio file format.

High cost to whom, though? We see the same thing when it comes to RAM and CPU usage: the developer is not the one paying for the hardware, and many gamers have shown that they will spend money on hardware to play a game they want.

Sure, they may lose some sales, but I have never seen many numbers on how much it really impacts sales.

Also, on the disk side, I can't say I have ever looked at how much space is required for a game before buying it. If I need to clear out some stuff, I will, especially with it not being uncommon for a game to be in the 100GB realm already.

That all being said, I am actually surprised by the 11% using mechanical hard drives. I figured that NVMe would be a lower percentage and many would be using SSDs... but I figured the percentage of machines capable of running modern games in the first place that still use mechanical drives would be far lower.

I do wonder how long it will be until we see games just saying they are not compatible with mechanical drives.

• That already happened :) Starfield claimed to not support HDDs and really ran bad with them. And I think I've seen SSDs as a requirement for a few other games now, in the requirement listings on Steam.

    • > Starfield claimed to not support HDDs and really ran bad with them.

To be fair, at launch Starfield had pretty shit loading times even with blazing fast SSDs, and the game has a lot of loading screens, so it makes sense they'd nip that one in the bud and just say it's unsupported on the slower type of disk.

• The latest Ratchet and Clank game relies heavily on the PS5's NVMe drive. Its PC port states that an SSD is required. And IIRC, the experience on mechanical drives is terrible, to the point of being unplayable.

All of that takes time, and you never have enough time.

At any given point, if it wasn't vital to shipping and wasn't immediately breaking, it could be backburnered.

Messing with asset loading is probably a surefire way to risk bugs and crashes, so I suspect this was mostly waiting on proving the change didn't break everything (and Helldivers has had a lot of seemingly small changes break other random things).

The game is released on both PC and PS5, the latter of which was designed (and marketed) to take advantage of SSD speeds for streaming game content near real time.

The latest Ratchet and Clank, the poster child used in part to advertise the SSD speed advantage, suffers on traditional hard drives as well in the PC port. Returnal is in the same boat. Both were originally PS5 exclusives.
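For rough context on why those ports assume an SSD, here's a back-of-envelope on sequential streaming alone; the drive rates and the 8GB-per-level figure are ballpark assumptions, and random access makes HDDs look far worse than this:

```python
# Ballpark sequential-read rates only (MB/s); real-world numbers vary a lot, and
# HDDs fare far worse than this once reads stop being sequential.
rates_mb_s = {
    "PS5 internal NVMe (gen 4)": 5500,
    "typical SATA SSD": 550,
    "recent 7200rpm HDD": 250,
}

level_data_mb = 8 * 1024  # assume a hypothetical 8 GB of assets streamed per level

for drive, rate in rates_mb_s.items():
    print(f"{drive}: {level_data_mb / rate:.1f} s to stream 8 GB")
# PS5 NVMe: ~1.5 s, SATA SSD: ~14.9 s, HDD: ~32.8 s
```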

• Noting in particular that the PS5's internal storage isn't just "an SSD"; it's a PCIe gen 4 drive that can do sequential reads at up to 5500 MB/s.

    By comparison, a SATA III port caps out at 6 Gbps (roughly 600 MB/s of usable throughput), and first-generation NVMe drives (PCIe gen 3) were limited to about 3500 MB/s.

• The throughput is one thing, but seek times are orders of magnitude different too.

      An SSD on SATA is still "not bad" for most games, but an HDD can be awful if the game doesn't do mostly sequential reads.

      1 reply →

• HDD performance suffers very much during the portal loading sequences in Ratchet and Clank, but even an entry-level SSD performs fine, with little visible difference compared to the PS5 one. It's more about random access speed than pure throughput.

• I played Rift Apart from an HDD, and apart from extra loading time during looped animations it was fine. On the other hand, Indiana Jones and the Great Circle was barely playable, with textures and models popping in everywhere.

Optimizing for disk space is very low on the priority list for pretty much every game, and this makes sense since it's very low on the list of customer concerns relative to things like in-game performance, netcode, tweaking game mechanics, balancing, etc.

• Apparently, in-game performance is not more important than pretty visuals. But that's based on hearsay / what I remember reading ages ago; I have no recent sources. The tl;dr was that apparently enough people are OK with a 30 fps game if the visuals are good.

    I believe this led to a huge wave of 'laziness' in game development, where frame rate wasn't too high on the list of requirements. And it ended up in some games where neither graphics fidelity nor frame rate was a priority (one of the recent Pokemon games... which is really disappointing for one of the biggest multimedia franchises of all time).

    • That used to be the case, but this current generation the vast majority of games have a 60 fps performance mode. On PS5 at least, I can't speak about other consoles.

A one-time cost of a big download is something customers have shown time and again that they're willing to bear. Remember that software is optimized for ROI first and all other things second. Sometimes optimizing for ROI means "ship it today and let the first week of sales pay salaries while we fix it"; sometimes it means picking between getting the file size down, getting that new feature out, and fixing that game-breaking edge-case bug. Everything you do represents several things you choose not to do.

It’s the same sort of apathy/arrogance that made new Windows versions run like dogshit on old machines. Gates really should have had stock in PC makers. He sold enough of them.

I don’t think it’s always still the case but for more than ten years every release of OSX ran better on old hardware, not worse.

Some people think the problem was MS investing too eagerly into upgrading developer machines routinely, giving them a false sense of what “fast enough” looked like. But the public rhetoric was so dismissive that I find that pretty unlikely. They just didn’t care. Institutionally.

I’m not really into the idea of Helldivers in the first place but I’m not going to install a 150GB game this side of 2040. That’s just fucking stupid.

It's not a cost to them. The cost is paid by consumers and platforms.

Also, if the goal was to improve things for a small minority, they could've just split it off into a free DLC, like some games do with 4K texture packs.

  • It would be ironic if incidents like this made Valve start charging companies for large file sizes of their games. It would go to show that good things get abused to no end if limits aren't set.

You missed the most bizarre quote:

> These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns

Unfortunately it's not only game development; all of modern society seems to operate like this.

Twenty years ago I bought a 1TB hard drive... It wasn't very expensive either.

Twenty years on, and somehow that's still 'big'.

Computing progress disappoints me.

• They cite some kind of physical limit, but I think it's really market manipulation meant to push everyone toward the cloud, centralization, and control.

I think smaller game sizes would hurt sales. Your first thought on a 23GB game when other games are 100-plus is: why is there so little content?