
Comment by mort96

16 days ago

The negativity towards this is wild. A company followed relatively widely accepted industry practice (lots and lots of other games also have huge sizes on disk for the exact same reason), then eventually they decided to do their own independent testing to check whether said common practice actually makes things better or not in their case, found that it didn't, so they reversed it. In addition, they wrote up some nice technical articles on the topic, helping to change the old accepted industry wisdom.

This seems great to me. Am I crazy? This feels like it should be Hacker News's bread and butter, articles about "we moved away from Kubernetes/microservices/node.js/serverless/React because we did our own investigation and found that the upsides aren't worth the downsides" tend to do really well here. How is this received so differently?

Arrowhead probably deserves more love for breaking the norm, but I think it's overshadowed by people finding out for the first time that the reason HDDs are so common in gaming setups is that companies have been blindly shaving a few seconds off HDD load times at the cost of 7x the disk space.

If it had been better known beforehand that this was the cause of game bloat, this probably would have been better received. Still, Arrowhead deserves credit both for testing and breaking the norm and for making it a popular topic.

  • Part of what makes this outrageous is that the install size itself is probably a significant part of the reason to install the game on an HDD.

    154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.

    Is there a name for the solution to a problem (make size big to help when installed on HDD) in fact being the cause of the problem (game installed on HDD because big) in the first place?

    • > 154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.

      I think War Thunder did it the best:

        * Minimal client 23 GB
        * Full client 64 GB
        * Ultra HQ ground models 113 GB
        * Ultra HQ aircraft 92 GB
        * Full Ultra HQ 131 GB
      

      For example, I will never need anything more than the full client, whereas if I want to play on a laptop, I won't really need more than the minimal client (limited textures and no interiors for planes).

      The fact that this isn't commonplace in every engine and game out there is crazy. There's no reason why the same approach couldn't also work for DLCs and such. And there's no reason why this couldn't be made easy in every game engine out there (e.g. LOD level 0 goes to HQ content bundle, the lower ones go into the main package). Same for custom packages for like HDDs and such.
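
      As a rough sketch of how simple the "LOD level 0 goes to the HQ bundle" routing could be in a build step (purely illustrative Python - the asset list and the LOD numbers are made up, not any real engine's data):

        from collections import defaultdict

        # Hypothetical asset records: (name, lod_level), where level 0 is the highest detail.
        ASSETS = [
            ("tank_hull_lod0.tex", 0), ("tank_hull_lod2.tex", 2),
            ("cockpit_interior.mesh", 0), ("terrain_far.mesh", 3),
        ]

        def split_into_bundles(assets, hq_cutoff=0):
            """Route LOD 0 assets into an optional HQ bundle, everything else into the base install."""
            bundles = defaultdict(list)
            for name, lod in assets:
                bundles["optional_hq" if lod <= hq_cutoff else "base"].append(name)
            return bundles

        for bundle, names in split_into_bundles(ASSETS).items():
            print(bundle, names)

      The store/launcher then only has to expose "base" and "base + optional_hq" as separate download options, which is roughly what War Thunder's tiers look like from the outside.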

    • Can any games these days be reliably run on HDDs with a max of ~200 MB/s throughput (at best)? Or does everyone get a coffee and some cookies when a new zone loads? Even with this reduction, that will take a while.

      I thought they all required SSDs now for "normal" gameplay.

      3 replies →

  • My immediate question is: if all of that was on-disk data duplication, why did it affect the download size? Can't a small download be expanded into the optimal layout on the client side?

    • Depending on how the data duplication is actually done (with texture atlasing, for example, the actual bits can be very different after image compression), rote bit-level deduplication can be much harder. They could potentially ship the code to regenerate all of that locally, but then they have to deal with a lot of extra rights/contracts to do so (proprietary codecs/tooling is super, super common in gamedev).

      It's also largely because devs/publishers honestly just don't really think about it; they've been doing it for as long as optical media has been prevalent (early/mid 90s). Only in the last few years have devs actually taken a look and realized it doesn't make as much sense as it used to, especially if, like in this case, the majority of load time is spent on runtime generation, or if they require a 2080 as a minimum spec: what's the point of optimizing for one low-end component if most people running the game are on high-end systems?

      Hitman recently (4 years ago) did a similar massive file shrink and mentioned many of the same things.

    • Sure it can - it would need either special pre- and postprocessing or lrzip ("long range zip") to do it automatically. lrzip should be better known, it often finds significant redundancy in huge archives like VM images.
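
      As a toy illustration of the pre-/postprocessing idea (the chunking scheme and manifest format below are invented for the example - this is not how Steam's or lrzip's pipelines actually work): the download carries each unique chunk once plus a manifest of where the copies belong, and the client re-expands that into the duplicated on-disk layout.

        import hashlib, json, os

        CHUNK = 1 << 20  # 1 MiB fixed-size chunks, purely for illustration

        def build_package(files, out_dir):
            """Store each unique chunk once, plus a manifest of which chunks make up which file."""
            os.makedirs(out_dir, exist_ok=True)
            manifest = {}  # relative file path -> ordered list of chunk hashes
            for path in files:
                hashes = []
                with open(path, "rb") as f:
                    while chunk := f.read(CHUNK):
                        digest = hashlib.sha256(chunk).hexdigest()
                        blob = os.path.join(out_dir, digest)
                        if not os.path.exists(blob):  # duplicated data is shipped only once
                            with open(blob, "wb") as b:
                                b.write(chunk)
                        hashes.append(digest)
                manifest[path] = hashes
            with open(os.path.join(out_dir, "manifest.json"), "w") as m:
                json.dump(manifest, m)

        def expand_package(pkg_dir, dest_dir):
            """Client side: rebuild the full (duplicated) on-disk layout from the unique chunks."""
            with open(os.path.join(pkg_dir, "manifest.json")) as m:
                manifest = json.load(m)
            for path, hashes in manifest.items():
                out = os.path.join(dest_dir, path)
                os.makedirs(os.path.dirname(out) or ".", exist_ok=True)
                with open(out, "wb") as f:
                    for digest in hashes:
                        with open(os.path.join(pkg_dir, digest), "rb") as b:
                            f.write(b.read())

      Fixed-size chunks only catch duplicates that happen to be block-aligned; tools like lrzip do long-range rolling matches instead, which is why they find redundancy the sketch above would miss. The principle is the same either way: ship the unique bytes once, duplicate them locally.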

It would be one thing if it was a 20% increase in space usage, or if the whole game was smaller to start with, or if they had actually checked to see how much it assisted HDD users.

But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?

It's kind of exemplary of HD2's technical state in general - which is a mix of poor performance and bugs. There was a period where almost every other mission became impossible to complete because it was bugged.

The negativity is frustration boiling over from years of a bad technical state for the game.

I do appreciate them making the right choice now though, of course.

  • It was a choice, not an oversight. They actively optimised for HDD users, because they believed that failing to do so could impact load times for both SSD and HDD users. There was no speed penalty in doing so for SSD users, just a disk usage penalty.

    Helldivers II was also much smaller at launch than it is now. It was almost certainly a good choice at launch.

    • You make a million decisions in the beginning of every project. I'm certain they made the choice to do this "optimization" at an early point (or even incidentally copied the choice over from an earlier project) at a stage where the disk footprint was small (a game being 7GB when it could've been 1GB doesn't exactly set off alarm bells).

      Then they just didn't reconsider the choice until, well, now.

      3 replies →

  • > But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?

    Have you never worked in an organization that made software?

    Damn near everything could be 10x as fast and use 1/10th the resources if someone bothered to take the time to find the optimizations. Rare is it that something is even in the same order of magnitude as its optimal implementation.

    • But this isn't an optimization. The 150+GB size is the "optimization", one that never actually helped with anything. The whole news here is "Helldivers 2 stopped intentionally screwing its customers".

      I don't see why it's a surprise that people react "negatively", in the sense of being mad that (a) Helldivers 2 was intentionally screwing the customers before, and (b) everyone else is still doing it.

      2 replies →

    • I think what makes this a bit different from the usual "time/value tradeoff" discussion is that bloating the size by 6x-7x was the result of unnecessary work done in the name of optimization, rather than a lack of cycles to spend on optimization.

      2 replies →

This is a mischaracterization of the optimization. This isn't a standard optimization that games apply everywhere. It's an optimization for spinning disks that some games apply sometimes. They're expected to measure if the benefits are worth the cost. (To be clear, bundling assets is standard. Duplicating at this level is not.)

This doesn't advance accepted industry wisdom because:

1. The trade-off is very particular to the individual game. Their loading was CPU-bound rather than IO-bound so the optimization didn't make much difference for HDDs. This is already industry wisdom. The amount of duplication was also very high in their game.

2. This optimization was already on its way out as SSDs take over and none of the current gen consoles use HDDs.

I'm not mad at Arrowhead or trying to paint them negatively. Every game has many bugs and mishaps like this. I appreciate the write-up.

At one point (I think it was Titanfall 2) the PC port of a game deliberately shipped its audio as uncompressed WAV files, inflating the install size. They said it was for performance but the theory was to make it more inconvenient for pirates to distribute.

When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of your precious high-speed SSD for a feature that added nothing to the game.

People probably feel the same about this: why were they so disrespectful of our space and bandwidth in the first place? But I agree it is very nice that they wrote up the details in this instance.

  • > When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of your precious high-speed SSD for a feature that added nothing to the game.

    Software developers of all kinds (not just game publishers) have a long and rich history of treating their users' compute resources as expendable. "Oh, users can just get more memory, it's cheap!" "Oh, xxxGB is such a small hard drive these days, users can get a bigger one!" "Oh, most users have Pentiums by now, we can drop 486 support!" Over and over we've seen companies choose to throw their users under the bus so that they can cheap out on optimizing their product.

    • Maybe that'll start to change now that RAM is the new gold, and who knows what the AI bubble will eat next.

  • > They said it was for performance but the theory was to make it more inconvenient for pirates to distribute.

    This doesn't even pass the sniff test. The files would just be compressed for distribution and decompressed on download. Pirated games are well known for having "custom" installers.

      > The files would just be compressed for distribution and decompressed on download

      All Steam downloads are automatically compressed. It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.

      13 replies →

  • I remember seeing warez game releases in the late 90s where the group compressed sound effects that the original installer stored uncompressed, with custom packaging to decompress them again on install.

    It seems no one takes pride in their piracy anymore.

It's because shitting on game devs is the trendy thing these days, even among more technically inclined crowds unfortunately. It seems like there's a general unwillingness to accept that game development is hard and you can't just wave the magic "optimize" wand at everything when your large project is also a world of edge cases. But it seems like it should be that simple according to all the armchair game devs on the internet.

  • The level of work that goes into even “small” games is pretty incredible. When I was a grad student, another student was doing their (thesis-based, research-focused) master's while working at EA on a Street Fighter(?) game.

    The game programming was actually just as research-focused and involved as the actual research. They were trying to figure out how to get the lowest latency and consistency for impact sounds.

  • The engineer's disease: "I'm smarter than you and I need to prove it, and we're so smart we wouldn't have shipped this code in the first place," bla bla bla.

    Also keep in mind that modern gaming generates more revenue than the movie industry, so it's in the interests of several different parties to denigrate or undermine any competing achievement -- "Bots Rule Every Thing Around Me".

  • For me it's not so much about shitting on game devs as it is about shitting on the ogres that run game companies. Any of us who have done development should understand we have little control over scope and often want to do more than the business allows us to.

    • That is completely OK in my opinion. It's just that most of the discourse I come across treats the developers as complete amateurs who don't know what they're doing. As a professional dev myself, I just can't get behind bashing the people doing the actual work when I know we're all dealing with the same business realities, regardless of industry.

  • Meh, the same is true for almost every discussion on the internet: everyone is an armchair expert on whatever subject you come across, and when you ask them about their experience it boils down to "I read lots of Wikipedia articles".

    I mean, I agree with you that it is trendy and seemingly easy to shit on other people's work, and at this point it seems to be a challenge people take upon themselves to criticise something in the most flowery and graphic way possible, hoping to score those sweet internet points.

    Maybe 6-7 years ago I stopped reading reviews and opinions about newly launched games completely; the internet audience (and reviewers) are just so far off base compared to my own perspective and experience that it has become less than useless. It's just noise at this point.

    • I wish many people's "expertise" at least amounted to reading Wikipedia. It seems for many that is too much, and they either make crap up on the spot or latch onto the first thing they find that confirms their biases, regardless of how true it is.

  • There has long been a trend of both "software engineers" and "computer scientists" being rather uninterested in learning the strategies that game developers use.

    Really, the different factions in software development are a fascinating topic to explore. Add embedded to the discussion, and you could probably start fights in ways that flat out don't make sense.

Many players perceive Arrowhead as a pretty incompetent and untrustworthy developer. Helldivers has suffered numerous issues with both performance and balancing. The bugs constantly introduced into the game (not the fun kind you get to shoot with a gun) have eroded a lot of trust and good will towards the company and point towards a largely non-existent QA process.

I won’t state my own personal views here, but for those that share the above perspective, there is little benefit of the doubt they’ll extend towards Arrowhead.

The negativity comes from the zero effort they put into this prior to launch, forcing people to download gigs of data that was unnecessary.

Game studios no longer care how big their games are if Steam will still take them. This is a huge problem. GTA5 was notorious for parsing the same JSON again, and again, and again during loading, and it was just a mess. Same for HD2: game engines have the ability to only pack what is used, but it's still up to the developers to make sure their assets are reusable so as to cut down on size (a toy sketch of what I mean follows at the end of this comment).

This is why Star Citizen has been in development for 15 years. They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.

The anger here is real. The anger here is justified. I'm sick of having to download 100GB+ simply because a studio is too lazy and just packed up everything they made into a bundle.
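
To make the "only pack what is used" point concrete, here's a toy sketch (the level/dependency data is invented, not any particular engine's format): walk the reference graph from the shipped levels and bundle only the assets that are actually reachable.

    # Hypothetical dependency graph: asset -> assets it references.
    DEPS = {
        "level_01": ["soldier.mesh", "rifle.mesh"],
        "soldier.mesh": ["soldier.tex"],
        "rifle.mesh": ["rifle.tex"],
        "unused_prop.mesh": ["unused_prop.tex"],  # never referenced by any shipped level
    }

    def reachable(roots, deps):
        """Collect everything reachable from the shipped levels; only this goes in the package."""
        seen, stack = set(), list(roots)
        while stack:
            asset = stack.pop()
            if asset in seen:
                continue
            seen.add(asset)
            stack.extend(deps.get(asset, []))
        return seen

    print(sorted(reachable(["level_01"], DEPS)))
    # unused_prop.mesh and unused_prop.tex never make it into the bundle

Engines that do this automatically still depend on the reference data being accurate, which is exactly the "it's still up to the developers" part.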

  • > They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.

    Reminds me of the Crack.com interview with Jonathan Clark:

    Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist began by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say this is not the best way to model a brick wall.

    https://web.archive.org/web/20160125143707/http://www.loonyg...

    • This is very, very common, as there are only a handful of schools that teach this. Displacement mapping with a single poly is the answer. Game-dev-focused schools teach this, but at any other visual media school it's "build a brick, array the brick 10,000 times".

  • There were 20 people working on this game when they started development. Total. I think they expanded to a little over 100. This isn't some huge game studio that has time to do optimization.

    GTA5 had well over 1000 people on its team.

    • Size of team has no bearing on this argument. Saying they were small, so they get a pass on preventing obscene download sizes, is like saying “Napster was created by one man, surely he shouldn’t be held accountable” - but he was.

      When making a game, once you have something playable, the next step is to figure out how to package it. This is included in that effort: determining which assets to compress, package, and ship. Sometimes this is done by the engine; sometimes by the art director.

      2 replies →

The negativity wasn't created in a vacuum. Arrowhead has a long track record of technical mishaps and a proven history of erasing all evidence of those issues without ever trying to acknowledge them. The subreddits, Discord and YouTube comment sections are heavily moderated. I suspect there might be a 3rd party involved in this which doesn't forward any technical issues if the complaint involves any sign of frustration. Even the relationship with their so-called "Propaganda Commanders" (the official moniker for their YouTube partner channels) has been significantly strained in two cases, over trivialities.

It took Sony's intervention to actually pull the game back into a playable state once, resulting in the so-called 60-day patch.

Somehow random modders were able to fix some of the most egregiously ignored issues (like an enemy type making no sound) quickly and effectively. Arrowhead ignored it, then denied it, then used the "gamers bad" tactic and banned people pointing it out. After a long time, they finally fixed it and tried to bury it in the patch notes too.

They have also been caught straight-up lying about changes; the most recent one was "Apparently we didn't touch the Coyote", where they simply buffed enemies' resistance to fire, effectively nerfing the gun.

  • Sony nearly killed all the goodwill the game had accrued when they tried to use the massive player base as an opportunity to force people into their worthless ecosystem. I don't think Sony even has the capability to make good technical decisions here; they are just the publisher. It was always Arrowhead trying to keep up with a massive success they clearly weren't prepared for at all. In the beginning they simply listened to some very vocal players' complaints, which turned out not to be what the majority actually wanted. Player-driven development is hardly ever good for a game.

    • So, players wanting:

      - Their PC to not reboot and BSOD (this was a thing a few months ago)

      - To be able to actually finish a mission (the game still crashes a lot just after extraction; it's still rare for the full team to survive 3 missions in a row)

      - To be able to use weapon customisation (the game crashed when you navigated to the page with custom paints)

      - To continue to run even when anybody else on the team was stimming (yes, any person in the team stimming caused others to slow down)

      - To actually be able to hear one of the biggest enemies in the game

      - To not have to issue stim/reload/weapon change multiple times just for them to work (it's still normal in some cases to press stim 6 times before it activates, for no real reason)

      - To be able to use chat while in a vehicle (this would result in firing your primary weapon instead)

      - To be able to finish drill-type missions (these still bug out a lot)

      - To not be attacked by enemies that phase through buildings

      - To not be attacked by bullets passing through terrain, even though the player's bullets are stopped by it

      are just vocal players' complaints? A lot of those bugs went totally unaddressed for months. Some keep coming back in regressions. Some are just still ongoing. This is only a short list of things I came across while casually playing. It's a rare sight to have a full OP without an issue (even mission hardlocks, still).

      About Sony - I was specifically referring to Shams Jorjani's (CEO of Arrowhead) explanation to Hermen Hulst (the head of PlayStation Studios) of why the review score collapsed to 19%, among other issues.

      13 replies →

Probably because many are purists. It is like how anything about improving Electron devolves into "you shouldn't use Electron."

Many would consider this a bare minimum rather than something worthy of praise.

  • > Probably because many are purists. It is like how anything about improving Electron devolves into "you shouldn't use Electron."

    The Electron debate isn't about purism over details; the Electron debate is about the foundation being a pile of steaming dung.

    Electron is fine for prototyping, don't get me wrong. It's an easy and fast way to ship an application, cross-platform, with minimal effort and use (almost) all features a native app can, without things like CORS, permission popups, browser extensions or god knows what else getting in your way.

    But it should only ever be a prototype and should eventually be shifted to a native application. Unlike Internet Explorer in its heyday, which you could trivially embed as ActiveX without it leading to resource gobbling, if you now have ten apps each consuming 1GB of RAM just for the Electron base to run, the user runs out of memory, because it's like PHP - nothing is shared.

    • Or these devs & users can migrate to a PWA, which will have vastly less overhead, because the runtime is shared and each of those 10 apps you mention would be (or could be, if they have OK data architecture) tiny.

      1 reply →

    • Each person seems to have their own bugbear about Electron, but I really doubt improving Electron to have shared instances à la WebView2 would make much of a dent in the hate for it here.

    • Removing layers is hard though; better to have Electron host a WASM application, which will become a new "native" to be argued about semantically.