Comment by butz

2 days ago

Sadly, nowadays very few, if any, game developers care about performance or optimizations. Look at the recent headline about "Helldivers 2 devs slash install size from 154GB to 23GB" — and it was done by simply deduplicating assets. Gone are the days of finding incredible ways to use fewer opcodes so that the game would feel smoother.

But ... all that duplication was being _done on purpose to achieve better performance_ due to low-level concerns about access times on legacy HDDs?

  • As far as I can tell, there was no actual low-level optimization being done. In fact, it appears they did not even benchmark before committing to 130 GB of bloat.

      Further good news: the change in the file size will result in minimal changes to load times - seconds at most. “Wait a minute,” I hear you ask - “didn’t you just tell us all that you duplicate data because the loading times on HDDs could be 10 times worse?”. I am pleased to say that our worst case projections did not come to pass. These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.
    

    This reads to me as "we did a google search about HDD loading times and built our game's infrastructure around some random Reddit post without reasoning about or benchmarking our own codebase at any point, ever".

I think we can all agree that performance is often an afterthought for game developers, particularly in bigger productions, but HD2 is sort of a bad example of that.
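For what it's worth, the kind of benchmarking they apparently skipped isn't hard. Here's a minimal sketch (purely illustrative, not Arrowhead's actual tooling — `dedupe_report` is a made-up helper) of measuring how much of an asset directory is duplicated by hashing file contents:

```python
import hashlib
import os

def dedupe_report(root: str) -> tuple[int, int]:
    """Walk a directory tree and return (total_bytes, unique_bytes),
    where unique_bytes counts each distinct file content only once."""
    seen: dict[str, int] = {}  # SHA-256 digest -> size of that content
    total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            total += size
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            # Record the size once per distinct content hash.
            seen.setdefault(digest, size)
    unique = sum(seen.values())
    return total, unique
```

Run that once over your build output and you know exactly how much duplication you're shipping — which is presumably roughly what going from 154GB to 23GB looked like once someone finally measured it.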