Comment by 12_throw_away

2 days ago

But ... all that duplication was being _done on purpose to achieve better performance_ due to low-level concerns about access times on legacy HDDs?

As far as I can tell, there was no actual low-level optimization being done. In fact, it appears they did not even think to benchmark before committing to 130 GB of bloat.

  Further good news: the change in the file size will result in minimal changes to load times - seconds at most. “Wait a minute,” I hear you ask - “didn’t you just tell us all that you duplicate data because the loading times on HDDs could be 10 times worse?” I am pleased to say that our worst case projections did not come to pass. These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

This reads to me as "we did a Google search about HDD loading times and built our game's infrastructure around some random Reddit post, without reasoning about or benchmarking our own codebase at any point, ever".
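
The maddening part is how cheap the missing benchmark would have been. Here's a rough sketch of the idea in Python (the file name and sizes are made up, and for honest numbers on a real HDD you'd also have to defeat the OS page cache between passes, e.g. by dropping caches or using O_DIRECT):

    import os
    import random
    import time

    CHUNK = 64 * 1024   # 64 KiB per simulated asset read
    COUNT = 512         # reads per pass
    PATH = "bench.bin"  # hypothetical scratch file

    # Scratch file 4x larger than one pass so scattered reads can roam.
    with open(PATH, "wb") as f:
        f.write(os.urandom(CHUNK * COUNT * 4))

    def timed_reads(offsets):
        start = time.perf_counter()
        with open(PATH, "rb") as f:
            for off in offsets:
                f.seek(off)
                f.read(CHUNK)
        return time.perf_counter() - start

    size = os.path.getsize(PATH)

    # Duplicated/contiguous layout: one long sequential sweep.
    sequential = [i * CHUNK for i in range(COUNT)]
    # Deduplicated layout: same bytes, referenced from all over the disk.
    scattered = random.sample(range(0, size - CHUNK, CHUNK), COUNT)

    print(f"sequential: {timed_reads(sequential):.3f}s")
    print(f"scattered:  {timed_reads(scattered):.3f}s")

    os.remove(PATH)

Run something like that once on an HDD and once on an SSD and you have your own "industry data" in an afternoon, instead of a 130 GB apology.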