Comment by account42
24 days ago
23 GiB can be cached entirely in RAM on higher-end gaming rigs these days. 154 GiB probably does not fit into many players' RAM when you still want something left for the OS and the game. Reducing how much needs to be loaded from slow storage is itself an I/O speedup, and HDDs are not so bad at seeking that you need to go to extreme lengths to avoid it entirely. The only place where such duplication to ensure linear reads may be warranted is optical media.
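To put the seek-cost argument in rough numbers, here's a back-of-the-envelope sketch in Python. The seek time and throughput figures are my own ballpark assumptions, not measurements of any particular drive or game:

```python
# Assumed HDD characteristics (ballpark figures, not measurements):
SEEK_S = 0.010          # ~10 ms average seek, assumed
THROUGHPUT_MBPS = 150   # ~150 MB/s sequential read, assumed

def load_time_s(total_mb, seeks):
    """Time to read total_mb of assets given a number of seeks."""
    return total_mb / THROUGHPUT_MBPS + seeks * SEEK_S

# A hypothetical 2 GB level: one fully duplicated linear read
# vs. 500 scattered reads of shared assets.
linear = load_time_s(2000, seeks=1)
scattered = load_time_s(2000, seeks=500)
print(f"linear: {linear:.1f}s, scattered: {scattered:.1f}s")
```

Under these assumptions the 500 extra seeks add about 5 seconds on top of ~13 seconds of streaming — noticeable, but nowhere near a 5x difference, which is the commenter's point.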
They used "industry data" to make performance estimations: https://store.steampowered.com/news/app/553850/view/49158394...
> These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not.
Instead of, y'know, running their own game on an HDD.
It's literally "instead of profiling our own app we profiled a competitor's app and made decisions based on that".
They started off with the competitors' data, and then moved on once they had their own data, though? Not sure what y'all are complaining about.
They made an effort to improve the product, but because everything in tech comes with side effects, it turned out to be a bad decision, which they rolled back. Sounds like highly professional behavior to me by people doing their best. Not everything works out 100% of the time.
And this might finally reverse the trend of games being >100 GB, as other teams will be able to point to this decision as a reason not to implement this particular optimization prematurely.
If I’m being charitable, I’m hoping that means the decision was made early in the development process, when concrete numbers were not available. However, the article linked above kinda says they assumed the problem would be twice as bad as the industry numbers, and that’s… that’s not how these things work.
That’s the sort of mistake that leads to announcing a 4x reduction in install size.
But if I read it correctly (and I may be mistaken), in actual practice any improvement in load times was completely hidden by level generation happening in parallel, so this performance improvement wasn't worth it.
>In the worst cases, a 5x difference was reported between instances that used duplication and those that did not.
Never trust a report that highlights the outliers before even discussing the mean. Never trust someone who thinks that is a sane way to use statistics. At best they are not very sharp, and at worst they are manipulating you.
> We were being very conservative and doubled that projection again to account for unknown unknowns.
Ok, now that's absolutely ridiculous and treats the reader like a complete idiot. "We took the most extreme scenario reported by something we read somewhere, and doubled it without giving it a second thought, because WTF not? Since this took us 5 seconds to do, we went with that until you started complaining."
Making up completely random numbers on the fly would have made exactly the same amount of sense.
Trying to spin this whole thing into "look at how smart we are that we reverted our own completely brain-dead decision" is the cherry on top.
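To make the outlier point concrete, here's a tiny example with entirely made-up loading-time ratios (duplication vs. no duplication across hypothetical titles), showing how a "5x worst case" headline can coexist with a modest typical improvement:

```python
import statistics

# Invented ratios for illustration only -- not real industry data.
ratios = [1.1, 1.2, 1.2, 1.3, 1.3, 1.4, 1.5, 1.6, 2.0, 5.0]

print(max(ratios))                          # the "5x" headline number
print(round(statistics.median(ratios), 2))  # 1.35: the typical case
print(round(statistics.mean(ratios), 2))    # 1.76: dragged up by the outlier
```

Leading with the maximum instead of the median or mean makes the effect look several times larger than what a typical player would see.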
Are you a working software engineer?
I'm sure that whatever project you're assigned to has a lot of optimization work in the backlog that you'd love to do but haven't had a chance to, because of bugfixes, new features, etc. I'm sure the process at Arrowhead is not much different.
For sure, duplicating those assets on PC installs turned out to be the wrong call.
But install sizes were still pretty reasonable for the first 12+ months. I think it was ~40-60 GB at launch. Not great, but not a huge deal, and they had mountains of other stuff to focus on.
They claim they were following an industry-standard recommendation.
Or, you know, they just didn't really understand industry recommendations or what they were doing.
"Turns out our game actually spends most of its loading time generating terrain on the CPU" is not something you discover by accident. It should have been known before they even thought about optimizing the game's loading time: optimizing without knowing your own stats is not optimizing, and they wrote the code that loads the game!
Keep in mind this is the same team that accidentally caused instantly respawning patrols in an update about "balancing how often enemy patrols spawn". The same group couldn't make a rocket launcher lock on for months while blaming "raycasts are hard", and released a mech that would shoot itself if you turned wrong. They spent the early days insisting that "the game is supposed to be hard" while players struggled with enemy armor calculations that punished you for not shooting around armor whose position was being calculated incorrectly, plus tons of other outright broken functionality that has made it hard to play the game at times.
Not only does Arrowhead have kind of a long history of technical mediocrity (Magicka was pretty crashy on release, and has weird code even after all the bugfixes), but they also demonstrably do not test their stuff very well. They regularly release patches with obvious breakage that you run into seconds after starting play, or even outright regressions, suggesting an inability to use version control properly.
"We didn't test whether our game was even slow to load on HDD in the first place before forcing the entire world to download and store 5x the data" is incompetence.
None of this gets into the utterly absurd gameplay decisions they have made, or the time they spent insulting players for wanting a game they spent $60 on to be fun and working.
Which describes the PS2, PS3, PS4, Dreamcast, GameCube, Wii, and Xbox 360. The PS4 had a 2.5" SATA slot, but the idiots didn't hook it up to the chipset's existing SATA port; they added a slow USB 2.0<->SATA bridge chip instead. So since the sunset of the N64, all stationary gaming consoles have been held back by slow (optical) storage with even worse seek times.
So many game design crimes have a storage limitation at their core, e.g. levels that are just a few rooms connected by tunnels or elevators.
And it IS loading noticeably faster now for many users, thanks to caching. That said, I have to imagine many of those gaming directly off an HDD are not exactly flush with RAM.