Comment by breakingcups
5 years ago
It is absolutely unbelievable (and unforgivable) that a cash cow such as GTA V has a problem like this present for over 6 years and it turns out to be something so absolutely simple.
I do not agree with the sibling comment saying that this problem only looks simple and that we are missing context.
This online game mode made $1 billion in 2017 alone.
Tweaking two functions to go from a load time of 6 minutes to less than two minutes is something any developer worth their salt should be able to do in a codebase like this equipped with a good profiler.
Instead, someone with no source code managed to do this to an obfuscated executable loaded with anti-cheat measures.
The fact that this problem is caused by Rockstar's excessive microtransaction policy (the 10MB of JSON causing this bottleneck are all available microtransaction items) is the cherry on top.
(And yes, I might also still be salty because their parent company unjustly DMCA'd re3 (https://github.com/GTAmodding/re3), the reverse engineered version of GTA III and Vice City. A twenty-year-old game. Which wasn't even playable without purchasing the original game.)
> The fact that this problem is caused by Rockstar's excessive microtransaction policy (the 10MB of JSON causing this bottleneck are all available microtransaction items) is the cherry on top.
For what it's worth, 10MB of JSON is not much. Duplicating the example entry from the article 63,000 times (replacing `key` with a uuid4 for uniqueness) yields 11.5MB of JSON.
Deserialising that JSON then inserting each entry in a dict (indexed by key) takes 450ms in Python.
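For the curious, here is a rough re-creation of that benchmark. The entry shape is made up for illustration (the fields "price", "statName" and "storageType" are guesses, not the article's actual entry):

```python
# Hypothetical sketch: build ~63k JSON entries keyed by uuid4, then time
# parsing + indexing into a dict. Entry fields are invented for illustration.
import json
import time
import uuid

entries = [
    {"key": str(uuid.uuid4()), "price": 12500,
     "statName": "", "storageType": "INT"}
    for _ in range(63_000)
]
blob = json.dumps(entries)
print(f"{len(blob) / 1e6:.1f} MB of JSON")  # several MB, same ballpark

start = time.perf_counter()
parsed = {e["key"]: e for e in json.loads(blob)}  # deserialise + index by key
elapsed = time.perf_counter() - start
print(f"parsed and indexed in {elapsed * 1000:.0f} ms")
```

On a modern machine this lands in the hundreds of milliseconds, i.e. nothing that justifies a multi-minute load.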
But as Bruce Dawson oft notes, quadratic behaviour is the sweet spot because it's "fast enough to go into production, and slow enough to fall over once it gets there". Here odds are there were only dozens or hundreds of items during dev so nobody noticed it would become slow as balls beyond a few thousand items.
Plus, load times are usually the one thing you start ignoring early on: just start the session, go take a coffee or a piss, and by the time you're back it's loaded. Especially after QA has flagged the slow load times half a dozen times, the devs (with fast machines and possibly a smaller development dataset) go "works fine", and QA just gives up.
> Plus load times are usually the one thing you start ignoring early on, just start the session, go take a coffee or a piss, and by the time you're back it's loaded.
In GTA V, when I tried to enjoy multiplayer with my friends the abysmal load times were what killed it for me.
You actually have to load into the game world - which takes forever - before having a friend invite you to their multiplayer world - which takes forever, again.
So both a coffee, and a piss. Maybe they fixed that now?
Then when you want to actually do an activity like a deathmatch you have to wait for matchmaking and then the loading - takes forever. Once you are finally in a match it's okay but as soon as the match ends you have to wait for the world to load again and then queue again which takes bloody forever. Spend 2hrs playing the game and have only a few matches, more time spent looking at loading screens than actually playing anything.
4 replies →
I agree. I played GTA online for a bit and quite enjoyed it but I haven't touched it in a while and the insane loading times are a big reason why.
It kind of baffles me that they haven't bothered to fix this trivial issue when the result is to cut 4 entire minutes of loading time.
5 replies →
> So both a coffee, and a piss.
Reminds me of loading "G.I. Joe" (from Epyx) on the C64 with a 1541 floppy drive. However, the long loads came after every time you died and meant you also had to swap 4 disks.
10 replies →
They didn't fix it. I tried a few days ago, because it's a really fun game... except for these seemingly easy to fix issues that are huge barriers.
> You actually have to load into the game world - which takes forever - before having a friend invite you to their multiplayer world - which takes forever, again.
Is that... the same problem? Is microtransaction data different in your friend's multiplayer world than it is in the normal online world?
1 reply →
Was the new guy at a startup. Soon noticed that Chuck Norris was in our compiled JavaScript. Turned out someone had included the entire test suite in the production deploy.
It had been like that for nearly a year. A few minutes of work brought our client JS file from 12MB down to less than 1MB.
Ha, this is one of the reasons why I also include outlandish and wrong-looking stuff in unit tests. If we see it where it doesn't belong, then we know for sure that we are doing something wrong.
Most often I use Unicode strings in unexpected alphabets (i.e. from languages that are not supported by our application and are not the mother tongue of any developer on our team). This includes Chinese, Malayalam, Arabic and a few more. There was a time when I wanted to test the "wrong data" cases for some deserialising function, and I was part annoyed and part amused to discover that doing Integer.parseInt("٤٣٠٤٦٧٢١") in Java does parse the Arabic digits correctly even without specifying any locale.
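(A side note for the curious: Python behaves the same way, no locale needed. This is just an illustration, nothing to do with the game.)

```python
# Python's int() accepts any Unicode decimal digits, just like Java's
# Integer.parseInt in the anecdote above, with no locale configuration.
print(int("٤٣٠٤٦٧٢١"))  # Eastern Arabic digits, parses to 43046721
print("٤".isdigit())    # Unicode category Nd, so it counts as a digit
```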
1 reply →
> Soon noticed that chuck Norris was in our compiled JavaScript
Is that a library? Or the string "Chuck Norris"?
7 replies →
related: Guy Fieri in node https://nodesource.com/blog/is-guy-fieri-in-your-node-js-pac...
You mention quadratic behaviours and there's probably some truth to that, but it seems to me that it's partly a C++ problem. In any other language nobody would even consider hacking up JSON parsing using a string function. They'd use the stdlib functionality if available, or import a library, and this problem wouldn't exist.
A lot of other languages use the C standard library functions to parse floats (and to do various trigonometric functions), so they may be more similar than you imagine.
1 reply →
But C++ has had hash_set/hash_map since forever (or just set/map, which are still better than this).
I'm sure there are libraries to parse JSON in C++, or at least they should have built something internally if it's critical. Instead they had someone less experienced build it and didn't stress test it?
2 replies →
RapidJSON
That QA bit is too true, "it works for me!" shrug.
> Here odds are there were only dozens or hundreds of items during dev so nobody noticed it would become slow as balls beyond a few thousand items.
Might be, but this particular issue has been raised by thousands of players and ignored for *years*.
Yea, given how easy it was for the author of the post to find it, I would guess that literally nobody in the last decade bothered to run a profiler to see where they were spending time.
The only possible explanation is that management never made it a priority.
I could see this happening. A project/product manager thinking "We could spend $unknown hours looking for potential speedups, or we could spend $known hours implementing new features directly tied to revenue"
Which is kind of ironic since this fix would keep players playing for more time, increasing the chances that they spend more money.
1 reply →
For the online games I worked on (a few of the recent NFS games) the items database was similar to the final set quite early in production and we kept an ongoing discussion about load times.
I really liked this article, but I am a bit surprised that this made it into production. I have seen a few instances of this type of slowdown live for very long, but they tend to be in compile times or development workflow, not in the product itself.
And because a merchandising push in many games may be another 10-50 items, the first couple times the % increase is high but the magnitude is low (.5s to 1s) and by the time you're up to 1000, the % increase is too small to notice. Oh it took 30 seconds last week and now it's 33.
Boiling the frog, as it were. This class of problems is why I want way more charts on the projects I work on, especially after we hit production. I may not notice an extra 500ms a week, but I'm for damn sure going to notice the slope of a line on a 6 month chart.
I heard that the Chrome team had this KPI from very early on: how long it takes Chrome to start up. It has stayed the same to date, i.e. they can't land any change that increases this metric. Very clever if you ask me.
Google lately "optimized" Chrome's "time for the first page to load" by no longer waiting for extensions to initialize properly. The first website you load bypasses all privacy/ad-blocking extensions.
6 replies →
It would be interesting to see what JSON library they used that parses numbers with scanf. Nothing like a painter's-algorithm scenario to really slow things down, but JSON numbers are also super simple and don't need all that work. That is hundreds of MBs of needless scanning for the terminating 0.
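Per the article, the numbers went through sscanf, and many C libraries implement sscanf by running strlen over the whole remaining buffer on every call. A toy model of the resulting blow-up (illustrative Python, not the actual code):

```python
# Toy model of the sscanf pitfall: if every per-number parse first measures
# the whole remaining buffer (as a strlen-based sscanf does), parsing N
# numbers touches O(N^2) characters. Numbers are assumed ~8 chars each.
def chars_scanned_per_parse(n_numbers: int) -> int:
    total_len = n_numbers * 8  # pretend buffer: n numbers of 8 chars each
    scanned = 0
    pos = 0
    for _ in range(n_numbers):
        scanned += total_len - pos  # the strlen() walk over the rest
        pos += 8                    # then consume one number
    return scanned

small = chars_scanned_per_parse(1_000)
big = chars_scanned_per_parse(10_000)
print(big / small)  # ~100x the work for 10x the input
```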
Unlikely to be a library, either it's libc or it's homegrown.
The only thing most game companies do when it comes to external libraries is to copy the source code of it into their repo and never update it, ever.
OpenSSL is this way; it's a required installation for PlayStation, but debugging it is seriously hard, and Perforce (the game industry's version control of choice) can't handle external dependencies. Not to mention Visual Studio (the game industry's IDE of choice..) can't handle debugging external libraries well either.
So, most game studios throw up the hands, say "fuck it" and practice a heavy amount of NIH.
4 replies →
hmm.. the entire pricing table for Google Cloud (nearly 100k SKUs and piles of weirdness) was only ~2MB... seems pretty big.
But is quadratic the real issue? Isn't that a developer answer?
The best algorithms for small, medium and large sizes are not the same, and each generally behaves poorly in the other cases. And what is small? Medium? Large?
The truth is that there is no one-size-fits-all, and assumptions need to be reviewed periodically and adapted accordingly. And they never are... Ask a DBA.
> But is quadratic the real issue?
Yes. That is literally the entirety of the issue: online loading takes 5 minutes because there are two accidentally quadratic loops spinning their wheels.
> The best algorithm for small, medium or a large size are not the same and generally behave poorly in the other cases.
“Behaves poorly” tends to have very different consequences: an algorithm for large sizes tends to have significant set up and thus constant overhead for small sizes. This is easy to notice and remediate.
A naive quadratic algorithm will blow up your production unless you dev with production data, and possibly even then (if production data keeps growing long after the initial development).
quadratic is a fancy way of saying "this code is super fast with no data, super slow once you have a decent amount"
The problem is that when you double the amount of stuff in the JSON document, you quadruple (or more) the scanning penalty in both the string and the list.
Why quadruple? Because you end up scanning a list which is twice as long. You have to scan that list twice as many times. 2x2 = 4. The larger list no longer fits in the fast (cache) memory, among other issues. The cache issue alone can add another 10x (or more!) penalty.
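You can see the 2x2 arithmetic directly by counting comparisons in a naive duplicate scan (illustrative Python, not the game's code):

```python
# Counting comparisons for the article's other bug: each new item is checked
# against every item already inserted, so doubling the items ~quadruples work.
def naive_dedup_comparisons(n_items: int) -> int:
    comparisons = 0
    seen = []
    for i in range(n_items):
        comparisons += len(seen)  # linear scan over everything so far
        seen.append(i)
    return comparisons

ratio = naive_dedup_comparisons(20_000) / naive_dedup_comparisons(10_000)
print(ratio)  # ~4: twice the items, four times the comparisons
```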
4 replies →
In the small case here, there is no meaningful difference in speed between parsers. Using a quadratic algorithm has no advantage and is just an incorrect design.
The popular view is that companies who write software know how to prioritise, so if a problem like this isn't fixed, it's because they've done the calculations and decided it's not worthwhile.
I disagree. If there are no internal incentives for the people who know how to fix this to fix it, or no path from "fixing this could improve revenue" to someone actually being assigned the ticket, things like this won't get fixed. I can fully believe the load times result in fewer users and lower expenditure.
I think we'll see this happen with Facebook Messenger. Both the apps and the website have become slow and painful to use and get worse every month. I think we'll start to see engagement numbers dropping because of this.
You have just described why I laugh anytime someone complains that government is inefficient. ANY organization of sufficient size is "inefficient" because what a large organization optimizes for (for reasons I cannot explain) cannot align with what that organization's customers want optimized.
With the added difference that governments also have to be far more procedural by virtue of the way they are set up. Regardless of size they are accountable and responsible to a far higher degree in the eyes of the population they represent so there is a legitimate reason to be "slow".
In games the added reason to be slow is that game code is by definition some of the least mission critical code one could find (competes with 90% of the internet web code). Your Linux or Windows code might run a hospital's infrastructure or a rover on another planet. A game on the other hand can launch with bugs the size of your windshield, and can stay like that forever as long as people still pay. And people will pay because games are not unlike a drug for many people.
As such most game coding teams and coders are "trained" to cut every corner and skimp on every precaution. They're not needed beyond a very low baseline as far as software is concerned.
Look at the amount of bugs or cheats incredibly popular games like GTA or CoD have. These are billion dollar a year franchises that leave all this crap on the table despite all the money they make. They have all the resources needed, it's a conscious call to proceed like this, to hire teams that will never be qualified enough to deliver a high quality product and will be encouraged to cut corners on top of that.
Source: a long time ago I worked for a major game developer in a senior management role (unrelated to the dev activity) and left after feeling like "facepalm" for too long in every single SM meeting.
3 replies →
> for reasons I cannot explain
Any sufficiently large institution, over time, will prioritise self-preservation over achieving its core mission. This is sociology 101. Once a company has enough users to make it hard or impossible to measure immediate performance, self-preservation is achieved through internal manoeuvring and selling to execs.
14 replies →
Organizations are like spheres. Only a small part of the sphere has an exposed surface that is in contact with the outside world. As you grow the sphere most of the mass will be inside the sphere, not near the surface.
And I laugh every time people claim that it is only governments that can be inefficient. Most large commercial companies are inefficient and almost not functioning at all.
That the process breaks down in some cases doesn't mean they don't know how to prioritize. They clearly know how to prioritize well enough to make a wildly successful and enjoyable game. That doesn't mean no bad decisions were made over the last decade or so of development.
Like anything else, things will be as bad as the market allows. So I'd expect monopolies to do a worse and worse job of making good decisions, and companies in competitive fields to do a better and better job over time. Thus the difference between Take-Two and Facebook, and the need for lower cost of entry and greater competition in all economic endeavors where efficiency or good decision making is important.
Does it? Just because something is good or successful doesn't mean it was made well. That's why we are seeing stories of crunch, and failures of management compensated for by extreme overwork.
2 replies →
> I think we'll see this happen with Facebook Messenger. Both the apps and the website have become slow and painful to use and get worse every month.
The messenger website has been atrocious for me lately. On my high-powered desktop, it often lags a bit, and on my fairly high-end laptop, it's virtually unusable. I thought it must be something I changed in my setup, but it's oddly comforting to hear that I'm not the only one with such issues.
For me it’s Google Maps; it has gotten so freaking slow both on mobile and on desktop. Actually, Google Docs and Sheets are the same.
1 reply →
Try https://www.messenger.com/desktop
3 replies →
> if a problem like this isn't fixed, it's because they've done the calculations and decided it's not worthwhile.
If it ever reached the point where it had to be an item in a priority list, it's already a failure. Some developer should have seen the quadratic behavior and fixed it. It's not the type of thing that should ever even be part of a prioritized backlog. It's a showstopper bug and it's visible for every developer.
> I can fully believe the load times will result in fewer users and lower expenditure.
Does GTA Online still attract new users in droves? I doubt it.
If the old users have lived with the loading time for years, they are likely to continue living with it. It would be nice if Rockstar fixed it, but I doubt it would be anything except a PR win.
I rarely play it and the load time is actually the main factor. If I have an hour to play a game waiting for GTA V to load unfortunately feels like a waste of time and a chore, so I play something else.
Before GTA online they entertained themselves in other ways. Eventually, they'll move on. The more friction there is to continue playing GTA online, the easier it is for there to be something to pull them away. Rockstar are now competing to be a use of someone's time, not for them to buy the game.
1. people experiencing this issue have already bought the game, so there's little incentive here.
2. we can be reasonably sure people will buy newer GTA installments regardless of whether this bug is fixed or not.
but:
3. if there's still money to be made from microtransactions this is a huge issue and would absolutely be worthwhile, imo.
> I think we'll see this happen with Facebook Messenger. Both the apps and the website have become slow and painful to use and get worse every month. I think we'll start to see engagement numbers dropping because of this.
In fact, I think the iOS app for FB Messenger did get a redesign due to those problems, and it was rewritten from scratch? I remember being pleasantly surprised after the big update… It became lightweight, integrates well with iOS and supports platform features.
On the other hand, the desktop app or the website is a shitshow :-(
The iOS app is really much more performant now than it was a couple of years ago. Significantly smaller, and quicker to start up
I would think this would be one of the biggest revenue / developer_time changes in company history, considering how incredibly profitable online is.
I imagine the conversation between the programmer(s) and management went exactly like this:
Management: So, what can we do about the loading times?
Programmer(s): That's just how long it takes to load JSON. After all, the algorithm/function couldn't be more straightforward. Most of the complaints are probably coming from older hardware. And with new PCs and next-gen consoles it probably won't be noticeable at all.
Management: OK, guess that's that then. Sucks but nothing we can do.
Management had no way of knowing whether this was true -- they have to trust what their devs tell them. And every time over the years someone asked "hey, why is loading so slow?" they got told "yeah, they looked into it when it was built, turns out there was no way to speed it up, so not worth looking into again."
And I'm guessing that while Rockstar's best devs are put on the really complex in-game performance stuff... their least experienced ones are put on stuff like... loading a game's JSON config from servers.
I've seen it personally in the past where the supposedly "easy" dev tasks are given to a separate team entirely, accountable to management directly, instead of accountable to the highly capable tech lead in charge of all the rest. I've got to assume that was basically the root cause here.
But I agree, this is incredibly embarrassing and unforgivable. Whatever chain of accountability allowed this to happen... goddamn, there's got to be one hell of an internal postmortem on this one.
I can pretty much guarantee that there was no discussion with management like that. From experience, live ops games are essentially a perpetually broken code base that was rushed into production, then a breakneck release schedule for new features and monetization. I've personally had this conversation a few times:
Programmer: Loading times are really slow, I want to look into it next sprint.
Management: Feature X is higher priority, put it in the backlog and we'll get to it.
At my last job I had that conversation several times :( Our website would regularly take 30s+ for a page to load, and we had an hour of scheduled downtime each week, because that’s how long it took the webapp to restart each time code was pushed. “Scheduled downtime doesn’t count as downtime, we still have the three 9’s that meets our SLA, and there’s nothing in the SLA about page load times. Now get back to building that feature which Sales promised a client was already finished”...
Aside from being generally shameful, the real kicker was that this was a "website performance & reliability" company x__x
3 replies →
Former gamedev here. I can vouch for this conversation. The idea that management would ask about slow loading times doesn't seem to be too realistic in my experience.
I once did get to look at optimising a bad load time... only because at that point it was crashing due to running out of memory. (32-bit process)
In this case it was JSON, but using .NET, so Newtonsoft is at least efficient. The issues were many cases of:
* Converting strings to lowercase to compare them, as the keys were case insensitive (I replaced this with StringComparison.OrdinalIgnoreCase)
* Redundant dictionaries
* Using Dictionary<Key, Key> as a set (replaced with HashSet)
The data wasn't meant to be that big, but when a big client was migrated it ended up with 200MB of JSON. (If their data had been organised differently it would've been split across many JSON blobs instead.)
It would also be nice to handle it all as UTF-8 like System.Text.Json does. That would halve all the strings, saving a fair bit. (I mean, the JSON blob it starts with gets converted to UTF-16, because .NET strings are UTF-16.)
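The UTF-16 point is easy to see with byte counts (Python here purely for illustration; .NET strings are UTF-16 in the same way):

```python
# ASCII-heavy JSON doubles in size when held as UTF-16 (e.g. .NET or Java
# strings) compared to the UTF-8 bytes it arrived as.
blob = '{"key": "some_item", "price": 12500, "type": "INT"}' * 1_000
utf8 = blob.encode("utf-8")
utf16 = blob.encode("utf-16-le")  # 2 bytes per ASCII char
print(len(utf16) / len(utf8))  # 2.0 for pure-ASCII text
```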
Ughh. Every time I want to fix anything I have to sneak it in with something else I'm working on, or wait so long for approval that I've forgotten all the details.
2 replies →
Normally I'd agree with you, but this particular problem is SO visible, and experienced by everyone, that I have to think management must have looked into it. I mean, it's the loading screen. They couldn't not encounter it themselves, every time.
But I could be wrong. I've only worked with sites and apps not gaming.
4 replies →
I experienced this to incredible extremes. It was taking our clients two hours to load their data into an external tool that they use, daily, to view that data. This was caused by our use of a data store that was only half supported by that tool. I showed that if we took two weeks to switch to the other data store, their load times would be 30 seconds. It took two years, and a new manager, to get that implemented. Unfortunately, most of the clients are still using the old data store. They haven't had time to switch to the new version...
I had that exact conversation at my old job. They only started listening to me when some requests started timing out because of Heroku's 30 second request duration limit.
Same thing happened with memory consumption. The problem was ignored until we regularly had background jobs using > 4 GB of memory, causing cascading failures of our EC2 instances and a whole bunch of strange behaviour as the OOM killer sniped random processes.
Alternatively (from my experience):
Programmer(s): Can we set aside some time to fix the long loading times?
Management: No, that won't earn us any money, focus on adding features
That's not how it's been in any of my past gamedev jobs.
I work LiveOps and usually long loading times are something that we would take seriously, as it negatively impacts the reputation of the game.
The old maxim of "Premature optimization is the root of all evil" has over time evolved to "If you care one iota about performance, you are not a good programmer".
That belief is getting a bit outdated now that computing efficiency is hitting walls. Even when compute is cheaper than development, you're still making a morally suspect choice to pollute the environment over doing useful work if you spend $100k/yr on servers instead of $120k/yr on coding. When time and energy saved are insignificant compared to development expense is of course when you shouldn't be fussing with performance.
I don't think the anti-code-optimization dogma will go away, but good devs already know optimality is multi-dimensional and problem specific, and performance implications are always worth considering. Picking your battles is important, never fighting them nor knowing how is not the trick.
I agree 100% - the whole cheery lack of care around optimization to the point of it becoming 'wisdom' could only have happened in the artifice of the huge gains in computing power year on year.
That said, people applying optimizations that sacrifice maintainability or increase bugs for very little gain are doing a disservice. People who understand data flow and design systems that are optimal from the get-go are where it's at.
And fwiw, the full quote is:
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.
Yet we should not pass up our opportunities in that critical 3%.
Also, it’s not “never optimise”. It’s “only optimise once you’ve identified a bottleneck”. I guess in a profit-making business you only care about bottlenecks that are costing money. Perhaps this one isn’t costing money.
12 replies →
Then again, using the correct data structure is not really optimisation. I usually think of premature optimisation as unnecessary effort, but using a hash map isn't it.
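For anyone who hasn't internalized it, the gap between the two data structures is easy to demonstrate (illustrative Python):

```python
# Membership test: a list scans linearly, a set hashes. Same one-liner,
# completely different scaling behaviour.
import timeit

items = list(range(50_000))
as_list = items
as_set = set(items)
needle = items[-1]  # worst case for the list: the last element

list_time = timeit.timeit(lambda: needle in as_list, number=100)
set_time = timeit.timeit(lambda: needle in as_set, number=100)
print(list_time > set_time)  # the set wins by orders of magnitude
```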
6 replies →
That doesn't really apply here. I don't even play GTA V, but the #1 complaint I've always heard for the past 6 years is that the load times are the worst thing about the game. Once something is known to be the biggest bottleneck in the enjoyment of your game, it's no longer "premature optimization". The whole point of that saying is that you should first make things, then optimize the things that bring the most value. The load time is one of the highest-value things you can cut down on. And the fact that these two low-hanging fruit made such a big difference tells me they never gave it a single try in the past 6 years.
Sure it does apply. These complaints come out after the game has been released. They should have optimized this before they released, while they even designed the system. However that's considered premature optimization, when in fact it's just bad design.
4 replies →
We used to start up Quake while we waited then we'd forget about GTAO. Later we'd discover GTA had kicked us out for being idle too long. Then we'd just close it.
That should be embarrassing for Rockstar but I don't think they would even notice.
The problem here isn't a lack of optimization, it's a lack of profiling. Premature optimization is a problem because you will waste time and create more complex code optimizing in places that don't actually need it, since it's not always intuitive what your biggest contributors to inefficiency are. Instead of optimizing right away, you should profile your code and figure out where you need to optimize. The problem is that they didn't do that.
I'd like to add, while GTA is running I'm really impressed by its performance, even / especially when it was on the PS3; you could drive or fly at high speed through the whole level, see for miles and never see a loading screen. It is a really optimized game, and that same level is continued in RDR2.
Which makes the persisting loading issue all the weirder.
I think this part of Knuth's quote is central:
> Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered.
And he is explicitly advocating for optimizing the critical part:
> Yet we should not pass up our opportunities in that critical 3%.
And somehow, people have latched onto the catchphrase about "early optimization" that was taken out of context.
I went back and had a look at that maxim a few years ago and found that it actually doesn't say what many people claim it says. And definitely not as the blanket excuse for slow code that it has always, to some degree, been used as.
The reason for the misunderstanding is that the kinds of practices it actually talks about are uncommon today. People would often take stuff written in higher-level languages and reimplement it in assembler or machine code, which makes it more time-consuming to change/evolve.
It also isn't like it is hard to figure out which part of a piece of software is taking up your runtime these days. All worthwhile languages have profilers, mostly free, so there is zero excuse for not knowing what to optimize. Heck, it isn't all that uncommon for people to run profiling in production.
Also, it isn't like you can't know ahead of time which bits need to be fast. Usually you have some idea, so you will know what to benchmark. Long startup times probably won't kill you, but when they are so long that it becomes a UX issue, it wouldn't have killed them to have a look.
Back in the day, when people talked about premature optimization it was about trivial things like asking on Stack Overflow whether a++, ++a or a += 1 is faster. Worrying about that is obviously a net loss since ultimately it literally doesn't matter. If it matters to you, you are already an expert in the subject and should just benchmark your code.
I am not sure I would call that "evolution".
It's not just believable, it's normal. I have spent quite a bit of my career maintaining software, and I don't recall one employer where low-hanging fruit like this wasn't available everywhere.
The problem is not that developers can't optimize things: you will find some developers capable of figuring this problem out anywhere. What makes this low hanging fruit so popular is the fact that we aren't measuring enough, and even when we do, we aren't necessarily prioritizing looking into things that are suspiciously slow.
In the case of this example, the issue is also client-side, so it's not as if it's costing Rockstar CPU time, and it's unlikely anyone can claim their job description includes wondering whether the load times are worth optimizing. When problems like this one get solved, it's because someone who is very annoyed by the problem either looks into it themselves or convinces a developer to look into it. Most of the time, the people that suffer, the people that get to decide how to allocate the time, and the people that are in a position to evaluate the root cause of the problem never even get to talk to each other. It's the price we often pay for specialization and organizations with poor communication.
Organizations where the people deciding what has to be done next aren't the ones feeling the pain, and where the company culture dictates that the way forward is either to complete more tickets faster or to find ways to be liked by your manager, are not going to foster the kind of thinking that solves a problem like this one. But that's a lot of what you find in many places. A developer with a full plate who is just working on the next feature isn't going to spend their time wondering about load times.
But instead we end up blaming the developers themselves, instead of the culture that they swim in.
Hear hear, we should make a punch a dummy manager/BA/Code standards 'lead' day...
This code looks like someone with almost no experience hacked it together but because they were an intern and likely Rockstar is a toxic place to work, it never gets prioritized to be fixed.
I think if managers prioritized cycle-time metrics more, they'd find that they are encouraging a lot of practices which lead to horrible inefficiencies - "measure twice, cut once" is a positive mantra which leads to more solid designs with fewer bugs.
Agile sort of addressed this problem but unfortunately only at small size scales. Iteration and story capacity got overprioritized over quality, customer engagement, and self-leading teams.
Plus things such as scaled agile suffer from the oxymoron of planning fast iteration: if you have a master plan you lose the ability to respond to change and iterate, and if you allow iteration you must accept that whenever any team iterates, the whole group must discard the plan. At some point you either accept high cycle times or you decouple functionality to the extent that the planning becomes the standard waterfall fallacy - wasting meeting time going over a plan that isn't based on anything.
I suspect that the core engine programmers moved onto other projects long ago, leaving GTA:O running with mostly artists and scenario designers to produce more DLC.
This bug wouldn't present in the first couple years with the limited amount of DLC, so by the time it got ridiculous there wasn't anyone left with the confidence to profile the game and optimize it. A junior dev could fix this, but would probably assume that slow loads are a deep complex engine problem that they won't be able to fix.
Alternatively, management would declare that there's too much risk doing more technical engine work, and not sign off on any proposed "minor" optimizations because it's too risky.
> This bug wouldn't present in the first couple years with the limited amount of DLC
GTA Online loading times have been infamous for a very long time. They were already unjustifiably bad when they released the game for PC, and at that point engine programmers would surely be involved.
The 1.5 minute default load time is also ridiculous.
This is very much the likely scenario. The money is in further DLC. The existing GTAO engine is "done" from their perspective.
I'd guess also that the next version of the base engine is in RDR2 or later and doesn't have these issues. But at the same time they likely wouldn't backport the changes for fear of cost overruns.
Something I've noticed in highly successful companies is that problems never get fixed because the sound of the money printer from the core business is deafening.
Our customer portal loads 2 versions of React, Angular, Knockout and jQuery on the same page? Doesn't matter, it's printing billions of dollars.
Rockstar's money printer is so loud that they don't care about problems.
Same thing for Valve, their money printer is so loud that they barely bother to make games anymore and let the Steam client languish for years (how did they let Discord/Twitch happen?).
>Same thing for Valve, their money printer is so loud that they barely bother to make games anymore and let the Steam client languish for years (how did they let Discord/Twitch happen?).
Not sure that's a fair criticism.
Alyx was widely praised. Artifact... wasn't. I don't know about Dota Underlords. And that's just the last couple of years.
They've also developed hardware like SteamLink, Steam Controller, some high-end VR gear...
They develop a LOT. They just don't release a whole lot.
I agree there should be a lot more work and effort in the client. And they constantly fuck up handling their esports.
But I don't think "barely bother to make games anymore" is a fair one.
It's fair. They are not a games developer anymore. Alyx was good but a boutique game made by a behemoth games marketplace company and before it came out, almost a decade had passed since Portal 2...
1 reply →
Look at sharepoint. It's a total nightmare of a platform to develop on, but the people just adapted and built their businesses on it
I suspect sharepoint is the platform version of excel & VBA. You & I might hate it, but it gets powerful capabilities into the hands of regular people.
sharepoint is probably many non-engineer's very first exposure to actual version control with checkouts, checkins, version history, and merge.
2 replies →
What makes it a nightmare? Can you share a few issues you've run into?
6 replies →
I still have no idea what Sharepoint even is. The way I’ve always seen it used is a way to host sites with file hosting tied to them. It feels like an over engineered CMS.
1 reply →
What does Valve / Steam have to do with Discord?
In addition to other comments, Steam Chat had significant in-roads with the gaming audience that would eventually form the foundation of Discord. It is quite plausible, had Steam improved chat earlier, that Discord might have never gotten the traction it got.
Nowadays, I find Steam Chat is a ghost town.
Valve only recently implemented a semi Discord-clone(with way better quality voice chat, give it a try some time if you haven't yet).
Their chat system has been famously bad and mostly unchanged since the early 2010's, and only very recently was reworked into this.
1 reply →
I think the parent was implying that steam groups could have improved to the point where Discord would not be necessary.
1 reply →
is it really unbelievable? companies this big tend to prioritize hiring a shit ton of middlemen (VPs, project managers, developer managers, offshore managers) in order to avoid paying out for talent to build and constantly maintain the project. I guess paying a shit ton of money to 1 person to manage 10+ poorly paid contractors works out for them, accounting wise.
If one really examined the accounting for GTAO, I would bet that most of the billions of dollars that were earned in micro transactions went to marketing, product research, and to middle management in the form of bonuses.
Even if you view this as a business decision rather than a technical one, any smart project manager would realise a 6 minute loading time literally costs the company millions per year in lost revenue. (How many times have you felt like firing up GTA Online only to reconsider due to the agonising load time?) I would guess this was simply a case of business folk failing to understand that such a technical issue could be so easily solved, plus developers never being allowed the opportunity to understand and fix the issue in their spare time.
The insane loading times are literally the exact reason I haven’t played in years. Every time I played I just ended up frustrated and got distracted doing something else while waiting, so I just quit playing altogether. I don’t know how people stand the loading times.
4 replies →
The people who observe the slow loading time already paid for the game, so I guess R* won't lose much revenue because of this nasty bug.
7 replies →
It's kind of hard to believe. GTA5's online mode is their cash cow, and 6 minute load times are common?! It's kind of amazing people even play it with those load times. It's such a simple problem that one dev could have found and fixed it within a day.
It's not at all hard to believe if you've been playing video games for a while.
Everything is getting slower and slower, and nobody cares.
When I played the Atari 2600, I had to wait for the TV to warm up, but otherwise there were no games with anything approaching load times (with 128 bytes of RAM in the console, who would know). The NES didn't have much in the way of load times either, but you did have to fiddle with the cartridge slot. SNES and Genesis didn't usually load (Air Buster being a notable exception). CD based systems sure did like to load, but that's somewhat understandable. In the mean time, more and longer boot screens. The Switch uses a cartridge (or system flash/SD cards), but it likes to load forever too.
PC Gaming has had loading for longer, but it's been getting longer and longer.
Some arcade games have lengthy loading sequences, but only when you turn them on while they read their entire storage into RAM so they can be fast for the rest of the time they're on (which in arcades is usually all day).
12 replies →
I am always amused by comments like this. You have no idea what development practices they follow (neither do I) but it's hilarious to read your tone.
GTA has achieved tremendous success both as an entertaining game and as a business. It's enjoyed by millions of people and generates billions in revenue. As per this article, it has startup problems (which don't seem to actually really hurt the overall product but I agree sound annoying) but the bigger picture is: it's a huge success.
So - Rockstar has nailed it. What exactly is your platform for analyzing/criticizing their processes or even having a shot of understanding what they are? What have you build that anyone uses? (not saying you haven't, but.. have you been involved with anything remotely close to that scale?)
And if not, whence the high horse?
You can be right in many places and still wrong in some, and enjoy enormous success as a result of all you have done well. That does not mean nobody can criticize you for something that you have clearly done wrong.
Successful people and businesses can be wrong. You are not making a case for why those development practices are okay, but are simply appealing to authority.
I and most other customers would argue that 6 minute loading times are atrocious, and if there is an easy fix like this, it makes me lose a lot of respect for the developer who doesn’t fix it. It maybe would even make me avoid them in the future.
A reputation is built over years, but can be lost pretty much instantly. Companies have to continue serving their customers to enjoy ongoing success.
2 replies →
I don’t need to be successful to have a platform to be outraged. It doesn’t matter that it’s Rockstar, if anything, the fact that they’re so successful and couldn’t be bothered to save so many people literal hours of their lives in loading time makes it worse.
Why is this getting voted down, are there that many cynical people out there?
1 reply →
GTA is fine ... but the storytelling is meh. Missions keep repeating, and there's little to draw you in. You drive somewhere, somebody gets whacked, you drive back. Rinse and repeat. The makers try to compensate with shocking and crass violence and humor, but at some point it just feels kind of juvenile.
Maybe it got better in recent releases, I kind of stopped following after GTA4.
Tell that to Valve. The Source engine and all its games (Half Life 1, 2, Portal, Alyx) have horrible load times. They might not be as bad as the GTA example but they're long and extremely frustrating.
And yet, no one cares, Those games (and GTA5) all sold millions of copies.
The only way this stuff gets fixed is if (a) some programmer takes pride in load times or (b) customers stop buying games with slow load times.
(b) never happens. If the game itself is good then people put up with the load times. If the game is bad and it has bad load times they'll point to the load times as a reason it's bad but the truth is it's the game itself that's bad because plenty of popular games have bad load times
Also, any game programmer loading and parsing text at runtime by definition, doesn't care about load times. If you want fast load times you setup your data so you can load it directly into memory, fix a few pointers and then use it where it is. If you have to parse text or even parse binary and move things around then you've already failed.
I think there may sort of be another thing going on: Basically, that the length of load time is an indicator, "This is a really serious program." I've sort of noticed the same thing with test machines: the more expensive the machine, the longer it takes to actually get to the grub prompt.
Six minutes is probably excessive, but having GTA take 1-2 minutes to load almost certainly makes people feel better about the money they spent on the game than if it loaded up in 5 seconds like some low-production 2D adventure game.
> but having GTA take 1-2 minutes to load almost certainly makes people feel better about the money they spent on the game than if it loaded up in 5 seconds like some low-production 2D adventure game.
Given that it has been the most common criticism of the game since it launched, I don't think anyone views it as a sign of quality.
Do you have a link to how one would achieve this data-pointer magic? I wouldn't know what to search for.
In the simplest case, in C you can read file data into memory, cast it as a struct, then just use that struct without ever doing any parsing.
As things get more complex you're probably going to need to manually set some pointers after loading blobs of data and casting them.
It's just the standard way of dealing with binary files in C. I'm not sure what you'd need for search terms.
1 reply →
Just use flatbuffers/capnproto.
1 reply →
It was probably fast 10 years ago when the store had a couple of items; the dev back then never thought that it would grow to 60k items. Classic programming right there.
As for profiling, Windows Performance Toolkit is the best available no?
Meh. It's ok to assume a low number of items and code accordingly. What is not ok is for the company to ignore such a problem for years, instead of detecting and fixing it.
Small nitpick: I believe these are items/prices for the in-game currency, not micro-transactions.
You can buy in-game currency for real world money tho: https://gta.fandom.com/wiki/Cash_Cards
Not 100% sure, never bought anything.
So what you are saying is they are in fact microtransactions.
I've worked a number of places where the engineering culture discourages any sort of fishing expeditions at all, and if I weren't so stubborn everything would take 2-4x as long to run as it does. I say engineering culture, because at 2 of these places everyone was frustrated with engineering because the customers wanted something better, but the engineers would point at flat flame charts, shrug, and say there's nothing that can be done.
Bull. Shit.
There's plenty that can be done because there are parts of a process that don't deserve 15% of the overall budget. The fact that they are taking 1/6 of the time like 5 other things is a failure, not a hallmark of success. Finding 30% worth of improvements with this perspective is easy. 50% often just takes work, but post-discovery much of it is straightforward, if tedious.
My peers are lying with charts to get out of doing "grunt work" when there's a new feature they could be implementing. But performance is a feature.
>It is absolutely unbelievable (and unforgivable) that a cash cow such as GTA V has a problem like this present for over 6 years and it turns out to be something so absolutely simple.
Having played the game, it's not surprising to me in the least.
I have never yet encountered another such 'wild-west' online experience.
It's the only game so un-moderated that I've ever played where the common reaction to meeting a hacker who is interested in griefing you is to call your white-knight hacker friend and ask him to boot the griefer-hacker from the lobby.
Reports do next to nothing -- and 'modders' have some very real power in-game, with most fights between 'modders' ending in one of them being booted to desktop by the other exploiting a CTD bug (which are usually chat text-parser based..)
On top of all this, Rockstar attempts to have an in-game economy , even selling money outright to players in the form of 'Shark Cards' for real-life currency , while 'modders' (hackers) easily dupe gold for anyone that may ask in a public lobby.
This isn't just all coincidence; the game lacks any kind of realistic checks/balances with the server for the sake of latency and interoperability -- but this results in every 13 year old passing around the Cheat Engine structs on game-cheating forums and acting like virtual gods while tossing legitimate players around lobbies like ragdolls -- meanwhile Rockstar continues releasing GTA Online content while ignoring playerbase pleas for supervision.
It's truly unique though -- an online battlefield where one can literally watch battles between the metaphorical white hat and black hat hackers; but it's a definite indicator of a poorly run business when vigilante customers need to replace customer service.
Also, an aside: most 'mod-menus' -- the small applets put together using publicly available memory structs for game exploits -- have a 'quick connect' feature that allows hackers to join lobbies much faster than the GTA V client usually allows for. This feature has existed for years and years, and I believe it performs tricks similar to those listed in the article.
>Reports do next to nothing -- and 'modders' have some very real power in-game, with most fights between 'modders' ending in one of them being booted to desktop by the other exploiting a CTD bug (which are usually chat text-parser based..)
Interesting, back in the day the coolest CTD I did was to simply crank up the combo multiplier in S4 League to crash the game client on every player in the room except mine since that game was peer to peer and thus any form of hacking (teleportation, infinite melee range, instant kill, immortality, etc) was possible. The combo multiplier was set to 256 and thus every single particle was duplicated 256 times and this caused the game to crash.
> This online gamemode alone made $1 billion in 2017 alone.
There's the answer right there. They figure it's making $1B/yr, leave it alone. Maintenance? That cuts into the billion. Everyone moved onto the next project.
Or they fix it, see that their "in game time" average drops, and then back it out...
I would not at all be surprised if the long load time made for a sunk cost that kept people playing for longer sessions rather than picking it up for less than half an hour at a time.
3 replies →
You might be onto something here...
I stopped playing GTAV online a few years back because of the crazy load times, not only that but you have to go through 6+ minute load screens multiple times in many sessions.
This oversight has cost them millions of dollars easy.
If it made over $1b in a year previously, and had such insane load times, it's very plausible this bad coding has cost them north of another $1b.
Probably ranks pretty highly up there in terms of damage to company financials, due to a lack of care.
In my experience most engineers have never used a profiler even once. They write the code, and if you're lucky they get it working correctly.
Let's call them "code technicians" instead of engineers, ok? (that's a euphemism for "code monkeys")
> the reverse engineered version of GTA III and Vice City
Ohhh. Thank you for telling me about this. I just found a mirror and successfully built it for macOS. Runs so much better than the wine version. But I guess I'll never finish that RC helicopter mission anyway lol
“worth their salt” is doing a lot of work here. No true Scotsman fallacy?
I think you might be surprised by how few programmers even know what a profiler is, let alone how to run one.
That seems like a misapplication of the fallacy. If we assume 'worth their salt' is a synonym for 'good', then saying any good developer can operate a profiler is entirely reasonable.
I used to play this game a lot on PS4. I actually dropped it due to the ridiculous loading times... I still assumed it was doing something useful though. I can't believe they wasted so much of my time and electricity because of this. Even cheaply-made mobile games don't have bugs like this.
> their parent company unjustly DMCA'd re3
Wow, this is EA games level scumbaggery... I don't think I'm gonna buy games from them again.
> salty because their parent company unjustly DMCA'd re3
Unjustly, but legally. The people you should be salty at are the lawmakers.
Why can't I be salty at both? People have responsibility for their actions even if those actions are legal.
That still remains to be seen. A DMCA is not a court order. Anyone can file one and take a repository offline for two weeks.
Yes but the code was clearly derived directly from a decompiled binary; not ‘clean room’ reverse engineering. Hence, illegal, regardless of whether a dmca takedown notice is filed.
6 replies →
> obfuscated executable loaded with anti-cheat measures
I'm impressed that gamecopyworld.com is still online, updated, and has the same UI that it did in 2003
Whoa, what a (refreshing) blast from the past.
Also: Rockstar being too cheap to implement anti-cheat on the by far most successful online shooter on the planet.
Also
> I don’t think there’s any easier way out.
lmfao
This may be obvious, but is GTAV the most successful online shooter on the planet? (Never played it)
I don't think it's obvious. In terms of player count (which is how I would personally rank success), it is not the most successful by a long way: https://en.wikipedia.org/wiki/List_of_most-played_video_game...
However, games like PUBG and Fortnite are free-to-play (PUBG is only free on mobile?) so in terms of actual sales, you could say GTA is more successful. Still not sure I'd class it as an "online shooter", though.
2 replies →
It's the second best selling game of all time[1], and because Minecraft doesn't have guns, I suppose that would qualify it as the most successful online shooter (even if I think another genre would be more applicable).
[1]https://en.wikipedia.org/wiki/List_of_best-selling_video_gam...
5 replies →
Not very surprising. Twitter doesn't work properly on my desktop, google freezes when showing the cookiewall, github freezes my phone. These are all important projects of billion dollar companies.
> It is absolutely unbelievable [...] that a cash cow [...] has a problem like this
Likely it wasn't fixed precisely because it's such a cash cow. "It's making money, don't fuck with it".
Maybe long load times are advantageous? Because it creates longer user sessions on average? If you devote 10 minutes to loading the game you will probably want to play for at least 30 minutes.
Wouldn't the most impatient customers be more likely to pay for items, rather than earn them in game?
Does anyone have a link to a copy of re3? Iirc, there was a gitrepo that kept a copy of all DMCA'd repos
try the hacker news search (bottom of the page) and you'll find stories on the takedown where there are links to backups posted in the comments.
I am not saying it is the case, nor do I understand the details of the solution in depth enough to comment on it, but by analogy this reads to me like yelling at the person who figured out the steps to solve a Rubik's cube, because once the steps are known the solution is simple.
No, other people have pointed this out, this should have been very easy to recognize as inefficient in the source code. More likely the code was written hastily and only tested against very small inputs, and then nobody ever actually tried to improve the famously long load times.
The sscanf issue was not obvious: it looks linear. And should be, on a better sscanf implementation.
The duplicate checking on the other hand is a classic "accidentally quadratic" case that is obvious.
> This online gamemode alone made $1 billion in 2017 alone.
which of course goes to show that at least from a business side, this issue is completely inconsequential and all resources should be used to push for more monetization (and thus adding to the problem by adding more items to the JSON file) rather than fixing this issue, because, clearly, people don't seem to mind 6 minutes loading time.
I'm being snarky here, yes, but honestly: once you make $1 billion per year with that issue present, do you really think this issue matters at all in reality? Do you think they could make $1+n billion a year with this fixed?
The bigger the scale, the more a few percentage points of improvement are worth. I would generally think if you're at $1bn in revenue you should devote 1%+ of your workforce towards finding low-hanging fruit like this. If 1% of employees deployed to find issues that, when fixed, yield a 2% improvement in revenue, that's likely a winning scenario.
This is losing them money. If they fixed the issue they absolutely would get $1+n billion instead of just $1 billion and that n alone is big enough to pay multiple years worth of 6 digit salaries just to fix this single bug.
I work in a large multi billion company and we have people staring at a slow problem for a decade before a noob comes with a profiler and find they browse every key of a Map instead of calling get and such. Or do 1 million db queries on a GUI startup...
Not surprised they didn't bother for 6 minutes when it takes us 10 years to fix a 30-minute locked startup.
I find it absolutely believable that a for-profit company does not prioritize fixing a game that is already a cash cow anyway.
> It is absolutely unbelievable (and unforgivable) that a cash cow such as GTA V has a problem like this present for over 6 years
Agree. I found the slow behavior of sscanf while writing one of my first C programs during an internship^^ You literally just have to google "scanf slow" and find lots of information.
It could be that when GTA Online first launched, using a list instead of a hashmap wasn't too much of an issue due to the limited catalog, but it got progressively worse as the inventory grew.
Ofc this is just a hypothesis, but I see the hesitation to change legacy code if it ain't broken as a widespread mentality.
> I see the hesitation to change legacy code if it ain't broken as a widespread mentality.
Load times measured in double digit minutes on a significant number of machines meets absolutely every reasonable definition of "broken".
Maybe it was outsourced. I don't understand how a team could make such an excellent game and fail to resolve such a simple bottleneck.
Focus is on the money, not on the technicals, on the production side. It is a game, to entertain and ultimately "waste time" on the consumer side. Also, on the topic of parsing, 1-11 GB/sec is possible if you go binary. Point is: this isn't a technical problem, it's a choice.
Can you quantify the additional profit they would have made if this were fixed N years ago?
Well this sprint we didn't release any new features but we reduced the load.... Dammit hackernews!
What is complicated about it is that an online modern 3d game is huge, and there are 95,000 places where a dumb mistake could hurt performance a lot for some customers. You catch 94,999 of them and then "unforgiveable"
If it was that way for a few months and then fixed... still pretty shoddy but sure. However, it has been that way for YEARS, and is one of the most common complaints among players. I wonder how much of the remaining loading time could actually be shaved off if someone with the source code took a crack at it.
> It is absolutely unbelievable (and unforgivable) that a cash cow such as GTA V has a problem like this present for over 6 years and it turns out to be something so absolutely simple.
It is both believable and - by virtue of the fact that, as you said, the series continues to be a cash cow - is apparently forgivable.
Here's the thing: the company has zero reasons to fix this, or other ostensibly egregious affronts like DRM, because gamers keep buying the product. There is literally no economic incentive to 'fix' it.
How many players have they lost to excessive load times?
> How many players have they lost to excessive load times?
Judging by the number of successful sequels the franchise has spawned, the answer is 'an insignificant number'.
The fact of the matter, and the point my comment was trying to make, is that the overwhelming majority of players do not sufficiently care about load times or other complaints to deter them from paying for the game. That is the reality of the situation.
1 reply →
I've lost interest in a lot of online games because the clients always do their mandatory updates on launch, which is exactly the time you want to play the game.
(Same thing for websites - they show you all the annoying popup signup sheets/survey questions the instant you load the page.)
1 reply →
Attitudes like yours are why gamedevs keep to themselves.
"Unbelievable" and "unforgivable" eh? It's a greedy attitude. Instead of viewing GTA5 as a success that's brought a lot of people happiness, you view it as a money cow designed to extract every last bit of profit – and time, since this bug caused 70% longer loading times.
Perhaps it's both. But you, sitting here behind a keyboard with (correct me if I'm wrong) no gamedev experience, have no idea what it's like on a triple-A gamedev team with various priorities. The fact that the game works at all is a minor miracle, given the sheer complexity of the entire codebase.
The fact that someone was able to optimize the obfuscated executable is a wonderful thing. But they weren't a part of the team that shipped GTA 5. If they were, they certainly wouldn't have been able to spend their time on this.
This kind of excuse making is one of the reasons I got out of software development. It’s not just gamedev. Priorities are way out of wack when you have time to put in binary obfuscation, but no time to fix such a huge performance bottleneck. The idea that “it’s a miracle software works at all” demonstrates the chronic prioritization and project management competence problem in the industry.
It’s ok to recognize a thing as a business success but a technical failure. In fact many software projects are business successes despite awful and unforgivable quality compromises. You don’t get to whitewash it just because the thing prints money.
How do we then address chronic incompetence? Never complain about it?
This is not small. This kind of incompetency if employed in a different sector such as security would lead to losing personal data of millions.
> “it’s a miracle software works at all”
This is not the case here. Please re-evaluate your calibration on this topic.
2 replies →
If loading times were prioritized, features would be cut. Which features would you cut out of the game in order to have fast loading times?
This is what you'd need to decide. And then afterwards, it might not print as much money as you think it will.
It's easy looking at it from the outside. Not so easy from the inside.
12 replies →
GTA-5 broke even within 48 hours of its release. Nearly a decade later, it still costs $60 for a digital copy with (practically) zero distribution costs. It has made over $6Bn in revenue, and is said to be the most profitable entertainment product of all time.
How much would it have cost to fix this issue?
Is anyone saying that it is a game developers fault? I mean, what is that you think would prevent a game developer from fixing this?
Because I think, anyone even vaguely familiar with the software industry in general is going to come up with answers like:
1. It would not cost very much.
2. No, it isn't a developer's fault, because it's clear that even an intern could fix this.
3. Management isn't interested, or is too disorganized, or focussed on cynical extraction of every last bit of profit.
And from that perspective, it certainly does make it seem like a cynical cash cow.
I don't know many game developers, but I do know people in other parts of the software industry and professionals in general. And I think that they keep to themselves because they have first hand experience of how the industry works and understand it better than anyone. They probably sympathise with the right of the public to feel ripped off.
That said, I still paid for the game, I think it's fun. Apparently there is "no alternative" to this state of affairs.
> GTA-5 broke even within 48 hours of it's release. Nearly a decade later, it still costs $60 for a digital copy with (practically) zero distribution costs
Well, that's the nominal full retail price against which the various discounts are measured, sure, but I doubt that's what most people buying it these days pay except if they are getting something else with it. I'm pretty sure it's in the stack of things I've gotten free this year on Epic that's in my “I might check it out sometime” queue, it's $29.98 right now from Humble Bundle, etc.
> Nearly a decade later, it still costs $60 for a digital copy
It's actually only $30 and frequently goes on sale for $15. It hasn't been $60 (on Steam at least) since June 2018.
I don’t think the OP was specifically calling out any game devs. Any engineer who has worked on any software projects knows that you usually can only do as well as your deadlines and managements priorities allow for.. Unless the stars line up and you have the resources and autonomy to fix the issue yourself, and love working for free on your own time.
Unfortunately, they were calling out gamedevs:
> Tweaking two functions to go from a load time of 6 minutes to less than two minutes is something any developer worth their salt should be able to do in a codebase like this equipped with a good profiler.
But, I fully agree with your assessment, for what it's worth.
1 reply →
I did mean to reply to you (72 days ago, plus in another thread); I just can't make a Twitter account. I've tried, and it doesn't like my IP or email addresses or something, not sure.
Hmm, that's unfortunate. Well, I hope you find a way to get on Twitter. Your thoughts are always welcome, and there are a lot of interesting people in the ML scene there.