Comment by tw04

5 years ago

You have just described why I laugh anytime someone complains that government is inefficient. ANY organization of sufficient size is "inefficient" because what a large organization optimizes for (for reasons I cannot explain) cannot align with what that organization's customers want optimized.

There's the added difference that governments also have to be far more procedural by virtue of how they are set up. Regardless of size, they are accountable and responsible to a far higher degree in the eyes of the population they represent, so there is a legitimate reason for them to be "slow".

In games the added reason to be slow is that game code is, by definition, some of the least mission-critical code one could find (it competes with 90% of the web code on the internet). Your Linux or Windows code might run a hospital's infrastructure or a rover on another planet. A game, on the other hand, can launch with bugs the size of your windshield and stay like that forever, as long as people still pay. And people will pay, because for many people games are not unlike a drug.

As such, most game development teams and coders are "trained" to cut every corner and skimp on every precaution. Beyond a very low baseline, those precautions simply aren't needed as far as the software is concerned.

Look at the number of bugs or cheats incredibly popular games like GTA or CoD have. These are billion-dollar-a-year franchises that leave all this crap on the table despite all the money they make. They have all the resources needed; it's a conscious call to proceed like this, to hire teams that will never be qualified enough to deliver a high-quality product and that will be encouraged to cut corners on top of that.

Source: a long time ago I worked for a major game developer in a senior management role (unrelated to the dev activity) and left after feeling the urge to facepalm for too long in every single senior management meeting.

  • Remember this story from a few days ago?

    https://randomascii.wordpress.com/2021/02/16/arranging-invis...

    I've seen plenty of interesting bugs; the best I found personally was a compiler that was outputting files one byte at a time (a rough sketch of why that hurts is below).

    Games are rife with these sorts of bugs, but the volume of released games vs other types of software makes any sort of comparison unfair.
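
    Not the actual compiler, just a hypothetical sketch of that failure mode (file names and sizes are made up): unbuffered output pays one syscall per byte, while the default buffered writer batches bytes in userspace, so the same loop runs orders of magnitude faster.

        # Hypothetical sketch: why writing output one byte at a time is slow.
        # buffering=0 issues one write() syscall per byte; the default
        # buffered file object batches bytes in userspace and flushes rarely.
        import os
        import time

        data = b"x" * 100_000  # stand-in for a compiler's output file

        start = time.perf_counter()
        with open("unbuffered.bin", "wb", buffering=0) as f:
            for b in data:
                f.write(bytes([b]))  # one syscall per byte
        slow = time.perf_counter() - start

        start = time.perf_counter()
        with open("buffered.bin", "wb") as f:  # default userspace buffer
            for b in data:
                f.write(bytes([b]))  # usually just a memcpy into the buffer
        fast = time.perf_counter() - start

        print(f"unbuffered: {slow:.3f}s, buffered: {fast:.3f}s")
        os.remove("unbuffered.bin")
        os.remove("buffered.bin")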

    • > but the volume of released games vs other types of software makes any sort of comparison unfair.

      No comparison is entirely fair, but I think your objection is unfounded. The quantity of games being released is irrelevant when we're talking about the quality of their code, which is really bad for most big titles.

      If anything, the comparison was unfair in the other direction. Sure, an OS or a browser has a lot of bugs. But if games and their code had a fraction of the scrutiny something like Windows gets, you might just find that 4 lines of code were written "by the book". It's something any honest game dev will confirm: game code is a stack of spaghetti code on top of more spaghetti code. The philosophy is that you can just go ahead with bad code because you can always fix it with a patch later on. Then you notice there's no widespread pushback from gamers (because unless the game is an absolute bomb, there won't be) and move on to the next round of "features"; nobody has time for fixing bugs or combing the spaghetti.

      One other problem is that some people coming from the gaming industry will eventually switch to other types of software development but will stick to the philosophy. I've done a lot of hiring over my career, and of one thing I'm certain: whenever someone came in with most of their career in game development, or most of their recent experience there, I asked for the CV to be put at the bottom of the stack. It was a lesson I learned the hard way.

      Issues like the low criticality of game code, the "crunch" work style, the idea that you should just get it out there as quickly as possible, the lack of serious scrutiny of the matter, etc., all compound each other to create a coding (and coordination) style that's hard to shake off.


> for reasons I cannot explain

Any sufficiently large institution will, over time, prioritise self-preservation over achieving its core mission. This is sociology 101. Once a company has enough users to make it hard or impossible to measure immediate performance, self-preservation is achieved through internal manoeuvring and selling to execs.

  • What's the remedy, apart from reducing the size of the organisation?

    • It's not a perfect remedy, but you have to loop in the people affected by decisions as part of the decision-making structure. That is, for example, customers and workers have to be part of the management structure.

      This doesn't happen because it would reduce the power of top decision makers and potentially impact profits. E.g. a customer might ask for a chronologically ordered timeline on Facebook, but that would hurt engagement metrics, revenue, etc. If stuff like this did happen more often, though, you'd get products and services that more often achieve their stated aims.


    • Even in engineering you see that adding more links to a chain increases latency and demands more auxiliary circuitry. But at least in engineering you can design each part to do what you want it to do close to perfection, and it will scale better because you can build everything to spec and bin it, which is why we automate a lot of tasks. With humans that can't happen; human "binning" is a constantly moving target.

      After working tangentially on this for a long time, I'd say the core issue is so deeply ingrained in the human psyche that it may not even be a matter of education (which starts early), let alone organization (which starts happening when everything else is "set in stone"). There's no organizational structure that fits the types of activities we humans tend to do these days and that can also deliver low-latency, consistent results at scale. We mitigate issues in one area by accentuating them in others.

      You can have one large, flat structure, but the load on the single coordinating circuit (the manager) will compromise quality. You can split the thing into multiple, individually coordinated units, but the added layer of coordination and latency will compromise performance.

      Maybe some form of general-purpose but not-quite-full AI, something that combines human-like intelligence with engineering-like consistency, might be able to do what humans are supposed to do, but without the variability of humans (which is both good and bad).


    • There was a fantastic discussion some years ago on ways to design an organization to minimize the tendency to drift towards self-preservation instead of remaining customer-focused.

      The HN discussion[1] was started by an article that provided numbers that seemed to suggest that Wikipedia's spending was slowly spiraling out of control.

      1: https://news.ycombinator.com/item?id=14287235


    • Many people would say "more accountability", but I've seen that used successfully to deflect lightning strikes onto innocent people who were then fired, so... I'd like to know as well.

    • Competition/choice, which means that self-preservation requires that they care about efficiency. Obviously that wasn't enough here, but it definitely tames some inefficiencies.

Organizations are like spheres. Only a small part of the sphere has an exposed surface in contact with the outside world. As you grow the sphere, most of the mass ends up deep inside it, not near the surface (a rough numeric version of this is sketched below).
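
A back-of-the-envelope illustration of that analogy (my numbers, not the parent's): hold the thickness of the "exposed" outer shell fixed and grow the radius, and the share of the volume near the surface collapses.

    # Rough sketch of the sphere analogy: fraction of a sphere's volume
    # lying within a fixed-thickness outer shell, as the sphere grows.
    t = 1.0  # shell thickness: the part in contact with the outside world
    for r in (2.0, 10.0, 100.0):
        fraction = 1 - ((r - t) / r) ** 3  # shell volume / total volume
        print(f"r = {r:>5}: {fraction:6.1%} of the mass is near the surface")

For r = 2 almost 90% of the mass is in the shell; by r = 100 it's about 3%, which is the point the analogy is making about growth.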

And I laugh every time people claim that it is only governments that can be inefficient. Most large commercial companies are inefficient and barely functioning at all.