Comment by AndrewThrowaway

4 years ago

A bit off topic, but has anybody followed the performance issues of Microsoft Flight Simulator 2020? For more than half a year it struggled with performance because it was CPU-heavy, only loading one core, etc. It barely ran on my i5 6500. Fast forward half a year: they want to release it on Xbox, MS/Asobo move a lot of computation to the GPU, and the game starts running smoothly on the very same i5 with maximized quality settings.

You just begin to wonder how these things happen. You would think top programmers work at these companies. Why would they not start with the right approach, loading the GPU first, etc.? Why did it take them so much time to finally do it correctly? Why waste time by not doing it from the beginning?

It's a pretty straightforward case of prioritization. There are always more things to do on a game project than you have people and time to do.

The game runs well enough, so the people who could optimize things by rewriting them from CPU to GPU are doing other things instead. Later, performance becomes a noticeable problem for the dev team, through customer feedback and the need to ship in more resource-constrained environments (VR and Xbox), and those people can then do the work to improve performance.

It's also handy to have a reference CPU implementation, both to get your head around a problem and because debugging on the GPU is extremely painful.

To go further down the rabbit hole, it could be that they were resource-constrained on the GPU and couldn't shift work there until other optimizations had been made. And so on with all the dependencies involved in getting a piece of work done on a complex project.

  • Makes sense, and then it kinda agrees with the parent comment that "Microsoft developers simply didn't have performance in the vocabulary".

    Yes, there is no doubt that "there are always more things to do on a game project than you have people and time to do". But somehow there was time to first build a "main thread on a single core" monster and then redo it according to game development 101: make use of the GPU.

    It is no joke: the GPU was barely loaded while the CPU was choking. In a modern game released by a top software company, proudly presenting its branding in the very title.

    • > But somehow there was time to first build a "main thread on a single core" monster and then redo it according to game development 101: make use of the GPU.

      The single-threaded code was taken from Microsoft Flight Simulator X, the previous version of this game from 2006. It was not written from scratch for the new game, and it still hasn't been replaced; they've just optimized parts of it now.

      Another important performance bottleneck is the fact that the game's UI is basically Electron. That's right, they're running a full-blown browser on top of the rest of the graphics for the UI. Browsers are notoriously slow because they support a million edge cases that aren't always needed in a game UI.

      For anyone interested in learning more about Microsoft Flight Simulator 2020 optimizations I can recommend checking out the Digital Foundry interview with the game's technical director. https://www.youtube.com/watch?v=32abllGf77g

    • It was actually made by a third-party game development studio rather than Microsoft.

      Also, the assumption that the culture of the Windows Terminal team is the same as that of the team building a flight simulator is a bit far-fetched. Large organisations typically have very specific local cultures.

      Rewriting stuff from CPU to GPU also isn't 101 game development, both because it's actually quite hard and because not all problems are amenable to GPU compute; moving some of them would actually slow things down.

    • I work in game development, mainly on graphics programming and performance. There are always a number of things I know would speed various systems up but that I don't have time to implement, because there are so many bigger fires to put out.

      Also, I have rewritten CPU implementations to run on the GPU before, and it's often nontrivial, sometimes even in seemingly straightforward cases. The architectures are very different, you only gain speed if you can properly parallelize the computations, and a lot of things that are trivial on the CPU become a real hassle on the GPU.
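
      To make the "properly parallelize" point concrete, here's a toy Python/numpy sketch (mine, not anything from MSFS; numpy on the CPU just stands in for wide parallel hardware): an element-wise update maps cleanly onto thousands of parallel lanes, while a recurrence where each step needs the previous result gains nothing from more lanes unless you restructure the algorithm.

        import numpy as np

        x = np.random.rand(1_000_000)

        # Data-parallel: every element is independent, so this kind of work
        # maps well onto a GPU.
        y = np.sqrt(x) * 2.0 + 1.0

        # Serially dependent: each step needs the previous result, so extra
        # parallel lanes buy nothing without restructuring (e.g. into a scan).
        acc = np.empty_like(x)
        acc[0] = x[0]
        for i in range(1, len(x)):
            acc[i] = acc[i - 1] * 0.99 + x[i]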

It may sound oversimplified, but IME PC games are only optimized to the point where they run well on the development team's beefy PCs (or, in the best case, on some artificial 'minimal requirements PC' with graphics details turned to low, but this minimal-requirements setup usually isn't taken very seriously by the development team).

When porting to a game console you can't simply define some random 'minimal requirement' hardware, because the game console is that hardware. So you start looking for more optimization opportunities to make the game run smoothly on the new target hardware, and some of those optimizations may also make the PC version better.

  • I'd like to second this and add that this is partly why I feel it's important to develop games directly for all your target platforms rather than target one or two consoles and then port the game.

Because a rule of thumb is to not focus too much on performance at the beginning of a project. Better a completed project with some performance issues than a half-finished product with hyper speed. The key thing in development is to find some kind of balance between all these attributes (stability, performance, look, reusability, etc.). In the case of the flight simulator, I'm not sure what the motives were; they surely had some serious time constraints. I think they did an acceptable job there.

  • Agreed. As someone who spends most of their time on performance-related issues, it's important to keep in mind that performance issues are sometimes strongly tied to the architecture of the program. If the data structures aren't set up to take advantage of the hardware, you'll never get the results you're hoping for. And setting up data structures is often something that needs to be thought about at the beginning of the project.
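
    A minimal illustration of what "set up to take advantage of the hardware" can mean in practice (hypothetical Python/numpy, not taken from any of the games discussed): the same position update written over an array of objects versus a struct-of-arrays layout. Switching from the first to the second late in a project touches every system that reads that data, which is why it's hard to retrofit.

      import numpy as np

      N = 100_000

      # Array-of-objects: convenient, but each record lives somewhere else on
      # the heap, so the hot loop is cache-unfriendly and hard to vectorize.
      entities = [{"x": 0.0, "y": 0.0, "vx": 1.0, "vy": 0.5} for _ in range(N)]
      for e in entities:
          e["x"] += e["vx"] * 0.016
          e["y"] += e["vy"] * 0.016

      # Struct-of-arrays: each field is contiguous in memory, and the update
      # becomes one vectorized operation per field.
      xs, ys = np.zeros(N), np.zeros(N)
      vxs, vys = np.ones(N), np.full(N, 0.5)
      xs += vxs * 0.016
      ys += vys * 0.016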

  • Completely agree on the rule of thumb, and I don't doubt they had their motives. It's in no way that simple.

    Then again isn't it like 101 of game development?

    Imagine releasing a game that looks stunning, and the industry agrees it pushes the limits of a modern gaming PC (hence runs poorly on old machines). Fast forward some time: "oh, BTW, we did it incorrectly (we had our motives); now you can run it on an old machine just fine, no need to buy an i9 or anything".

    • Because it's never that obvious; moving everything to the GPU isn't really gamedev 101.

      You know your target specs, but where the bottlenecks are and how to solve them will keep shifting. At some point the artists might push your lighting, or maybe it's physics now, maybe it's IO, or maybe networking. Which parts do you move to the GPU?

      Also, a GPU is not a magic bullet. It's great for parallel computation, but not all problems can be solved that way. It's also painful to move memory between the CPU and the GPU, and GPU memory is a limited resource; you can't have everything there.

I'll reiterate a rant about Flight Simulator 2020 here because it's on-topic.

It was called "download simulator" by some, because even the initial installation phase was poorly optimised.

But merely calling it "poorly optimised" isn't really sufficient to get the point across.

It wasn't "poor" or "suboptimal". It was literally as bad as possible without deliberately trying to add useless slowdown code.

The best equivalent I can use is Trivial FTP (TFTP). It's used only in the most resource-constrained environments, where even buffering a few kilobytes is out of the question: embedded microcontrollers in NIC boot ROMs, that kind of thing. It's literally maximally slow. It ping-pongs for every block, and it uses small blocks by default. If you change anything about a network protocol at all, it's a step up from TFTP. (Just adding a few packets' worth of buffering and windowing dramatically speeds it up, and this enhancement thankfully did make it into a recent update of the standard.)
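
A back-of-the-envelope sketch in Python of why stop-and-wait is so slow (the round-trip time here is an assumed, illustrative number, not a measurement of TFTP or of the MSFS installer): with one block in flight at a time, throughput is capped by RTT, not by link bandwidth.

    block = 512                  # bytes, the classic TFTP default block size
    rtt = 0.05                   # seconds, an assumed WAN round trip

    # One block per round trip, no matter how fast the link is.
    print(block / rtt)           # ~10 KB/s, even on a 500 Mbps connection
    # Windowing (several blocks in flight) scales throughput with window size.
    print(16 * block / rtt)      # ~164 KB/s with a 16-block window
    print(16 * 65464 / rtt)      # ~21 MB/s with windowing plus max-size blocks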

People get bogged down in these discussions around which alternate solution is more optimal. They're typically arguing over which part of a Pareto frontier they think is most applicable. But TFTP and Microsoft FS2020 aren't on the Pareto frontier. They're in the exact corner of the diagram, where there is no curve. They're at a singularity: the maximally suboptimal point (0,0).

This line of thinking is similar to "Toward a Science of Morality" by the famous atheist Sam Harris. He starts with a definition of "maximum badness" and defines "good" as the direction away from it in the solution space. Theists and atheists don't necessarily have to agree on the specific high-dimensional solution vector, but they have to agree that there is an origin; otherwise no meaningful discussion is possible.

Microsoft Terminal wasn't at (0,0), but it was close. Doing hilariously trivial "optimisations" would let you move much further through the solution space towards the frontier.

The Microsoft Terminal developers (mistakenly) assumed that they were already at the Pareto frontier, and that the people who opened the GitHub issue were asking them to move the frontier. That does usually require research!

  • I thought it looked more like 'no one touched it at all after the functional part was finished'. It's insane that in 2021 someone still does the decompression and the downloading in the same thread, blocking the download while unzipping the resource. Even a 2000s bash programmer knows you'd better not do them strictly one after the other, or it will be slow...

    • It also downloads hundreds of thousands of tiny files over a high-latency connection.

      At least initially, it didn't use the Xbox CDN either.

The damn downloader in MSFS is the most infuriating thing. In Canada, on either of the main ISPs, I top out at 40-ish Mbps, whereas Steam and pretty much anything else gets close to the full 500 Mbps. It also only downloads sequentially, pausing to decrypt each tiny file. And the updates are huge, so it takes a good long while to download 2+ GB.
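
For what it's worth, overlapping the two stages is only a few lines of code. A toy Python sketch (fetch and unpack are stand-in stubs, nothing to do with the real MSFS installer): one thread keeps downloading while the main thread decompresses, so the network never sits idle during an unzip.

    import queue
    import threading
    import time

    def fetch(url):                    # stand-in for the real HTTP download
        time.sleep(0.1)
        return b"x" * 1024

    def unpack(blob):                  # stand-in for CPU-bound decrypt/unzip
        time.sleep(0.1)

    def download_worker(urls, q):
        for url in urls:
            q.put(fetch(url))
        q.put(None)                    # sentinel: no more files

    urls = ["https://example.invalid/pkg%d" % i for i in range(20)]
    q = queue.Queue(maxsize=8)         # bounded, so downloads can't run far ahead
    threading.Thread(target=download_worker, args=(urls, q)).start()
    while (blob := q.get()) is not None:
        unpack(blob)                   # overlaps with the next download in flight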

I've followed MSFS performance pretty closely since before FS5.

What's happening is this:

- the FS developers are using the fastest possible machines, beyond what non-overclockers can get

- the developers refused to prioritize the tile stuttering problem evident in all versions. Ignoring that for over 30 years is just plain wilful unless ...

- I'm pretty sure the DoD funded FS (hence the non-consumer license for Prepar3D), and I think there was a requirement for performance as a "consumer game" to not be too perfect, hence the tile stuttering.

This is even more likely since Jane's had full-page ads for military sims using the FS engine back in the day. IOW, FS was too killer for its own good.

(I actually did my Instrument rating practical preparation on FS5 and passed on the first attempt at a Class B airport. I don't think the DoD liked where the public FS game was headed, especially FS 2000.)