Comment by jeremiep

8 years ago

I have yet to hit any GC issues in a Unity project over 5 years of using it, even on larger-scale ones. But I'll agree I rarely write "idiomatic" C#; to be honest, I don't like many things about that style of programming in the first place.

A garbage collector is rarely the real problem; poor development practices and a failure to plan memory usage patterns into the architecture will cost you orders of magnitude in performance long before the GC itself does.

It's not even hard to keep zero GC allocations on ~99% of frames, and doing so will give you better performance than the best GC implementation running over bad allocation patterns ever could. This is also true of Unreal (UObjects are GC'd, as you mentioned) and of literally everything else that uses a GC; it's not a free pass to forget about memory management.
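For what it's worth, here's a minimal sketch of what "zero allocations per frame" tends to look like in Unity. The class name, buffer sizes, and query radius are invented for illustration, not taken from any particular project:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: keep the per-frame path allocation-free by reusing preallocated
// buffers instead of creating new collections inside Update().
public class ProximityScanner : MonoBehaviour
{
    // Allocated once, reused every frame, so Update() itself produces no garbage.
    private readonly List<Collider> hitsThisFrame = new List<Collider>(64);
    private readonly Collider[] overlapBuffer = new Collider[64];

    void Update()
    {
        hitsThisFrame.Clear(); // reuse, don't reallocate

        // The *NonAlloc physics queries write into a caller-supplied buffer
        // instead of returning a fresh array on every call.
        int count = Physics.OverlapSphereNonAlloc(transform.position, 5f, overlapBuffer);
        for (int i = 0; i < count; i++)
        {
            hitsThisFrame.Add(overlapBuffer[i]);
        }

        // ...process hitsThisFrame with plain loops: no LINQ, no closures,
        // no string building, so the frame allocates nothing on the managed heap.
    }
}
```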

That's one of the primary reasons I rarely use code assets from the Asset Store: most people don't know how to write performant C# at all. They get killed by a thousand cuts, but none of the cuts show up in the profiler, so they effectively waste 90% of the CPU without any warning signs. (This is true of almost every piece of software ever released.)

I'm not one for premature micro-optimizations, but on the other hand I don't see how you can get any kind of performance without accounting for macro-optimizations from the very beginning.

Blaming the GC for poor performance is pretty much the same as doing unbuffered byte-by-byte I/O and then wondering why it's 1000x slower than the competition, even after weeks of micro-optimizations.
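To make that analogy concrete, here is a hedged, non-Unity sketch of the two I/O patterns; the method names and buffer size are invented for the example:

```csharp
using System.IO;

static class IoAnalogy
{
    // The "unbuffered byte-by-byte" pattern: with a 1-byte internal buffer,
    // essentially every ReadByte() goes back to the OS.
    public static long SumBytesSlow(string path)
    {
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read,
                                           FileShare.Read, bufferSize: 1))
        {
            long sum = 0;
            int b;
            while ((b = stream.ReadByte()) != -1) sum += b;
            return sum;
        }
    }

    // Same result, read in large chunks; typically orders of magnitude faster,
    // and no amount of micro-optimizing the slow loop will close that gap.
    public static long SumBytesFast(string path)
    {
        using (var stream = File.OpenRead(path))
        {
            var buffer = new byte[64 * 1024];
            long sum = 0;
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                for (int i = 0; i < read; i++) sum += buffer[i];
            }
            return sum;
        }
    }
}
```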

> Workarounds exist. That's great. I don't care. Workarounds don't address the default experience of writing idiomatic C#.

I blame Unity's GC for what it is bad at: it's both stop-the-world and non-generational. As I said down-thread, it's a lot of work to design for avoiding allocations, and even then Unity will still GC from time to time; whether that matters depends on the characteristics of your game's managed heap as a whole. This is the major stumbling block most times people argue with me about how GC is fine: your data is your data, because your app is your app. It's not like my data. The next step is to not use the managed heap for anything that isn't a temporary. Again, quite a bit of work, and the beginners and intermediates who make up most of Unity's users aren't going to do it. Hell, in general I consider GC a non-starter for games, but sometimes I want to work at a company that uses C#. So I repeat:

> Workarounds exist. That's great. I don't care. Workarounds don't address the default experience of writing idiomatic C#.

  • Blaming the GC for poor memory performance isn't the issue.

    I still wouldn't call that a workaround, but rather properly using memory in the first place. You'll get GC pressure even on a concurrent generational GC if you constantly push garbage down its throat. You'll get slowdowns even with reference counting if you naively assume it to be faster than a GC and then blindly pass references all over the place, causing tons of increments/decrements on the refcount, trashing caches, and causing tons of branch mispredictions in the process.
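    On the first point, a hedged sketch of the kind of per-frame garbage that quietly adds up; ScoreHud, label, and scores are invented names for the example:

    ```csharp
    using System.Linq;
    using UnityEngine;
    using UnityEngine.UI;

    public class ScoreHud : MonoBehaviour
    {
        public Text label;
        public int[] scores;

        private int lastTotal = int.MinValue;

        // Wasteful: called every frame, this allocates every frame. Sum() goes
        // through IEnumerable<int> and allocates an enumerator, and the string
        // concatenation builds a brand new string each call.
        void UpdateWasteful()
        {
            label.text = "Score: " + scores.Sum();
        }

        // Allocation-aware: plain loop, and the string is only rebuilt when the
        // value actually changes, so the vast majority of frames allocate nothing.
        void Update()
        {
            int total = 0;
            for (int i = 0; i < scores.Length; i++) total += scores[i];

            if (total != lastTotal)
            {
                lastTotal = total;
                label.text = "Score: " + total;
            }
        }
    }
    ```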

    I've worked on a LOT of radically different games; I don't believe the specific game, genre, engine, or language changes how you manage memory at all. I don't even believe it's more work once you account for all the debugging and profiling time you save by the end of the project; a data-driven approach is almost always simpler. Heck, Unity is even moving toward a new Entity-Component-System that explicitly relies on the user managing their memory (and in batches, too!), with massive performance gains, and the same goes for the Job system currently in the 2018.1 beta. We're talking gains of orders of magnitude in performance, not the tiny ~10% gains you'd get profiling and micro-optimizing at the end of a project.
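    As a rough illustration of the batch-oriented style the Job system encourages (not actual project code; the job, array sizes, and batch count are made up): data lives in tightly packed NativeArrays and is processed in parallel batches, with no per-entity heap objects and no garbage on the hot path.

    ```csharp
    using Unity.Collections;
    using Unity.Jobs;
    using UnityEngine;

    public class VelocityIntegration : MonoBehaviour
    {
        struct IntegrateJob : IJobParallelFor
        {
            [ReadOnly] public NativeArray<Vector3> velocities;
            public NativeArray<Vector3> positions;
            public float deltaTime;

            public void Execute(int index)
            {
                positions[index] = positions[index] + velocities[index] * deltaTime;
            }
        }

        NativeArray<Vector3> positions;
        NativeArray<Vector3> velocities;

        void OnEnable()
        {
            positions = new NativeArray<Vector3>(10000, Allocator.Persistent);
            velocities = new NativeArray<Vector3>(10000, Allocator.Persistent);
        }

        void Update()
        {
            var job = new IntegrateJob
            {
                positions = positions,
                velocities = velocities,
                deltaTime = Time.deltaTime
            };
            // Schedule the work in batches of 64 indices per worker chunk.
            JobHandle handle = job.Schedule(positions.Length, 64);
            handle.Complete();
        }

        void OnDisable()
        {
            if (positions.IsCreated) positions.Dispose();
            if (velocities.IsCreated) velocities.Dispose();
        }
    }
    ```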

    Beginners and intermediates don't manage memory on large-scale Unity projects, and memory management isn't nearly as important on smaller ones; they can waste 80% of the hardware's resources and still hit 60 FPS, computers are that fast. So I don't think that's a valid point. And saying my data is different from your data may be true, but it has no impact on how memory management works whatsoever, just like you don't stop using SQL because a different app has different relations.

    It's almost impossible to get any kind of performance without accounting for memory allocations and layouts, no matter the engine or language you pick. From the very second you stop thinking about memory, you're bound to waste 90% of the hardware while being left to profile within the remaining 10%. I really don't get this "again, quite a bit of work" you mention; idiomatic C# is literally more work than data-driven C#, at a fraction of its performance and simplicity.
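    Here's a hedged side-by-side of the two styles being contrasted, with invented names; it's a sketch of the general pattern, not code from any real project:

    ```csharp
    using System.Collections.Generic;

    static class ParticleComparison
    {
        // "Idiomatic" OO style: each particle is its own heap object, reached
        // through a reference. Objects end up scattered in memory, every one is
        // something the GC has to track, and the update is a chain of pointer hops.
        class Particle
        {
            public float X, Y, VelX, VelY;
            public void Step(float dt) { X += VelX * dt; Y += VelY * dt; }
        }

        static void UpdateIdiomatic(List<Particle> particles, float dt)
        {
            foreach (var p in particles) p.Step(dt);
        }

        // Data-driven style: one contiguous struct array walked linearly. The GC
        // sees a single object, and the CPU gets predictable, cache-friendly access.
        struct ParticleData
        {
            public float X, Y, VelX, VelY;
        }

        static void UpdateDataDriven(ParticleData[] particles, float dt)
        {
            for (int i = 0; i < particles.Length; i++)
            {
                particles[i].X += particles[i].VelX * dt;
                particles[i].Y += particles[i].VelY * dt;
            }
        }
    }
    ```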

    I've seen death by a thousand cuts ("we enter crunch mode for months because the game is dog slow before shipping, because nobody cared about architecture for 10+ months") kill many game projects. They all had fancy idiomatic code and everything looked good in the profiler, yet the problems were systemic. It's pretty much being Agile without having any actual agility.

    So I repeat: blaming the GC for poor memory performance isn't the issue. I'll bet I can find tons of complexity issues long, long before I see GC issues in any codebase; that's the real issue. It's not that the GC is one huge slowdown, it's that the codebase is littered with tiny slowdowns all over the place, so tiny they're effectively invisible to the profiler, yet their sum is still greater than the GC's cost.

    • Dude, you are being extremely arrogant here, like what you've seen and what you know are universals. I've already violated my own policy of never arguing with other people about GC in games, so serves me right. Peace out.