Comment by psyc
8 years ago
As a full-time Unity dev, this doesn't matter much to me, because it doesn't address the major pain point for larger Unity games. This .NET upgrade does not include a modern garbage collector, and the garbage collector update has been "maybe next year, but definitely not now, because it's hard" for as long as I've been using Unity.
Workarounds exist. That's great. I don't care. Workarounds don't address the default experience of writing idiomatic C#.
I've considered switching to Unreal, but GC is baked into it a little too deeply for my comfort. I know nothing about its characteristics, but it makes me wary. This thread makes me want to investigate Godot. Can anyone provide a one-sentence summary of the GC situation there?
I have yet to hit any GC issues in a Unity project over 5 years of using it, even on larger-scale ones. But I would agree I rarely write "idiomatic" C#; to be honest, I don't like many things about that style of programming in the first place.
A garbage collector is rarely the issue; poor development practices and not planning memory usage patterns in the architecture will cost you orders of magnitude of performance long before the GC itself is an issue.
It's not even hard to have zero GC allocations on ~99% of frames. This will give you better performance than the best GC implementation running over bad allocation patterns ever could. This is also true of Unreal (UObjects are GC'd, as you mentioned) and literally everything else using a GC; it's not a free pass to forget about memory management.
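For the curious, here's roughly what "zero allocations on ~99% of frames" looks like in practice: a minimal pool sketch (all names made up) where the only managed allocations happen once, up front:

```csharp
using System.Collections.Generic;

// Hypothetical bullet pool: every Bullet is allocated at load time,
// so steady-state frames spawn and despawn with zero managed allocations.
class Bullet
{
    public float X, Y;
    public bool Active;
}

class BulletPool
{
    readonly Stack<Bullet> free = new Stack<Bullet>();

    public BulletPool(int capacity)
    {
        for (int i = 0; i < capacity; i++)
            free.Push(new Bullet());        // all GC pressure happens here, once
    }

    public Bullet Spawn(float x, float y)
    {
        // Fallback allocation only if the pool was sized too small.
        Bullet b = free.Count > 0 ? free.Pop() : new Bullet();
        b.X = x; b.Y = y; b.Active = true;
        return b;
    }

    public void Despawn(Bullet b)
    {
        b.Active = false;
        free.Push(b);                       // recycle instead of leaving garbage for the GC
    }
}
```

Despawned objects go straight back into the stack, so a frame that spawns and despawns within the pool's capacity never touches the allocator at all.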
That's one of the primary reasons why I rarely use code assets from the Asset Store: most people don't know how to write performant C# whatsoever. They die by a thousand cuts, but none of the cuts show up in the profiler, effectively wasting 90% of the CPU without any warning signs. (This is true of almost every piece of software ever released.)
I'm not one for premature micro-optimizations, but on the other hand I don't see how you can get any kind of performance without accounting for macro-optimizations from the very beginning.
Blaming the GC for poor performance is pretty much the same as doing unbuffered byte-by-byte I/O and then wondering why it's 1000x slower than the competition even after weeks of micro-optimizations.
> Workarounds exist. That's great. I don't care. Workarounds don't address the default experience of writing idiomatic C#.
I blame Unity's GC for what it is bad at: it's both stop-the-world and non-generational. As I said down-thread, it's a lot of work to design for avoiding allocations. Even then, Unity will still GC from time to time, and whether that matters depends on the characteristics of your game's managed heap as a whole. This is the major stumbling block most times people argue with me about how GC is fine. Your data is your data, because your app is your app. It's not like my data. The next step is to not use the managed heap for anything that isn't a temporary. Again, quite a bit of work. And the beginners and intermediates who make up most of Unity's users aren't going to do it. Hell, in general I consider GC a non-starter for games, but sometimes I want to work at a company that uses C#. So I repeat:
> Workarounds exist. That's great. I don't care. Workarounds don't address the default experience of writing idiomatic C#.
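To make the "quite a bit of work" concrete, here's a sketch (hypothetical names, trivial stand-in logic) of the kind of rework it means: replacing a method that returns a fresh list every frame with one that refills a caller-owned buffer:

```csharp
using System.Collections.Generic;

static class Queries
{
    // "Idiomatic": returns a fresh List on every call — steady garbage each frame.
    public static List<int> VisibleIdiomatic(int[] ids)
    {
        var result = new List<int>();
        foreach (int id in ids)
            if (id % 2 == 0) result.Add(id);   // stand-in for a real visibility test
        return result;
    }

    // Reworked: fills a caller-owned buffer. Clear() keeps the capacity,
    // so after warm-up this path allocates nothing at all.
    public static void VisibleNonAlloc(int[] ids, List<int> buffer)
    {
        buffer.Clear();
        foreach (int id in ids)
            if (id % 2 == 0) buffer.Add(id);
    }
}
```

Multiply this pattern across every per-frame query in a codebase and you get a sense of the work involved, and of why most people stick with the idiomatic version.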
Blaming the GC for poor memory performance isn't the issue.
I still wouldn't call that a workaround, but rather properly using memory in the first place; you'll get GC pressure even on a concurrent generational GC if you constantly shove garbage down its throat. You'll get slowdowns even with reference counting if you naively assume it to be faster than a GC and then blindly pass references all over the place, causing tons of increments/decrements on the refcount, trashing caches in the process and causing tons of branch mispredictions.
I've worked on a LOT of radically different games; I don't believe the specific game, genre, engine or language changes how you manage memory, at all. I don't even believe it's more work when you account for all the debugging and profiling time you save by the end of the project; a data-driven approach is almost always simpler. Heck, Unity is even moving towards a new Entity-Component-System that explicitly relies on the user managing their memory (and in batches, too!), with massive performance gains; same for the Job System currently in the 2018.1 beta. We're talking gains of orders of magnitude in performance, not the tiny ~10% gains you'd get profiling and micro-optimizing at the end of a project.
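As a rough illustration of what that data-driven style buys you (hypothetical names, not Unity's actual ECS API): components live in a contiguous array of structs, and a batch update is one tight, allocation-free loop over linear memory:

```csharp
// Value type: elements live inline in the array, with no per-object
// GC header and no pointer chasing between them.
struct Particle
{
    public float X, Y;
    public float VelX, VelY;
}

static class ParticleStep
{
    // One linear pass over contiguous memory: cache-friendly, no
    // allocations, no virtual calls — the shape ECS-style batching relies on.
    public static void Step(Particle[] particles, float dt)
    {
        for (int i = 0; i < particles.Length; i++)
        {
            particles[i].X += particles[i].VelX * dt;
            particles[i].Y += particles[i].VelY * dt;
        }
    }
}
```

Contrast this with an array of class references, where each element is a separate heap object: the same loop would hop all over the heap and feed the GC a graph to trace.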
Beginners and intermediates don't manage memory on large-scale Unity projects, and memory management isn't nearly as important on smaller ones; they can waste 80% of hardware resources and still hit 60 FPS, computers are that fast. So I don't think that's a valid point. Also, saying my data is different from your data may be true, but that has no impact on how memory management works whatsoever. Just like you don't stop using SQL because a different app has different relations.
It's almost impossible to get any kind of performance without accounting for memory allocations and layouts, no matter the engine/language you pick. You're bound to waste 90% of the hardware while being left profiling within the remaining 10% from the very second you stop thinking about memory. I really don't get this "again, quite a bit of work" you mention; idiomatic C# is literally more work than data-driven C# at a fraction of its performance and simplicity.
I've seen this death by a thousand cuts ("we enter crunch mode for months before shipping because the game is dog slow, because nobody cared about architecture for 10+ months") kill many game projects; they all had fancy idiomatic code, and everything looked good in the profiler, yet the problems were systemic. It's pretty much being Agile without having any actual agility.
So I repeat: blaming the GC for poor memory performance isn't the issue. I'll bet I find tons of complexity issues long, long before I see GC issues in any codebase; that's the real issue. It's not that the GC is one huge slowdown; it's that the codebase is littered with tiny slowdowns all over the place, so tiny they're effectively invisible to the profiler, yet their sum is still greater than that of the GC.
Sure!
Hopefully a more experienced Godot dev will jump in and correct me if I'm wrong here.
Godot uses reference counting, but additionally supports a function which you can call on an object which will cause it to become 'unmanaged' by the reference counter, allowing you to free its memory on your own terms (or leak the memory, if you're not careful).
Godot 3.0 added support for Mono, but I'm not sure how that works. I'm guessing that game objects in Godot are still reference counted, but now you also have memory allocated by the Mono runtime which will be garbage collected.
Ironically, whatever care reference counting demands, it couldn't approach even one tenth of the care and attention I'm required to devote to memory management in Unity (and, FWIW, also in actual Windows .NET) to avoid frequent dropped frames. This is exactly what I was hoping to hear, thanks!
Yep, I abandoned Unity after encountering the same issues as you. My experience is that every Unity GC run takes 10+ ms on my machine, even when there is no memory to clean up.
I tried MonoGame once and the results were much better: garbage collection was fast when there wasn't much to do. However, I'd still choose a non-garbage-collected engine any day, because I dislike lag spikes.
While I hear generally about garbage collection causing dropped frames (I've had this happen to me in HTML5 land), I haven't heard about it as much with Unity; the word was that C# was fast enough, and the under-the-hood pieces are C++ anyway. Perhaps I'm reading the wrong channels :). I'm curious whether your game project is 3D, and what sort of intensity you tend to have with it.
From: http://docs.godotengine.org/en/3.0/getting_started/scripting...
> Memory management
>
> If a class inherits from Reference, then instances will be freed when no longer in use. No garbage collector exists, just simple reference counting. By default, all classes that don't define inheritance extend Reference. If this is not desired, then a class must inherit Object manually and must call instance.free(). To avoid reference cycles that can't be freed, a weakref function is provided for creating weak references.
Reference counting is a form of garbage collection. And depending on your usage patterns it might not even be the fastest one.
Lots of increments/decrements on the refcount interleaved in normal code can kill the gains over a traditional GC that has nearly free allocation, batched finalizations and doesn't pollute the instruction stream with increments/decrements.
Also with a traditional GC you pay nothing if you don't allocate memory; the collector will never run. You still pay the full price of reference counting no matter if you're done allocating or not.
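A toy sketch of the bookkeeping being described (hypothetical, not how any real engine implements it): every copy and drop of a reference touches a shared counter, and that cost is paid continuously, whether or not you're still allocating:

```csharp
using System.Threading;

// Minimal manual refcount. Retain/Release are the operations whose
// interlocked traffic the comment is pointing at: they run on every
// reference copy and destruction, not just at allocation time.
class RefCounted
{
    int count = 1;                          // the creating reference
    public int Count => count;
    public bool Freed { get; private set; }

    public RefCounted Retain()
    {
        Interlocked.Increment(ref count);   // paid on every new reference
        return this;
    }

    public void Release()
    {
        if (Interlocked.Decrement(ref count) == 0)
            Freed = true;                   // deterministic free point, but the counting never stops
    }
}
```

A tracing GC replaces all of those interlocked operations with occasional heap scans; which trade wins depends entirely on how many references get copied versus how much garbage gets created.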
Unreal doesn't have GC as far as I know. Everything is built on C++ objects. There's some reference counting here and there, but I've not yet run into problems with it.
https://docs.unrealengine.com/en-us/Programming/UnrealArchit...
I know very little about it, but in general there's a lot more information in the docs about the Garbage Collector than there is about how to not use it.
TIL (thanks for the correction!) I guess they do this to break reference loops. The difference I guess is that it only applies to UObject derived things, which are higher level objects, rather than across your entire codebase. I’ve never seen GC take up processor time, but our project doesn’t have a lot of runtime churn of objects.
Unreal does use GC, and it is the stop-the-world variety, yet unlike Unity, the GC has never even shown up for us as a problem. I can't tell you why or how, but Unreal just handles GC better than Unity. I'm speaking from my experience of shipping a commercial game on console with Unreal. Maybe things are worse on mobile.