
Comment by WhyNotHugo

3 months ago

> > The lower levels are buggy and have a lot of churn
> >
> > The stack I use is Rend3/Egui/Winit/Wgpu/Vulkan

The same is true if you try to make GUI applications in Rust. All the toolkits have lots of quirky bugs and broken features.

The barrier to contributing to toolkits is usually pretty high too: most of them focus on supporting a variety of open source and proprietary platforms. If you want to improve something that requires an API change, you need to understand the details of all the other platforms; you can't just make a change for a single one.

Ultimately, cross-platform toolkits always offer a lowest common denominator (or "the worst of all worlds"), so I think the Rust ecosystem's common focus on "make everything run everywhere" ends up being a burden for the ecosystem as a whole.

> > Back-references are difficult
> >
> > A owns B, and B can find A, is a frequently needed pattern, and one that's hard to do in Rust. It can be done with Rc and Arc, but it's a bit unwieldy to set up and adds run-time overhead.
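
For illustration, a minimal sketch of that pattern with Rc and a Weak back-reference (the Parent/Child types are made up for the example; a multithreaded version would use Arc plus a lock instead of Rc/RefCell):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// Parent owns its children; each child keeps a weak back-reference
// so it can find its parent without creating a reference cycle.
struct Parent {
    children: RefCell<Vec<Rc<Child>>>,
}

struct Child {
    parent: RefCell<Weak<Parent>>,
}

fn main() {
    let parent = Rc::new(Parent {
        children: RefCell::new(Vec::new()),
    });

    let child = Rc::new(Child {
        parent: RefCell::new(Rc::downgrade(&parent)),
    });
    parent.children.borrow_mut().push(Rc::clone(&child));

    // The child reaches its parent by upgrading the weak reference;
    // this returns None if the parent has already been dropped.
    if let Some(p) = child.parent.borrow().upgrade() {
        println!("child's parent has {} child(ren)", p.children.borrow().len());
    }
}
```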

When I code Rust, I'm always hesitant to use an Arc because it adds overhead. But if I then go and code in Python, Java or C#, pretty much all objects have the overhead of an Arc. It's just implicit, so we forget about it.

We really need to be more liberal in our usage of Arc and stop dismissing it as "it has overhead". Any higher-level language has the same overhead; it's just not declared explicitly.

Arc is a very slow and primitive tool compared to a GC. If you are writing Arc everywhere, you would probably have better performance switching to a JVM language, C#, or Go.

  • This is incorrect if you are using Rc exclusively for back references. Since the back-reference is weak, the reference count is only incremented once, when you create the datatype. The problem isn't that it's slow; it's that it consumes extra memory for bookkeeping (see the sketch after this list).

  • I warned that one extreme (being afraid to use Arc when it's necessary) is bad.

    I agree with you: the other extreme (using Arc everywhere) is also bad.

    There's a sweet middle spot of using it just when strictly necessary.
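
To make the counting behavior above concrete, here is a toy sketch (the Rc<String> is just a stand-in for some owned data): creating weak back-references never touches the strong count, and following one only bumps it for the lifetime of the temporary Rc.

```rust
use std::rc::{Rc, Weak};

fn main() {
    let owner = Rc::new(String::from("A"));

    // Creating weak back-references only touches the weak count;
    // the strong count stays at 1 no matter how many there are.
    let back_refs: Vec<Weak<String>> = (0..3).map(|_| Rc::downgrade(&owner)).collect();

    assert_eq!(Rc::strong_count(&owner), 1);
    assert_eq!(Rc::weak_count(&owner), 3);

    // Following a back-reference (upgrade) bumps the strong count
    // only while the temporary Rc is alive.
    if let Some(a) = back_refs[0].upgrade() {
        assert_eq!(Rc::strong_count(&a), 2);
    }
    assert_eq!(Rc::strong_count(&owner), 1);
}
```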

Objects are cheaper than Arc<T>; otherwise using a GC would suck a lot more than it does today. For certain data structures, like trees accessed concurrently, a GC is also a massive optimization.

Python also has incomparably worse performance than Java or C#, both of which can do many object-level optimizations and often optimize allocations away entirely.

The "if I then go and code in Python, Java or C#, pretty much all objects have the overhead of an Arc" is not accurate. Rust Arc involves atomic operation and its preformance can greatly degrade when the reference count is being mutated by many threads. See https://pkolaczk.github.io/server-slower-than-a-laptop/

Java, C# and Go don't use atomic reference counting and don't have such overhead.
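
For reference, a minimal sketch of the kind of contention that article measures (thread and iteration counts are arbitrary; this is not a rigorous benchmark, just the shape of the workload):

```rust
use std::sync::Arc;
use std::thread;
use std::time::Instant;

fn main() {
    let shared = Arc::new(vec![0u8; 1024]);
    let start = Instant::now();

    // Each clone/drop pair is an atomic increment/decrement on the same
    // counter; with many threads hammering it, the shared cache line
    // becomes a contention hotspot.
    let handles: Vec<_> = (0..8)
        .map(|_| {
            let shared = Arc::clone(&shared);
            thread::spawn(move || {
                for _ in 0..1_000_000 {
                    let tmp = Arc::clone(&shared);
                    drop(tmp);
                }
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
    println!("elapsed: {:?}", start.elapsed());
}
```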