
Comment by pc2g4d

9 months ago

When I set out to learn Rust about a decade ago, I chose to write a game - a clone of "Empire" that I call Umpire.

It's a different task to re-implement an already-designed game than to design and implement at the same time. Nevertheless I have run into a number of the difficulties mentioned in the article and arrived at my own solutions - foremost, passing around global UUIDs rather than actual `&` references, and enforcing existence constraints at runtime.
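A minimal sketch of that pattern (the types and method names here are illustrative, not Umpire's actual API): entities live in a central map keyed by UUID, callers hold only the IDs, and existence is checked at runtime rather than guaranteed by the borrow checker.

```rust
use std::collections::HashMap;

use uuid::Uuid; // `uuid` crate with the "v4" feature enabled

struct Unit {
    hp: u32,
}

#[derive(Default)]
struct Game {
    units: HashMap<Uuid, Unit>,
}

impl Game {
    fn spawn_unit(&mut self) -> Uuid {
        let id = Uuid::new_v4();
        self.units.insert(id, Unit { hp: 10 });
        id
    }

    // Callers pass the ID around instead of an `&mut Unit`, so the
    // "does this unit still exist?" question is answered at runtime.
    fn damage(&mut self, id: Uuid, amount: u32) -> Result<(), String> {
        let unit = self
            .units
            .get_mut(&id)
            .ok_or_else(|| format!("unit {id} no longer exists"))?;
        unit.hp = unit.hp.saturating_sub(amount);
        Ok(())
    }
}

fn main() {
    let mut game = Game::default();
    let id = game.spawn_unit();
    game.damage(id, 3).expect("unit was just spawned");
}
```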

I've experienced the protracted pain of major refactors when assumptions baked into my data model proved false.

In some regards these refactors wore some of the shine off Rust for me as well. BUT I'm still glad the game is implemented in Rust, precisely because of Rust's dual emphasis on safety and performance.

The AI I'm developing requires generation of massive quantities of self-play data. That the engine is as fast as it is helps greatly.

Rust's strength in ML means my AI training and game code can share important types, ensuring consistency.

Rust's effectiveness for writing CLI tools (mentioned in the article) has also let me build a number of high-quality, game-specific command-line interfaces.

Rust's memory safety became critical once I decided to network the game. I don't want `umpired` to be any more exploitable than it needs to be.

My constraints have been very different from the OP's; given their experience, it obviously makes sense for their studio to move away from Rust. But I think Rust still has a place in games.

* https://en.wikipedia.org/wiki/Empire:_Wargame_of_the_Century
* https://github.com/joshhansen/Umpire

> Rust's strength in ML

Most of the ML frameworks I know of are implemented in Python and C++. I tried looking at ML in Rust a few years ago and didn't find anything useful. Has that changed?

  • You can use libtorch directly via `tch-rs`, and at present I'm porting over to Burn (see https://burn.dev), which looks incredibly promising. My impression is that it's in a good place, though of course not close to the Python/C++ ecosystem. At the very least, I've gotten my NN models training and running without too much difficulty. (I'm moving to Burn for the thread safety: their `Tensor` impl is `Sync`, whereas libtorch offers no such guarantee - see the sketch after this reply.)

    Burn has Candle as one of its backends, which I understand is also quite popular.
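To illustrate why that `Sync` bound matters for self-play (a generic sketch with placeholder types, not Burn's or tch-rs's actual API): parallel self-play workers can only share a model if everything inside it is `Send + Sync`, so a non-`Sync` tensor type would make the equivalent code fail to compile.

```rust
use std::sync::Arc;
use std::thread;

// Stand-in for a model whose parameters are held in thread-safe tensors;
// in the real project these would be Burn (or tch-rs) types.
struct Model {
    weights: Vec<f32>,
}

impl Model {
    fn evaluate(&self, observation: &[f32]) -> f32 {
        self.weights
            .iter()
            .zip(observation)
            .map(|(w, x)| w * x)
            .sum()
    }
}

// Sharing one model across workers requires `Model: Send + Sync`,
// which in turn requires the tensor type inside it to be `Sync`.
fn run_self_play(model: Arc<Model>, num_workers: usize) {
    let handles: Vec<_> = (0..num_workers)
        .map(|i| {
            let model = Arc::clone(&model);
            thread::spawn(move || {
                let observation = vec![i as f32; model.weights.len()];
                model.evaluate(&observation)
            })
        })
        .collect();

    for handle in handles {
        let value = handle.join().expect("self-play worker panicked");
        println!("self-play value: {value}");
    }
}

fn main() {
    let model = Arc::new(Model {
        weights: vec![0.1, 0.2, 0.3],
    });
    run_self_play(model, 4);
}
```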