Comment by pmarreck

1 day ago

My takeaway, speaking as someone who leans towards functional programming and immutability, is "this is yet another example of a mutability problem that could never happen in a functional context"

(so, for example, this bug would never have been created in Rust unless the language was deeply misused)

This is more a problem with the C/C++ standard, which allows uninitialized variables but doesn't give them defined values, instead treating a read from an uninitialized variable as "undefined behavior". Java, for example, doesn't have this particular problem: fields get default values, and local variables must be definitely assigned before they can be read.

  • But it's this and many other features of C/C++ that make it faster than Java. C/C++ developers really don't want to "pay" a runtime cost for safety.

    Though, I really like the _mm_undefined_ps() intrinsic for SSE that makes it clear you're purposefully not initialising a variable. Something like that for ints and floats would be pretty sweet.

    • It is definitely not the case that safer is automatically slower. IMO, too often the attitude from WG21 (the C++ language committee) has been "Some fast things are unsafe, therefore if we make our language more unsafe it will go faster", which... that's not how implication works.

      As a very high level example, take sorting. Rust's standard library provides you both a stable and unstable sort, as does your C++ standard library.

      The C++ standard promises these sorts have O(n log n) performance. It's unclear in modern C++ whether providing a nonsensical ordering† is Undefined Behaviour (as it was in older versions) or outright IFNDR (ill-formed, no diagnostic required - much worse than UB), but the real-world effect will be similar either way.

      Rust promises that these sorts work as expected. If you provide a nonsensical ordering, obviously it can't very well "sort" things the way you asked, but it doesn't need to kill your neighbour's cats and wipe the hard disk either: it will either give you back the same elements in... some order, or it will report the fatal error in your software.

      The Rust option here is clearly much safer, right? So, how much performance is this costing? Actually, it's faster. So C++ is choosing slower and worse. What's the upside? (A rough sketch of the Rust behaviour follows below.)

      † For example, what if I insist that Red < Green, but also Green < Red; furthermore Red == Green is true, but so is Red != Green; yet neither Green == Red nor Green != Red is true!
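
      To make that concrete, here's the rough sketch (the values and the comparator are invented purely for illustration):

        use std::cmp::Ordering;
        
        fn main() {
            let mut v = vec![3, 1, 2];
            // Deliberately nonsensical ordering: everything is "less than" everything else.
            v.sort_unstable_by(|_, _| Ordering::Less);
            // Rust's guarantee: no UB. You get the same elements back in some
            // unspecified order, or a panic reporting the broken comparator.
            println!("{v:?}");
        }

      Either outcome is safe: the vector still holds exactly the elements you started with, and nothing outside it gets touched.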

    • Statically proving the variables get initialized wouldn't change the performance except by making sure you check the return value of sscanf, or turning refusal to check into a couple register wipes. Either way, that's a negligible increase to a hefty function call. It wouldn't require default initializing variables in all circumstances.

      When I think of the "no runtime cost" mentality of C/C++ I don't think that normally extends to ignoring errors in I/O functions.

    • And yet, there is a good chance that C++ will start doing exactly this [1]. Because [2]:

      > The performance impact is negligible (less than 0.5% regression) to slightly positive (that is, some code gets faster by up to 1%). The code size impact is negligible (smaller than 0.5%). Compile-time regressions are negligible. Were overheads to matter for particular coding patterns, compilers would be able to obviate most of them.

      > The only significant performance/code regressions are when code has very large automatic storage duration objects. We provide an attribute to opt-out of zero-initialization of objects of automatic storage duration. We then expect that programmer can audit their code for this attribute, and ensure that the unsafe subset of C++ is used in a safe manner.

      > This change was not possible 30 years ago because optimizations simply were not as good as they are today, and the costs were too high. The costs are now negligible.

      [1] https://github.com/cplusplus/papers/issues/1401

      [2] https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2023/p27...

I think the response to that would be: yes, but the game simply would not have been made if it wasn't written in C++. That's not to say you can't make something like GTA:SA in Rust in 2025, or that you couldn't have made it in a safer language in the early 2000s. It just would have taken a great deal more time and expense, as you'd have needed to construct a lot of tooling and do a lot of training to ensure all of the employees were up to speed before getting started. C++ was, and I think to some extent still is, the lingua franca of the gaming industry - there are some fun exceptions (Naughty Dog implementing much of Crash Bandicoot in a home-grown LISP, and presumably dozens or hundreds of DSLs and other little bespoke scripting languages in use at other studios).

And that's not to mention the uncomfortable truth that while doing this correctly in something like Rust may very well take less effort overall than in C++, that is not the bar we are aiming to clear. They wanted to implement something that was correct-enough, and given that this bug wasn't hit for 20+ years and that the game was a roaring success on all the major platforms - I think that was the right decision.

  • We don't have enough information to claim it was the "right decision", only that this choice did work - not that other choices couldn't have been better.

    In video games you can go back and try another option but life isn't like that and so we can only suppose what might have happened.

    • Well, what happened was that despite being based on an aging RenderWare engine and programmed in a language with many potential footguns, the game was stable enough across multiple platforms, architectures and OSes that it was both a critical and commercial success.

      I know what you’re saying - you can’t really know what might have been in an alternate reality. But in that alternate reality they’d have had to come up with something truly monumental to outdo themselves here.

      I think you’re just being a wee bit picky about me using the words “the right decision”. If we’re honest with ourselves there probably wasn’t a Rust-like language in the conversation when they set out to build GTA3, Vice City or San Andreas so this is all kind of moot unless we're suggesting that Rockstar should have started out by building that language...

I'd actually say that Rust is a third option between "everything is immutable" and "mutable soup": it's more "one mutator at a time". Rust really embraces being able to mutate stuff (so it's not functional in that sense); it just makes sure the mutation happens in a controlled way.

The constant rust evangelism on this site is such a turn off from actually wanting to use the language.

  • There'd be a lot less Rust evangelism on this site if there were fewer UB bug outcomes posted on it.

  • If your attitude is just "I'm not going to use abc because too many people say it's good", without even trying it out first-hand to verify those claims, I don't think you can go very far in your technical skills.

    The best engineers I know are open to everything and played with almost every tool/language/whatever to form (sorry) informed opinions about them. They often know what they are talking about, and they choose the best tool for the job.

    • I think I can articulate what the comment means in a way that may make you rethink what you've said a little bit. I'm not trying to make you think Rust is bad (I personally think it is good); I'm just trying to show you why this person may not be as backwards as you think they are.

      So the person in question is irritated at an interesting blog post about a 20+ year old game being used as another opportunity to push Rust. For starters, Rust obviously wasn't around at the time the game was developed, so it's not as if Rockstar made the wrong call in implementing this in C++.

      More importantly, I don't think Rust is currently in a state where studios can justify using it to develop AAA games. They'd need big teams of developers with Rust experience who are well-versed in the sort of problems encountered during game development. You'd need battle-tested build/deployment processes that can produce the binaries for PlayStation/Xbox (not too dissimilar CPU/GPU wise, but each with their own platform-level quirks no doubt) and Switch hardware - potentially across multiple generations. You'd need the various platforms' OS hooks and network-service APIs available. And you'd need to convince the guys with the money that instead of spending $projected on a game, they should spend $projected+$mystery_number to take the plunge and write their first game in Rust with new tools etc. rather than C++ and everything they currently use. The gaming industry is nothing if not ruthless at making money: if it made financial sense they'd be moving to Rust already, and if it will make sense in the future, they'll be planning to do it.

      You've been charitable in your read of the original comment, taking it as "this family of problem does not exist in Rust" - and for what it's worth I agree and really value this. However this other commenter has presumably seen it as a bit more naive and missing the bigger picture, and in combination with other similar experiences is questioning the value of these glowing testimonies.

      In addition, a lot of people saying "this is great, this is the future!" doesn't necessarily make something good automatically. For about 5+ years here on HN we had legions of people responding "blockchains will fix this" to almost every problem and very confidently declaring the rest of us are luddites for not getting it. I'm obviously not saying Rust is the same, I'm just trying to show that not following the crowd doesn't automatically mean you're the kind who will always fall behind.

      As for how to avoid this? I dunno if you can undo the zillions of RIIR comments that have been floating around since Rust appeared on the scene, but if I was evangelising or even just strongly recommending it I'd just keep in mind that my target audience is maybe sick of seeing the same kinds of comments and would be a bit more creative and/or sensitive in approaching the topic.

  • I don't think mentioning Rust on an article specifically about a memory safety bug counts as "constant". This is Rust's core strength.

  • While they did mention Rust, the actual suggestion was "functional programming and immutability", which to me suggests several other languages first and makes it not really Rust evangelism.

FWIW I think a linter or other similar code quality checker would have caught this as well. From a practical perspective (e.g., how do you prevent this from happening again in your game studio's multi-million line code base) that would have been the right thing to do here.

Rust protects you from external file data you read being incorrect?

That's one hell of a language!

  • The code would have failed to compile because you can't use an uninitialized variable, so you would have had to set it to a default. You don't just get random garbage from the stack.

    • You can write a genuinely uninitialized local variable in Rust; it's just that you wouldn't do it out of laziness, because while in C that's the default, in Rust it's a lot of extra work to say "No, I really don't want to initialize this variable", and Rust's response is "I mean, if you insist - all I can do is warn you that's a terrible idea".

        int k;  // C makes an uninitialized variable named k - probably bad idea
        
        use std::mem::MaybeUninit;  // bring MaybeUninit into scope
        let k: i32 = unsafe { MaybeUninit::uninit().assume_init() };  // Rust, same bad idea
      

      If we say "I will initialize it - later", that's fine in Rust: you just write the name (and, where appropriate, the type) of the variable and go about your day. The compiler will reject your program if it can't see how you're fulfilling that promise - sometimes that's because the compiler is dumb, but often it's because you are - but there's no technical problem with this, and if the compiler agrees that we do, in fact, initialize it later, then it compiles and works and everybody is happy. (A small sketch of this pattern is below.)

      But to actually make a variable and not initialize it, as we saw above, is a lot of extra work in Rust because like... that's a bad idea, why would you be setting out to do that?

      This is such a bad idea that Rust's unsafe std::mem::uninitialized, which is how they did this before MaybeUninit existed, was de-fanged (giving it poor performance by actually writing a pattern to RAM every time) and deprecated, so you get a warning if you try to use it even though it was already marked unsafe. See, people (and I'm sure many C programmers are like this) tend to imagine it's OK for, say, an integer to be uninitialized because surely any possible value is OK, right? Nope. Your operating system knows that data was never written, and so it feels entitled to fuck you about if you expect it to stay unchanged, because it never promised that would work - as a result, rarely but sometimes, you get kicked in the head by the OS and you get a seemingly impossible bug.
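
      To illustrate the "initialize it later" pattern from above, here's a minimal sketch (the condition and values are made up for the example):

        fn main() {
            let some_condition = std::env::args().count() > 1;
            let k: i32;                 // declared, not yet initialized - fine so far
            if some_condition {
                k = 1;
            } else {
                k = 2;
            }
            println!("{k}");            // OK: the compiler can prove k is assigned on every path
        }

      Delete either branch's assignment and the program simply stops compiling - no garbage value ever escapes.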

  • It would have forced you to either specify a default or fail pretty loudly as soon as you launched the game, both much better than leaving a bug there just for it to resurface 20 years later.

  • Most popular languages would prevent this. In this case it's as simple as having a more sensible reader API than sscanf in the standard library and forcing variables to be initialized.

  • Of course not, but this here was a memory access error, and Rust would have prevented it.

Could you elaborate? I cannot see how a functional programming language would have protected you from reading a non-existent value while not providing a default.

  • It's more that functional languages just happen to be stricter in various ways that would've mitigated against this. You could quite happily design a functional language that has an unsafe equivalent to sscanf in its stdlib, or has big parts of the spec which are "undefined behaviour" that may differ depending on the underlying OS/compiler/runtime/stdlib in use. But the more popular functional languages have gained traction in part because they tend to have a "if you model the types correctly, the program basically works" philosophy around them. I don't think things like Haskell, Ocaml or F# would allow this if you wrote idiomatic code, you'd probably need to do something a little hacky or sketchy.

  • It simply would not have allowed you to write code which did that. And you wouldn't have a function like sscanf() either. You'd probably end up with a much more normal looking parser function that returned a value-or-error type.
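
    For what it's worth, here's a minimal sketch of that "value-or-error" shape (Rust-flavoured, with an invented function name, purely as an illustration):

      // Instead of an out-parameter that may be left untouched on failure,
      // the caller gets a Result it has to inspect.
      fn parse_count(field: &str) -> Result<u32, std::num::ParseIntError> {
          field.trim().parse::<u32>()
      }
      
      fn main() {
          match parse_count("not a number") {
              Ok(n) => println!("count = {n}"),
              Err(e) => eprintln!("bad input: {e}"),  // failure is loud, never silent garbage
          }
      }

    You can't get at the parsed value without handling (or at least explicitly unwrapping) the error case - which is exactly the step the sscanf code skipped.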

  • I've never heard of a functional language that would allow you to initialize a value to whatever the system memory already had at that memory location. In languages that allow nil, it would at least be nil; in languages that don't, you would have gotten an error about an uninitialized and undefaulted value. In any statically checked language, you would also have gotten an error.

    It's true that C may be unique-ish in this regard though - this bug also couldn't happen in Ruby, which is not a functional language, but Ruby certainly still makes undefined behaviors much more possible than languages like Elixir do.