
Comment by bluedino

1 day ago

Why weren't binary files used, like I would expect in a 1990s DOS game? fread into a struct and all that.
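(For context, the fread-into-a-struct pattern being referred to looks roughly like this; the record fields and file format are hypothetical, just a minimal sketch of the DOS-era approach:)

    #include <stdio.h>

    /* Hypothetical on-disk record; a real game would define its own layout. */
    #pragma pack(push, 1)
    typedef struct {
        int   hp;
        int   damage;
        float speed;
    } MonsterRecord;
    #pragma pack(pop)

    int load_monster(const char *path, MonsterRecord *out)
    {
        FILE *f = fopen(path, "rb");
        if (!f)
            return -1;
        /* Dump the raw bytes straight into the struct: fast and simple,
           but the file is now tied to this exact struct layout, padding,
           and the writing machine's endianness. */
        size_t n = fread(out, sizeof *out, 1, f);
        fclose(f);
        return n == 1 ? 0 : -1;
    }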

By the 2000s, portability was a concern for most titles, certainly for anything targeting the rapidly changing console market back then.

  • Definitely, and architectures back then were far less standardized. The Xbox 360 used a big-endian PowerPC CPU, the PS2 had a custom MIPS-based CPU, and on the desktop this was still the era of PowerPC-based Macs. Far easier (and I would argue safer) to use a standard, portable sscanf-like function on some ASCII text than to figure out how to bake your binaries for every memory and CPU layout combination you might care about.
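To make the endianness point concrete: a raw fread of a multi-byte field gives you whatever byte order the file was written in, so a portable binary loader ends up assembling every value explicitly. A minimal sketch (the helper name is made up):

    #include <stdint.h>
    #include <stdio.h>

    /* Read a 32-bit little-endian integer regardless of the host CPU's
       byte order (x86 PCs vs. the big-endian PowerPC consoles of the era). */
    static int read_i32_le(FILE *f, int32_t *out)
    {
        unsigned char b[4];
        if (fread(b, 1, 4, f) != 4)
            return -1;
        *out = (int32_t)((uint32_t)b[0]
                       | ((uint32_t)b[1] << 8)
                       | ((uint32_t)b[2] << 16)
                       | ((uint32_t)b[3] << 24));
        return 0;
    }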

Easier for internal development, too: non-technical or less technical team members can tweak values without having to rebuild binary files. Possibly also easier for lightweight external modding.

This isn't that uncommon - look at something like Diablo 2, which has a huge amount of game data defined in text files (I think these are encoded to binary when shipped, but it was clearly useful to give the game a mode where it would load them all from text on startup).

Video games are made by a lot of non-programmers who will be much more comfortable adjusting values in a text file than they are hex editing something.

Besides, the complaint about not having a heavyweight parser here is weird. This is supposed to be "trusted data"; you shouldn't have to treat the file as a threat, so a single-line sscanf that just dumps parsed CSV attributes into memory is pretty great IMO.
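Something like this, presumably; the record and field names are invented for illustration, with the struct zeroed and the sscanf return value checked so a short or malformed line doesn't leave garbage behind (which is also the point about initialization below):

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical record; the fields are made up for illustration. */
    typedef struct {
        char  name[32];
        int   hp;
        int   damage;
        float speed;
    } Monster;

    /* Parse one CSV line like "imp,25,4,1.5" into a Monster. */
    static int parse_monster_line(const char *line, Monster *out)
    {
        memset(out, 0, sizeof *out);
        if (sscanf(line, "%31[^,],%d,%d,%f",
                   out->name, &out->hp, &out->damage, &out->speed) != 4)
            return -1;
        return 0;
    }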

Definitely initialize variables when it comes to C, though.