PlayStation 2 Recompilation Project Is Absolutely Incredible

13 hours ago (redgamingtech.com)

> The PlayStation 2’s library is easily among the best of any console ever released, and even if you were to narrow down the list of games to the very best, you’d be left with dozens (more like hundreds) of incredible titles. But the PS2 hardware is getting a bit long in the tooth

Besides the library, the PS2 is the most successful video game console of all time in terms of number of units shipped, and it stayed on the market for over ten years, featured a DVD drive, and at one point was positioned by Sony not just as an entertainment appliance but as a personal computer, including their own official PS2 Linux distribution.

In a more perfect world, this would have:

(a) happened with a hypothetical hardware platform released after the PS2 but before the PS3, with specs lying in between the two: a smidge better than the former, but not quite as exotic as the latter (with its Cell CPU or the weird form factor; whereas the PS2's physical profile in comparison was perfect, whether in the original form or the Slim version), which could have:

(b) resulted in a sort of standardization in the industry like what happened to the IBM PC and its market of clones, with other vendors continuing to manufacture semi-compatible units even if/when Sony discontinued it themselves, periodically revving the platform (doubling the amount of memory here, providing a way to tap into higher clock speeds there), all while maintaining backwards compatibility, such that you could go out today and buy a brand-new, $30 bargain-bin, commodity "PS2 clone" that can do basic computing tasks (in other words, not including the ability to run a modern Web browser or Electron apps), can play physical media, and supports all the original games and any other new games that explicitly target(ed) the same platform; or you could pay Steam Machine 2026 prices for the latest-gen "PS2" that retains native support for the original titles of the very first platform revision but also unlocks the ability to play the titles of every intermediate rev.

  • As an owner of the PS2 Linux distribution and related hardware, I can say it was sort of OK.

    Sony intended it to be the evolution of the PlayStation's Net Yaroze program, fostering indie development; instead, people mostly used it to run emulators on the PS2, which is why the PS3 version lost access to hardware-accelerated graphics.

    PS2 Linux had hardware acceleration; the only difference was that its OpenGL-inspired API did not expose all the capabilities of a regular devkit.

    The community proved that the development effort wasn't worth it.

    The Xbox Live Arcade and ID@Xbox programs have also taken these lessons into account, which is why you mostly see people running emulators on rooted Xboxes, not on the developer-mode ones.

    The market of IBM PC clones only happened because of an IBM mistake; it was never supposed to happen, and IBM tried to take back control with the PS/2 and MCA, but Pandora's box was already open, and Compaq was clever in the way they reverse-engineered the BIOS.

  • > (a) happened with a hypothetical hardware platform released after the PS2 but before the PS3, with specs lying in between the two

    I would argue strongly that the weak hardware is why the PS2, and other old consoles, were so good, and that by improving the hardware you cannot replicate what they accomplished (which is why, indeed, newer consoles have never managed to be as iconic as older consoles). You can make an equally strong case that the Super Famicom is the best console of all time, with dozens of 10/10 games that stand the test of time. I think the limitations of the hardware played a pivotal role in both, as they demanded good stylistic decisions to create aesthetically appealing games with limited resources, and demanded a significant amount of work curating and optimizing the game design, because every aspect of the game consumed limited resources and therefore bad ideas had to be culled, leaving a well-polished remainder of the best ideas in a sort of Darwinian sense.

    > (b) resulted in a sort of standardization in the industry like what happened to the IBM PC and its market of clones, with other vendors continuing to manufacture semi-compatible units

    Unlike the PC market, the comprehensive list of "other vendors" is two entries long. Is it a more perfect world if Nintendo manufactures knockoff Playstations instead of its variety of unique consoles? I don't think so.

    • I love retro consoles as much as the next middle-aged software developer, but realistically, the reason those consoles are so iconic is because we were children. Every console generation is that special generation for one group of kids.

      I do agree that sometimes limitations breed creativity, but that’s not the only thing that can make the magic work.

      11 replies →

    • This might be a nitpick, but I could probably only count 5-10 SNES games that would be considered 10/10 IMO, and not many that I think are worth sinking decent time into these days, compared to something like Burnout Revenge - a great game but certainly not a 10/10 game.

      Still, I do find the SNES library, and 16bit games in general, quite astounding from a creative and artistic perspective, but not so much from a player’s perspective.

      6 replies →

  • > and at one point was positioned by Sony not just as an entertainment appliance but as a personal computer with their own official PS2 Linux distribution.

    to avoid EU import taxes

  • it was a dreadful, useless computer, even then

    • Unlike the PS3, of which the US Air Force bought 1,760 units and clustered them into the 33rd most powerful supercomputer** at the time.

      (**Distributed computing is very cheat-y compared to a "real" supercomputer which has insane RDMA capabilities)

      1 reply →

This is cool but of course it's only going to be a small handful of titles that ever receive this kind of attention. But I have been blown away that now sub-$300 Android handhelds are more than capable of emulating the entire PS2 library, often with upscaling if you prefer.

  • Moore's law never ceases to amaze (the vulgar version where we're talking compute/dollar, not the transistor count doubling rate). It won't be too long before phones are running AI models with performance equal to or better than current frontier models running on $100 million clusters. It's hard to even imagine the things that will be running on billion-dollar clusters in 10 years.

    • > ... It won't be too long before phones are running AI models with performance equal to or better than current frontier models running on $100 million dollar clusters.

      Maybe, perhaps phones will have the compute power... But not enough memory. If things continue the way they are, that is. Great for AI firms, they'll have their moat.

    • In the same way we have websites running on disposable vapes, it may not be long before such a device could run a small local LLM, and lots of appliances could have a local voice interface - so you literally talk to your microwave!

    • I do hope you're right, but I'm quite skeptical. As mobile devices get more and more locked down, all that memory capacity gets less and less usable. I'm sure it will be accessible to Apple and Google models, but models that obey the user? Not likely.

      4 replies →

    • They will not build that phone because then you won’t subscribe to AI cloud platforms.

  • It really is incredible. I've been playing through my childhood games on retro handhelds, and recently jumped from <$100 handhelds to a Retroid Pocket Flip, and it's incredible. Been playing WiiU and PS2 games flawlessly at 2x res, and even tackling some lighter Switch games on it.

    • It truly is. My issue though, like in 2010 when I built an arcade cabinet capable of playing everything, is that you eventually just run out of interest. In all of it. Not even the nostalgia of it keeps my attention. With the exception of just a small handful of titles.

      - Excitebike (it’s in its own league) NES

      - Punch-Out!! (good arcade fun) NES

      - TMNT 4-player co-op (MAME version)

      - NBA Jam (MAME version)

      - Secret of Mana SNES

      - Chrono Trigger SNES

      - Breath of Fire 2 SNES

      - Mortal Kombat series Sega 32X

      - FF Tactics PS1

      I know these can all basically be run in a browser at this point, but even Switch or Dreamcast games were meh. N64/PS1/PS2/Xbox was peak and it’s been rehashed franchises ever since. Shame. The only innovative thing that has happened since storytelling died has been battle royale looter shooters.

      72 replies →

  • And then folks waste all that power away on embedded widget applications.

    My Android phone is more powerful than the four PCs I owned from 1990 to 2002 (386SX, P75, P166, Athlon XP), with all the CPU, GPU, RAM and disk space added together.

  • I'll take a long bet with you that this or its successors will tackle more than a small handful of titles.

    We live in interesting times

  • There is so much work hunting down the proper upscaled/improved texture packs though. Supposedly.

  • I gave up video games, but I remember that being a huge reason why I picked Android a decade + ago. Emulators :D

    Apparently the iPhone now allows it. Eventually Apple gives features that are standard elsewhere. Veblen goods...

  • I suspect we will see a proliferation of emulator development in the next few years.

    In a lot of ways, emulators are the perfect problem for vision/LLMs. It's like all those web browser projects popping up on HN. You have a very well-defined problem with existing reference test cases. It's not going to be fun for Nintendo's lawyers in the future when everybody can crowdfund an emulator by simply running a VLM against a screen recording of gameplay (barring non-deterministic elements).

    They can't oppress the software engineering masses any longer through lawfare.

  • What the dev of AetherSX2 did to make games run smoothly, even on my midrange 2019 Android phone, is a wonder.

    Too bad the dev is a very emotionally unstable person who abandoned his port, despite his great talent.

90% of the PS2’s floating point throughput is in the two vector units, not the R5900 conducting them. Concentrating on that, as the article does, seems as futile as focussing on the 68000 rather than the Amiga PAD in a 16-bit context (ignoring the EE’s 16-bit RAMBUS bottleneck).

However, that approach will probably suit the least-ambitious PC ports to PS2 (by studios that didn’t appreciate the difference), rather as an ST emulator was a shortcut to running the simplest Amiga games.

  • Hey! I can speak here.

    Back in the day, I wrote a simulator for the PS2’s vector units because Sony did not furnish any debugger for them. A month after I got it working, a Sony 2nd party studio made their VU debugger available to everyone… Anyway…

    The good news is that the VU processors are actually quite simple as far as processors go. Powerful. Complicated to use. But, not complicated to specify.

    This is made much simpler by the fact that the only documentation Sony provided was written by the Japanese hardware engineers. It laid out the bit-by-bit details of the instruction set. And, the bitwise inputs, outputs, delays and side effects of each instruction.

    No guidance on how to use it. But, awesome docs for writing a simulator (or recompiler).

An application of the first Futamura projection. https://en.wikipedia.org/wiki/Partial_evaluation

  • Is it? It would be if it partially evaluated a MIPS emulator on a particular game. But it doesn't seem to work like that.

    • "Decoding the MIPS R5900 instructions in each function Translating those instructions to equivalent C++ code Generating a runtime that can execute the recompiled code The translated code is very literal, with each MIPS instruction mapping to a C++ operation." It sounds like a MIPS interpreter that gets statically unrolled.

      1 reply →
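For a concrete picture of what that "very literal" translation means, here is a minimal sketch of how a couple of R5900 instructions could map one-to-one onto C++ statements. The context struct, function names, and memory handling are illustrative assumptions, not the project's actual generated code:

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical guest CPU state, simplified to 64-bit GPRs
// (the real R5900 registers are 128-bit and the real context is richer).
struct R5900Context {
    uint64_t gpr[32];   // gpr[0] is hardwired to zero
    uint8_t* ram;       // guest main memory
};

// addiu $v0, $a0, 16  ->  one C++ statement, decoded once at recompile time
static inline void op_addiu_v0_a0_16(R5900Context& ctx) {
    ctx.gpr[2] = static_cast<int64_t>(
        static_cast<int32_t>(static_cast<uint32_t>(ctx.gpr[4]) + 16u));
}

// lw $t0, 8($sp)  ->  a sign-extended 32-bit load through the guest RAM pointer
static inline void op_lw_t0_sp_8(R5900Context& ctx) {
    uint32_t addr = static_cast<uint32_t>(ctx.gpr[29]) + 8u;
    uint32_t value;
    std::memcpy(&value, ctx.ram + (addr & 0x01FFFFFFu), sizeof(value));  // 32 MiB mask, illustrative
    ctx.gpr[8] = static_cast<int64_t>(static_cast<int32_t>(value));
}
```

Multiply that by every instruction in every function and you get the "statically unrolled interpreter" described above.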

I absolutely love the idea!

As a movie geek I'm personally offended when someone says "oh, it's from 2017, it's an old movie!", or "I don't want to see anything from 90s, yuck" - and that's pretty common.

Of course, "Nosferatu, eine Symphonie des Grauens" is not for everyone, but I firmly believe that you can watch the new Dune and Lawrence of Arabia back to back and have similarly enjoyable time.

Fallout 1 and 2 are miles ahead of Fallout 3 (mostly due to the uncanny valley phenomenon). Sure, the medium has changed a lot and modern consumers are used to a more streamlined experience. My favorite example is the endless stream of Baldur's Gate "modern reimplementations" or rehashes, like Pillars of Eternity, that were too close to the original source; and then, suddenly, someone came up with Divinity, basically a Baldur's Gate clone but with a modern UI and QoL improvements.

But consoles are different.

This can truly be a window for the next generation to look back into the past.

  • > As a movie geek I'm personally offended when someone says "oh, it's from 2017, it's an old movie!", or "I don't want to see anything from 90s, yuck" - and that's pretty common.

    Now I feel old. I was thinking you might say 1960 or something.

    • I recently had a chat with a colleague who had never heard of Quake.

      He had also never watched Lock, Stock and Two Smoking Barrels.

      And Half-life is just something-something-let-me-check.

      Oh, well...

Emulation is already amazing. What can be done with recompilation is magic: https://github.com/Zelda64Recomp/Zelda64Recomp

  • So… What’s the magic? (In theory, interpretation/emulation and compilation should produce identical behavior.)

    • The magic is that now you can modify the source code of the game and recompile that.

      Folks have been optimizing Super Mario 64 to run much faster on actual N64 hardware. And there is a project that has ported it to run on the PlayStation 1. That’s much weaker hardware that has no hope of emulating the N64.

    • Identical behavior, sure, but much less overhead and fewer restrictions on e.g. resolution than you'd get on a general purpose emulator

I hope the Steam Machine 2.0 can be a good target for developers for years to come, like the PS2 was.

See also: XenonRecomp, which does the same thing for Xbox 360, and N64:Recompiled which does the same thing for N64.

Note that this "recompilation" and the "decompilation" projects like the famous Super Mario 64 one are almost orthogonal approaches in a way that the article failed to understand; this approach turns the assembly into C++ macros and then compiles the C++ (so basically using the C++ compiler as a macro re-assembler / emulation recompiler in a very weird way). The famous Super Mario 64 decompilation (and openrct and so on) use the output from an actual decompiler which attempts to reconstruct C from assembly, and then modify that code accordingly (basically, converting the game's object code back into some semblance of its source code, which this approach does NOT do).
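To make the "C++ compiler as a macro re-assembler" idea concrete, here is a rough, hypothetical sketch of that style; the macro names, context layout, and the omission of details like branch delay slots are simplifications for illustration, not the actual output of any of these projects:

```cpp
#include <cstdint>

struct Ctx { int64_t gpr[32]; };   // gpr[0] must be kept at zero

// Hypothetical macros in the spirit described above.
#define GPR(n)               (ctx.gpr[(n)])
#define ADDIU(rt, rs, imm)   GPR(rt) = static_cast<int32_t>(GPR(rs) + (imm))
#define SLTI(rt, rs, imm)    GPR(rt) = (GPR(rs) < (imm)) ? 1 : 0
#define BNE(rs, rt, label)   if (GPR(rs) != GPR(rt)) goto label

// A "recompiled" function is the original instruction stream replayed through
// the macros, so the C++ compiler effectively re-assembles (and optimizes) it.
void recompiled_count_to_ten(Ctx& ctx) {
loc_001000A0:
    ADDIU(4, 4, 1);              // addiu $a0, $a0, 1
    SLTI(2, 4, 10);              // slti  $v0, $a0, 10
    BNE(2, 0, loc_001000A0);     // bne   $v0, $zero, loc_001000A0
}
```

That is quite different from the decompilation projects, whose output is human-readable, editable C that no longer resembles the original instruction stream at all.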

I have a samurai game, Kengo 3, that I really liked on PS2. I still have that CD at my parents'. Can anyone recommend me a PS2 emulator?

I’ve been meaning to start decompiling one of my favorite games of the era (Hulk Ultimate Destruction) after watching the decomp of other games. Perhaps this is a sign to start?

> So yes, currently playing PS2 games on PC via emulator is still absolutely fantastic, but native ports would be the holy grail of game preservation.

I would think that emulation of the original game as closely as possible would be the gold standard of preservation, and native ports would be a cool alternative. As described in the article, native ports are typically not faithful reproductions but enhanced to use the latest hardware.

  • Indeed, the focus for preservation would be to increase the accuracy of emulators.

    PCSX2 is pretty good today in terms of running games (the list of games it does not run is in the single digits), but it's far from accurate to the hardware.

    Porting to current systems via recompilation is cool, but it has very little to do with preservation.

I wonder how they will tackle the infamous non-conformant PS2 floating-point behavior, which is the biggest hurdle in emulating the PS2.
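For context, the EE's FPU and the VUs are not IEEE 754: there are no NaNs or infinities, overflow saturates, and rounding differs, so code that relies on that behavior can misbehave on an IEEE-conformant host. Emulators commonly approximate it with "clamping"; a minimal sketch of the idea (an illustration, not any particular project's code, and only one of several accuracy/speed trade-offs) might look like this:

```cpp
#include <cmath>
#include <limits>

// Squash values the PS2's FPU cannot represent (NaN, +/-Inf) to the largest
// finite float of the same sign. Real emulators offer several clamping modes.
static inline float ps2_clamp(float x) {
    if (!std::isfinite(x))
        return std::copysign(std::numeric_limits<float>::max(), x);
    return x;
}

// A guest multiply then becomes: clamp the inputs, multiply on the host FPU,
// and clamp the result so it can never turn into Inf/NaN downstream.
static inline float ps2_mul(float a, float b) {
    return ps2_clamp(ps2_clamp(a) * ps2_clamp(b));
}
```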

This sounds very cool, but I can practically hear the IP lawyers sharpening their buzz-axes...

  • They haven't been all that aggressive against the decompile/recompile projects, interestingly. They're sometimes/often set up so you need the original to grab assets etc., but that code is copyrighted too and I'd have to imagine a decompile that purposely compiles to an identical binary would be a derivative work.

    My best guess is that for them it's not worth the hassle or any possibility of a negative result in court as long as people have to jump through some hoops by providing an original, and for the projects that don't do that, you have very straightforward easy infringement cases without even getting into the decomp stuff. Though really even ROMs seem to be tacitly tolerated to some extent lately. Maybe there's an attitude that keeping people involved with the franchise is worth it, again so long as it doesn't become too easy.

  • Sony have actually been fairly chill about emulators etc. so I'd be surprised if lawyers got involved here.

    They actually used an open source Playstation emulator when they released the "Playstation Classic" in 2018.

This is amazing for preservation. Being able to run these classics on modern hardware with native recompilation is a huge step forward.

As far as I know, static recompilation is thwarted by self modifying code (primarily JITs) and the ability to jump to arbitrary code locations at runtime.

The latter means that even in the absence of a JIT, you would need to achieve 100% code coverage (akin to unit testing or fuzzing) to perform static recompilation; otherwise you need to compile code at runtime, at which point you're back to state-of-the-art emulation with a JIT. The only real downside of JITs is the added latency, similar to the lag induced by shader compilation, but this could be addressed by having a smart code cache instead. That code cache realistically only needs to store a trace of potential starting locations, then the JIT can compile the code before starting the game.
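A sketch of the runtime dispatch this implies (the names and structure here are hypothetical, not taken from any particular recompiler): statically translated functions are registered against their original guest addresses, and every indirect jump becomes a lookup that falls back to an interpreter or JIT when the target was never covered:

```cpp
#include <cstdint>
#include <cstdio>
#include <unordered_map>

struct Ctx { uint64_t gpr[32]; uint32_t pc; };   // simplified guest state

using RecompiledFn = void (*)(Ctx&);

// guest PC -> statically recompiled host function
static std::unordered_map<uint32_t, RecompiledFn> g_translated;

// Slow-path stub: a real runtime would interpret or JIT-compile at guest_pc.
static void interpret_or_jit(Ctx& ctx, uint32_t guest_pc) {
    std::printf("coverage gap at guest pc 0x%08x\n", guest_pc);
    ctx.pc = guest_pc;
}

// Every indirect transfer ("jr $ra", a function pointer, a vtable call)
// is compiled to a call into this dispatcher instead of a direct call.
static void dispatch(Ctx& ctx, uint32_t guest_pc) {
    auto it = g_translated.find(guest_pc);
    if (it != g_translated.end())
        it->second(ctx);                  // statically recompiled code exists
    else
        interpret_or_jit(ctx, guest_pc);  // fall back at runtime
}
```

Priming such a table from a recorded trace of start addresses is essentially the "smart code cache" idea described above.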

  • Yes, but in practice that isn't a problem. People do write self-modifying code and jump to random places today; however, it is much less common than in the past. It is safe to say that most games are developed and run on the developer's PC and then ported to the target system. If they know the target system they will make sure it works on the system from day one, but most developers are going to prefer to run their latest changes on their current system over sending it to the target system. If you really need to take advantage of the hardware you can't do this, but most games don't.

    Many games are written in a high-level language (like C...) which doesn't give you easy access to self-modifying code. (Even higher-level languages like Python do, but they are not compiled and so not part of this discussion.) Likewise, jumping to arbitrary code is limited to function calls for most programmers.

    Many games just run on a game engine, and the game engine is something we can port or rewrite to other systems and then enable running the game.

    Be careful of the above: most games don't become popular. It is likely that the "big ticket" games people are most interested in emulating had the development budget and the need to take advantage of the hardware in the hard ways. That is, the small minority of exceptions are the ones we care about the most.

    • This is PS2 emulation, where most engines were still bespoke and every hack in the book was still on the table.

  • JIT isn't _that_ common in games (although it is certainly present in some, even from the PS2 era), but self-modifying or even self-referencing executables were a quite common memory-saving trick that lingered into the PS2 era: binaries that would swap different parts in and out from disk were quite common, and some developers kept using really old-school space-saving tricks like reusing partial functions as code gadgets, although this was dying out by the PS2 era.

    Emulation actually got easier after around the PS2 era because hardware got a little closer to commodity and console makers realized they would need to emulate their own consoles in the future and banned things like self-modifying code as policy (AFAIK, the PowerPC code segment on both PS3 and Xbox 360 is mapped read only; although I think SPE code could technically self-modify I'm not sure this was widespread)

    The fundamental challenges in this style of recompilation are mostly offset jump tables and virtual dispatch / function pointer passing; this is usually handled with some kind of static analysis fixup pass to deal with jump tables and some kind of function boundary detection + symbol table to deal with virtual dispatch.
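    As an illustration of the jump-table case (hypothetical names, not any project's real output): once an analysis pass has pinned down the table's base address and bound, the indirect `jr` through the table can be emitted as an ordinary C++ switch over the recovered targets:

    ```cpp
    #include <cstdint>

    struct Ctx { uint64_t gpr[32]; };

    // Stubs standing in for the recompiled bodies of each switch arm.
    static void case_0(Ctx&) {} static void case_1(Ctx&) {}
    static void case_2(Ctx&) {} static void case_default(Ctx&) {}

    // Original MIPS pattern (roughly):
    //   sltiu $v0, $a0, 3          ; bounds check
    //   beqz  $v0, default_case
    //   sll   $v0, $a0, 2
    //   lw    $v0, jump_table($v0) ; load target address
    //   jr    $v0                  ; indirect jump
    // Emitted C++ once the fixup pass has resolved the table:
    static void func_with_jump_table(Ctx& ctx) {
        uint32_t index = static_cast<uint32_t>(ctx.gpr[4]);   // $a0
        switch (index) {
            case 0:  case_0(ctx); break;
            case 1:  case_1(ctx); break;
            case 2:  case_2(ctx); break;
            default: case_default(ctx); break;
        }
    }
    ```

    Virtual dispatch and raw function pointers can't be resolved this way, which is where the guest-address-to-function lookup sketched above comes in.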

  • How many PS2-era games used JIT? I would be surprised if there were many of them - most games for the console were released between 2000 and 2006. JIT was still considered a fairly advanced and uncommon technology at the time.

    • I'd say practically none; we were quite memory-starved most of the time, and even regular scripting engines were a hard sell at times (perhaps more due to GC than interpretation performance).

      Games on PS2 were C or C++ with some VU code (asm or some specialized HLL) for the most part, often with Lua (due to low memory usage) or similar scripting added for minor parts, with bindings to native C/C++ functions.

      "Normal" self-modifying code went out of favour a few years earlier in the early-mid 90s, and was perhaps more useful on CPU's like the 6502s or X86's that had few registers so adjusting constants directly into inner-loops was useful (The PS2 MIPS cpu has plenty of registers, so no need for that).

      However, by the mid-to-late '90s CPUs like the Pentium Pro already added penalties for self-modifying code, so it was already frowned upon; also, PS2-era games often had PC versions developed side by side, so you didn't want more platform dependencies than needed.

      Most of the PS2 performance tuning we did was around resources/memory and the VUs, helped by DMA chains.

      Self modifying code might've been used for copy-protection but that's another issue.

    • A lot of PS2-era games unfortunately used various self-modifying executable tricks to swap code in and out of memory; Naughty Dog games are notorious for this. This got easier in the Xbox 360 and PS3 era where the vendors started banning self-modifying code as a matter of policy, probably because they recognized that they would need to emulate their own consoles in the future.

      The PS2 is one of the most deeply cursed game console architectures (VU1 -> GS pipeline, VU1 microcode, use of the PS1 processor as IOP, etc) so it will be interesting to see how far this gets.

      2 replies →

Side note, are we at the level where tech blogs and news sites can't even write <a href> links properly?

2 out of 4 links in the article are messed up, that's mind boggling... On a tech blog!

Is that how deep we've sunk to assert it wasn't written by AI?

  • A more accurate version of the famous idiom:

    Those who can, do (and sometimes become teachers when they get older). Those who can’t, become journalists.