PlayStation 2 Recompilation Project Is Absolutely Incredible

9 days ago (redgamingtech.com)

> The PlayStation 2’s library is easily among the best of any console ever released, and even if you were to narrow down the list of games to the very best, you’d be left with dozens (more like hundreds) of incredible titles. But the PS2 hardware is getting a bit long in the tooth

Besides the library, the PS2 is the most successful video game console of all time in terms of number of units shipped, and it stayed on the market for over ten years, featured a DVD drive, and at one point was positioned by Sony not just as an entertainment appliance but as a personal computer, including their own official PS2 Linux distribution.

In a more perfect world, this would have:

(a) happened with a hypothetical hardware platform released after the PS2 but before the PS3, with specs lying in between the two: a smidge better than the former, but not quite as exotic as the latter (with its Cell CPU or the weird form factor; whereas the PS2's physical profile in comparison was perfect, whether in the original form or the Slim version), which could have:

(b) resulted in a sort of standardization in the industry like what happened to the IBM PC and its market of clones, with other vendors continuing to manufacture semi-compatible units even if/when Sony discontinued it themselves, periodically revving the platform (doubling the amount of memory here, providing a way to tap into higher clock speeds there), all while maintaining backwards compatibility. In that world you could go out today and buy a brand-new, $30 bargain-bin, commodity "PS2 clone" that can handle basic computing tasks (in other words, nothing as heavy as a modern Web browser or Electron apps), can play physical media, and supports all the original games plus any newer games that explicitly target(ed) the same platform; or you could pay Steam Machine 2026 prices for the latest-gen "PS2" that retains native support for the original titles of the very first platform revision but also unlocks the ability to play the titles of every intermediate rev, too.

  • > (a) happened with a hypothetical hardware platform released after the PS2 but before the PS3, with specs lying in between the two

    I would argue strongly that the weak hardware is why the PS2, and other old consoles, were so good, and that by improving the hardware you cannot replicate what they accomplished (which is why, indeed, newer consoles have never managed to be as iconic as older consoles). You can make an equally strong case that the Super Famicom is the best console of all time, with dozens of 10/10 games that stand the test of time. I think the limitations of the hardware played a pivotal role in both: they demanded good stylistic decisions to create aesthetically appealing games with limited resources, and they demanded a significant amount of work in curating and optimizing the game design, because every aspect of the game consumed limited resources and therefore bad ideas had to be culled, leaving a well-polished remainder of the best ideas in a sort of Darwinian sense.

    > (b) resulted in a sort of standardization in the industry like what happened to the IBM PC and its market of clones, with other vendors continuing to manufacture semi-compatible units

    Unlike the PC market, the comprehensive list of "other vendors" is two entries long. Is it a more perfect world if Nintendo manufactures knockoff Playstations instead of its variety of unique consoles? I don't think so.

    • I love retro consoles as much as the next middle aged software developer, but realistically, the reason those consoles are so iconic is because we were children. Every console generation is that special generation for one group of kids.

      I do agree that sometimes limitations breed creativity, but that’s not the only thing that can make the magic work.

      57 replies →

    • This might be a nitpick, but I could probably only count 5-10 SNES games that would be considered 10/10 IMO, and not many that I think are worth sinking decent time into these days, compared to something like Burnout Revenge - a great game but certainly not a 10/10 game.

      Still, I do find the SNES library, and 16bit games in general, quite astounding from a creative and artistic perspective, but not so much from a player’s perspective.

      13 replies →

    • > Unlike the PC market, the comprehensive list of "other vendors" is two entries long

      Before there was “a sort of standardization in the industry” the comprehensive list of “PC vendors” was one entry long.

      Years before that, there were a couple of earlier rounds of “a sort of standardization in the industry”, both of which led to there being many vendors:

      - the Altair bus. https://en.wikipedia.org/wiki/S-100_bus#IEEE-696_Standard: “In May 1978, George Morrow and Howard Fullmer published a "Proposed Standard for the S-100 Bus" noting that 150 vendors were already supplying products for the S-100 Bus”

      - CP/M. https://en.wikipedia.org/wiki/CP/M#Derivatives: “CP/M eventually became the de facto standard and the dominant operating system for microcomputers, in combination with the S-100 bus computers. This computer platform was widely used in business through the late 1970s and into the mid-1980s.”

    • This reminded me of the quote "limitation breeds creativity"; the PS2's limitations were instrumental to its success.

      The PS2 in many ways was a great improvement on the PS1, but it was not easy to develop for: it could do certain things very well, other things not so well. One example is graphics, due to the unusual architecture of the Emotion Engine. I think this forced developers to consider what their games really required and where they wanted to spend the development effort, one of the key ingredients of good game design.

      Additionally, the release hype around the PS2 was considerable and the graphics that were achievable were very good at the time, so developers were willing to go through the development pains to create a game for this console.

      Not to forget that besides the mountain of great titles for the PS2, there is also a mountain of flopped games that faded into obscurity.

  • > and at one point was positioned by Sony not just as an entertainment appliance but as a personal computer with their own official PS2 Linux distribution.

    to avoid EU import taxes

  • As an owner of the PS2 Linux distribution and related hardware, I can say it was sort of OK.

    Sony intended it to be the evolution of the Net Yaroze, fostering indie development; instead, people used it mostly to run emulators on the PS2, which is why the PS3 version lost access to hardware-accelerated graphics.

    PS2 Linux had hardware acceleration; the only difference was that the OpenGL-inspired API did not expose all the capabilities of a regular devkit.

    The community proved that the development effort wasn't worth it.

    The Xbox Live Arcade and ID@Xbox programs have also taken these lessons into account, which is why you mostly see people running emulators on rooted Xboxes, not on developer-mode ones.

    The market of IBM PC clones only happened because of an IBM mistake that was never supposed to happen, and IBM tried to take back control with the PS/2 and MCA, but Pandora's box was already open, and Compaq was clever in the way it reverse-engineered the BIOS.

  • > featured a DVD drive

    Wasn't it also among the cheapest DVD players on the market back then?

    • Yes, it was like the same price (close enough) as a regular Sony DVD player, which was nuts.

      There were cheaper off-brand DVD players, of course.

      You did have to buy a remote separately, though, unless you wanted to use the game controller (which had a cord).

      1 reply →

  • it was a dreadful, useless computer, even then

    • Unlike the PS3, of which the US Air Force bought 1,760 and clustered them into the 33rd most powerful supercomputer** at the time.

      (**Distributed computing is very cheat-y compared to a "real" supercomputer which has insane RDMA capabilities)

      3 replies →

    • "it was a dreadful, useless computer, even then"

      So you don't dispute the thesis that the hypothetical general-purpose machine described in the comment would have needed to have been better than the PS2?

This is cool but of course it's only going to be a small handful of titles that ever receive this kind of attention. But I have been blown away that now sub-$300 Android handhelds are more than capable of emulating the entire PS2 library, often with upscaling if you prefer.

  • Moore's law never ceases to amaze (the vulgar version where we're talking compute/dollar, not the transistor count doubling rate.) It won't be too long before phones are running AI models with performance equal to or better than current frontier models running on $100 million dollar clusters. It's hard to even imagine the things that will be running on billion dollar clusters in 10 years.

    • I do hope you're right, but I'm quite skeptical. As mobile devices get more and more locked down, all that memory capacity gets less and less usable. I'm sure it will be accessible to Apple and Google models, but models that obey the user? Not likely.

      5 replies →

    • It might not be in our lifetimes... the frontier models are using terabytes of RAM. In 10 years iPhones went from ~2GB to ~8GB.

      2012 MacBook Pros had up to 16GB; 2026 maxes out at 64GB. So a 4x increase in 14 years. A 1996 Mac desktop had 16MB of RAM, so from 1996 to 2012 there was a 1000x increase.

      We won't see gains like we did from the 80s-2000s again.

    • > ... It won't be too long before phones are running AI models with performance equal to or better than current frontier models running on $100 million dollar clusters.

      Maybe, perhaps phones will have the compute power... But not enough memory. If things continue the way they are, that is. Great for AI firms, they'll have their moat.

      2 replies →

    • In the same way we have websites running on disposable vapes, it may not be long before such a device could run a small local LLM, and lots of appliances could have a local voice interface - so you literally talk to your microwave!

    • They will not build that phone because then you won’t subscribe to AI cloud platforms.

  • It really is incredible. I've been playing through my childhood games on retro handhelds, and recently jumped from <$100 handhelds to a Retroid Pocket Flip, and it's incredible. Been playing WiiU and PS2 games flawlessly at 2x res, and even tackling some lighter Switch games on it.

    • It truly is. My issue, though, like in 2010 when I built an arcade cabinet capable of playing everything, is that you eventually just run out of interest. In it all. Not even the nostalgia of it keeps my attention. With the exception of just a small handful of titles:

      - Excite Bike (it’s in its own league) NES

      - Punchout (good arcade fun) NES

      - TMNT 4-P Coop Mame Version

      - NBA Jam Mame Version

      - Secret of Mana SNES

      - Chrono Trigger SNES

      - Breath of Fire 2 SNES

      - Mortal Kombat Series SEGA32X

      - FF Tactics PS1

      I know these can all be basically run in a browser at this point but even Switch or Dreamcast games were meh. N64/PS1/PS2/Xbox was peak and it’s been rehashed franchises ever since. Shame. The only innovative thing that has happened since storytelling died has been Battle Royale Looter Shooters.

      133 replies →

  • And then folks waste all that power away on embedded-widget applications.

    My Android phone is more powerful than the four PCs I owned during 1990-2002 (386SX, P75, P166, Athlon XP), with all the CPU, GPU, RAM and disk space added together.

  • I'll take a long bet with you that this or its successors tackle more than a small handful of titles.

    We live in interesting times

  • I find PS2 emulation to be lacking.

    Of course I am spoiled by Dolphin and their meticulous work, and the leap in N64 emulation, and PS3 emulation is way farther than I thought it could ever be.

    But PCSX2 is mediocre. It reports the vast majority of the library in "green" emulation state, but that usually means there are glaring issues that someone is choosing to overlook, like shadows that are broken.

    The Ace Combat games for example are all broken with the hardware accelerated renderer. Things run like garbage in the software renderer for a lot of games. Multiplayer functionality is spotty and hard to set up and poorly documented.

    The state of emulation of that console generation is not up to snuff, save for Dolphin. It's still very much in the "Shut up, it works fine for Super Mario 64 so it works" mindset it seems.

    This is true even of official emulators! The Xbox emulator that ran on the Xbox 360 has many games that are "officially supported" with serious issues. Forza Motorsport 1 has weird slowdowns on key tracks. I understand the serious hardware difficulties but I still wish emulation accuracy was an option.

  • I suspect we will see a proliferation of emulator development in the next few years.

    In a lot of ways, emulators are the perfect problem for vision/LLMs. It's like all those web browser projects popping up on HN. You have a very well-defined problem with existing reference test cases. It's not going to be fun for Nintendo's lawyers in the future when everybody can crowdfund an emulator by simply running a VLM against a screen recording of gameplay (barring non-deterministic elements).

    They can't oppress the software engineering masses any longer through lawfare.

  • I gave up video games, but I remember that being a huge reason why I picked Android a decade + ago. Emulators :D

    Apparently the iPhone allows it now. Eventually Apple ships features that are standard elsewhere. Veblen goods...

  • Emulation is amazing for access right now. Recompilation is about making sure MGS2 or GT4 still runs in 2045 on whatever weird hardware we're using then

  • There is so much work hunting down the proper upscaled/improved texture packs though. Supposedly.

  • What the dev of AetherSX2 did to make games run smoothly, even on my midrange 2019 Android phone, is a wonder.

    Too bad the dev is a very emotionally unstable person who abandoned his port, despite his great talent.

On this topic of ports/recomps there's also OpenGOAL [1], a FOSS desktop-native implementation of GOAL (Game Oriented Assembly Lisp) [2], the language Naughty Dog used to develop a number of their famous PS2 titles.

Since they were able to port the language implementation over, they have been able to rapidly start porting these titles even with a small volunteer team.

1. https://opengoal.dev/

2. https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp

  • That's incredible, I had no idea Jak & Daxter was written with Emacs as the primary IDE!

    • Lol yep. Emacs as the IDE, Allegro Common Lisp as the interpreter + HAL implementation, and GOAL itself being a Scheme-like.

      Naughty Dog in general was actually a primarily Lisp studio for a long time. It was only in the PS3 era, with Uncharted and The Last of Us, that they switched to C++, because trying to maximise performance out of a Lisp environment, with the complexity the Cell processor added, on a time and cash budget simply wasn't feasible for them.

      The Crash Bandicoot games were written in GOOL (Game Oriented Object Lisp), which they wrote prior to GOAL and the Jak and Daxter games. GOOL/Lisp of course was extremely important for the Crash Bandicoot legacy, because by writing their own higher-level language they were given an excuse to throw away the entire standard library that Sony gave them and start from scratch. That process allowed them to write a massively more performant stdlib and execution environment, leading to Crash Bandicoot being able to support game environments an order of magnitude more complex than other games at the time could. And of course this allowed them to build in a system for lazy loading the environment as the player progressed through the levels, which firmly cemented Naughty Dog in the video game history books.

      Andy Gavin actually has an incredible blog site (including a 13 part series on Crash Bandicoot and a 5 part series on Jak and Daxter) that has over the decades documented the history of their studio's game development process and all the crazy things they did to make their games work on hardware where it really shouldn't have been able to with the tools they were provided.

      https://all-things-andy-gavin.com/video-games-archive/

      3 replies →

  • If PS2Recomp ends up giving us even a fraction of what OpenGOAL unlocked for Jak and Daxter, it could be a huge deal for the rest of the PS2 catalog

    • Absolutely. OpenGOAL really just set a new standard for what games preservation looks like.

      It's incredible seeing the community taking a 25 year old game, modernising it with accessibility features and quality of life, and even creating entirely new expansions to the game [1].

      Like beyond just keeping the game preserved on modern platforms, it's keeping the spirit of the game and the community attached to it alive as well in a way that it can continue to evolve and grow.

      I can only pray that PS2Recomp makes this a fraction as accessible to other games from this era.

      Oh, and a similar project on the Nintendo side of the world is Ship of Harkinian by HarbourMasters [2], built on the work of the Zelda RE Team [3]. Zelda RET have decompiled half of the Zelda games and are well on their way to decompiling and reverse engineering the other half. And HarbourMasters have taken these decomps and used them as the groundwork for building comprehensive ports and remasters of the original games, to a degree that fans could only dream first-party remasters and ports would attempt.

      1. https://www.youtube.com/watch?v=PIrSHz4qw3Q

      2. https://www.shipofharkinian.com/

      3. https://zelda.deco.mp/

> So yes, currently playing PS2 games on PC via emulator is still absolutely fantastic, but native ports would be the holy grail of game preservation.

I would think that emulation of the original game as closely as possible would be the gold standard of preservation, and native ports would be a cool alternative. As described in the article, native ports are typically not faithful reproductions but enhanced to use the latest hardware.

  • Indeed, the focus for preservation would be to increase the accuracy of emulators.

    PCSX2 is pretty good today in terms of running games (the list of games it does not run is in the single digits), but it's far from accurate to the hardware.

    Porting to current systems via recompilation is cool, but it has very little to do with preservation.

I absolutely love the idea!

As a movie geek I'm personally offended when someone says "oh, it's from 2017, it's an old movie!", or "I don't want to see anything from 90s, yuck" - and that's pretty common.

Of course, "Nosferatu, eine Symphonie des Grauens" is not for everyone, but I firmly believe that you can watch the new Dune and Lawrence of Arabia back to back and have similarly enjoyable time.

Fallout 1 and 2 are miles ahead of Fallout 3 (mostly due to the uncanny valley phenomenon). Sure, the medium has changed a lot and modern consumers are used to a more streamlined experience - my favorite example is the endless stream of Baldur's Gate "modern reimplementations" or rehashes, like Pillars of Eternity, that were too close to the original source, and then, suddenly, someone came up with Divinity, basically a Baldur's Gate clone but with modern UI and QoL improvements.

But consoles are different.

This can truly be a window for the next generation to look back in the past.

  • > As a movie geek I'm personally offended when someone says "oh, it's from 2017, it's an old movie!", or "I don't want to see anything from 90s, yuck" - and that's pretty common.

    Now I feel old. I was thinking you might say 1960 or something.

    • I recently had a chat with a colleague who had never heard of Quake.

      He had also never watched Lock, Stock and Two Smoking Barrels.

      And Half-Life is just something-something-let-me-check.

      Oh, well...

90% of the PS2’s floating point throughput is in the two vector units, not the R5900 conducting them. Concentrating on that, as the article does, seems as futile as focussing on the 68000 rather than the Amiga PAD in a 16-bit context (ignoring the EE’s 16-bit RAMBUS bottleneck).

However that approach will probably suit the least-ambitious PC-ports to PS2 (by studios that didn’t appreciate the difference) - rather as an ST emulator was a short cut to run the simplest Amiga games.

  • Hey! I can speak here.

    Back in the day, I wrote a simulator for the PS2’s vector units because Sony did not furnish any debugger for them. A month after I got it working, a Sony 2nd party studio made their VU debugger available to everyone… Anyway…

    The good news is that the VU processors are actually quite simple as far as processors go. Powerful. Complicated to use. But, not complicated to specify.

    This is made much simpler by the fact that the only documentation Sony provided was written by the Japanese hardware engineers. It laid out the bit-by-bit details of the instruction set. And, the bitwise inputs, outputs, delays and side effects of each instruction.

    No guidance on how to use it. But, awesome docs for writing a simulator (or recompiler).
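
    That kind of bit-level spec really is most of what the instruction-fetch side of a simulator needs. As a rough illustration in C++ (the helpers below are generic assumptions, not Sony's layout beyond the well-known split of each 64-bit VU word into an upper and a lower instruction):

    ```cpp
    #include <cstdint>

    // Each 64-bit VU instruction word pairs an "upper" (FMAC) and a "lower"
    // (FDIV/IALU/branch/load-store) instruction that execute in parallel.
    // The field extraction below is illustrative, not the manual's exact layout.
    struct VuInstrPair {
        uint32_t upper;
        uint32_t lower;
    };

    inline VuInstrPair split_vu_word(uint64_t word) {
        return { static_cast<uint32_t>(word >> 32),           // upper instruction
                 static_cast<uint32_t>(word & 0xffffffffu) }; // lower instruction
    }

    // Generic bitfield extractor: take `count` bits (count < 32) starting at `lo`.
    inline uint32_t bits(uint32_t instr, unsigned lo, unsigned count) {
        return (instr >> lo) & ((1u << count) - 1u);
    }
    ```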

  • I think you're actually describing why recompilation is interesting here rather than why it's futile

An application of the first Futamura projection. https://en.wikipedia.org/wiki/Partial_evaluation

  • Is it? It would be if it partially evaluated a MIPS emulator on a particular game. But it doesn't seem to work like that.

    • "Decoding the MIPS R5900 instructions in each function Translating those instructions to equivalent C++ code Generating a runtime that can execute the recompiled code The translated code is very literal, with each MIPS instruction mapping to a C++ operation." It sounds like a MIPS interpreter that gets statically unrolled.

      1 reply →

Emulation is already amazing. What can be done with recompilation is magic: https://github.com/Zelda64Recomp/Zelda64Recomp

  • So… What’s the magic? (In theory, interpretation/emulation and compilation should produce identical behavior.)

    • The magic is that now you can modify the source code of the game and recompile that.

      Folks have been optimizing Super Mario 64 to run much faster on actual N64 hardware. And there is a project that has ported it to run on the PlayStation 1. That's much weaker hardware that has no hope of emulating the N64.

    • Identical behavior, sure, but much less overhead and fewer restrictions on e.g. resolution than you'd get on a general purpose emulator

I wonder how they will tackle the infamous non-conformant PS2 floating-point behavior, which is the biggest hurdle in emulating the PS2 accurately.
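
For context on why that's hard: the EE's FPU and the VUs don't implement IEEE 754 - there are no NaNs or infinities, and out-of-range results saturate to the largest representable magnitude - so code translated naively to host floats can drift from the original behavior. Emulators like PCSX2 expose "clamping" modes for exactly this; below is a minimal sketch of the idea, not any project's actual code:

```cpp
#include <cmath>
#include <limits>

// The PS2's FPU/VUs produce no NaN or infinity: results saturate to the
// largest representable float instead. On an IEEE-754 host, a recompiler or
// emulator can approximate that by clamping after operations that might
// overflow. Real emulators are more selective about where they pay this cost.
inline float ps2_clamp(float x) {
    if (std::isnan(x) || std::isinf(x)) {
        // Keep the sign, saturate the magnitude to FLT_MAX.
        return std::copysign(std::numeric_limits<float>::max(), x);
    }
    return x;
}

// A translated multiply then becomes:
inline float ps2_mul(float a, float b) {
    return ps2_clamp(a * b);
}
```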

I hope the Steam Machine 2.0 can be a good target for developers for years to come, like the PS2 was.

As far as I know, static recompilation is thwarted by self-modifying code (primarily JITs) and by the ability to jump to arbitrary code locations at runtime.

The latter means that, even in the absence of a JIT, you would need to achieve 100% code coverage (akin to unit testing or fuzzing) to perform static recompilation; otherwise you need to compile code at runtime, at which point you're back to state-of-the-art emulation with a JIT. The only real downside of JITs is the added latency, similar to the lag induced by shader compilation, but this could be addressed by having a smart code cache instead. That code cache realistically only needs to store a trace of potential starting locations; then the JIT can compile the code before starting the game.
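
A rough sketch of the hybrid that last paragraph implies: run statically recompiled blocks where coverage exists, and fall back to a runtime compiler (or interpreter) for targets only discovered while playing, caching the result so the latency is paid once. All names and types here are illustrative assumptions:

```cpp
#include <cstdint>
#include <unordered_map>

struct GuestCpu;                       // hypothetical guest CPU state
using BlockFn = void (*)(GuestCpu&);

// Filled at build time by the static recompiler, extended at runtime.
std::unordered_map<uint32_t, BlockFn> code_cache;

// Stand-in for a real JIT: translate the block starting at `pc`.
BlockFn compile_block_at_runtime(uint32_t pc) {
    (void)pc;
    return +[](GuestCpu&) { /* interpret or run freshly JITted code */ };
}

void dispatch(GuestCpu& cpu, uint32_t target_pc) {
    auto it = code_cache.find(target_pc);
    if (it == code_cache.end()) {
        // Target not covered by the static pass: compile it now, remember it.
        it = code_cache.emplace(target_pc,
                                compile_block_at_runtime(target_pc)).first;
    }
    it->second(cpu);  // run the (statically or dynamically) recompiled block
}
```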

  • Yes, but in practice that isn't a problem. People do write self-modifying code and jump to random places today, but it is much less common than in the past. It is safe to say that most games are developed and run on the developer's PC and then ported to the target system. If developers know the target system they will make sure it works on that system from day one, but most will prefer running their latest changes on their current system over sending them to the target system. If you really need to take advantage of the hardware you can't do this, but most games don't.

    Many games are written in a high-level language (like C...) which doesn't give you easy access to self-modifying code (even higher-level languages like Python do, but they are not compiled and so not part of this discussion). Likewise, jumping to arbitrary code is limited to function calls for most programmers.

    Many games just run on a game engine, and the game engine is something we can port or rewrite to other systems and then enable running the game.

    Be careful of the above: most games don't become popular. It is likely the "big ticket" games people are most interested in emulating had the development budget and the need to take advantage of the hardware in the hard ways. That is, the small minority of exceptions are the ones we care about the most.

    • This is PS2 emulation, where most engines were still bespoke and every hack in the book was still on the table.

  • JIT isn't _that_ common in games (although it is certainly present in some, even from the PS2 era), but self-modifying or even self-referencing executables were a common memory-saving trick that lingered into the PS2 era - binaries that would swap different parts in and out from disk were quite common, and some developers kept using really old-school space-saving tricks like reusing partial functions as code gadgets, although this was dying out by the PS2 era.

    Emulation actually got easier after around the PS2 era because hardware got a little closer to commodity and console makers realized they would need to emulate their own consoles in the future and banned things like self-modifying code as policy (AFAIK, the PowerPC code segment on both PS3 and Xbox 360 is mapped read only; although I think SPE code could technically self-modify I'm not sure this was widespread)

    The fundamental challenges in this style of recompilation are mostly offset jump tables and virtual dispatch / function pointer passing; this is usually handled with some kind of static analysis fixup pass to deal with jump tables and some kind of function boundary detection + symbol table to deal with virtual dispatch.
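
    For the jump-table / function-pointer case, one common shape of the fix is to register every recompiled function under its original guest address and route indirect jumps through a lookup, roughly like this (all names and addresses below are made up for illustration):

    ```cpp
    #include <cstdint>
    #include <stdexcept>
    #include <unordered_map>

    struct R5900Context { uint64_t gpr[32]; };   // hypothetical guest state
    using RecompiledFn = void (*)(R5900Context&);

    // Recompiled functions, emitted by the static pass (bodies elided).
    void func_001d2a80(R5900Context&) { /* ... */ }
    void func_001d2b40(R5900Context&) { /* ... */ }

    // Guest address -> host function, built from function-boundary analysis.
    const std::unordered_map<uint32_t, RecompiledFn> g_function_table = {
        {0x001d2a80, &func_001d2a80},
        {0x001d2b40, &func_001d2b40},
    };

    // What an indirect call like "jalr $t9" gets translated into.
    void call_indirect(R5900Context& ctx, uint32_t guest_addr) {
        auto it = g_function_table.find(guest_addr);
        if (it == g_function_table.end())
            throw std::runtime_error("indirect call into unanalyzed code");
        it->second(ctx);
    }
    ```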

  • How many PS2-era games used JIT? I would be surprised if there were many of them - most games for the console were released between 2000 and 2006. JIT was still considered a fairly advanced and uncommon technology at the time.

    • A lot of PS2-era games unfortunately used various self-modifying executable tricks to swap code in and out of memory; Naughty Dog games are notorious for this. This got easier in the Xbox 360 and PS3 era where the vendors started banning self-modifying code as a matter of policy, probably because they recognized that they would need to emulate their own consoles in the future.

      The PS2 is one of the most deeply cursed game console architectures (VU1 -> GS pipeline, VU1 microcode, use of the PS1 processor as IOP, etc) so it will be interesting to see how far this gets.

      2 replies →

    • I'd say practically none; we were quite memory-starved most of the time, and even regular scripting engines were a hard sell at times (perhaps more so due to GC than interpretation performance).

      Games on PS2 were C or C++ with some VU code (asm or some specialized HLL) for most parts, often with Lua (due to low memory usage) or similar scripting added for minor parts, with bindings to native C/C++ functions.

      "Normal" self-modifying code went out of favour a few years earlier, in the early-to-mid 90s, and was perhaps more useful on CPUs like the 6502 or x86 that had few registers, where patching constants directly into inner loops was useful (the PS2's MIPS CPU has plenty of registers, so no need for that).

      However, by the mid/late 90s CPUs like the PPro already added penalties for self-modifying code, so it was already frowned upon; also, PS2-era games often ran with PC versions side by side, so you didn't want more platform dependencies than needed.

      Most PS2 performance tuning we did was around resources/memory, VU and helped by DMA-chains.

      Self modifying code might've been used for copy-protection but that's another issue.

  • I believe the main interest in recompilation is in using the recompiled source code as a base for modifications.

    Otherwise, yeah, a normal emulator JIT basically points a recompiler at each jump target encountered at runtime, which avoids the static analysis problem. AFAIK translating small basic blocks and not the largest reachable set is actually desirable since you want frequent "stopping points" to support pausing, evaluating interrupts, save states, that kind of stuff, which you'd normally lose with a static recompiler.

See also: XenonRecomp, which does the same thing for Xbox 360, and N64:Recompiled which does the same thing for N64.

Note that this "recompilation" and the "decompilation" projects like the famous Super Mario 64 one are almost orthogonal approaches, in a way that the article failed to understand; this approach turns the assembly into C++ macros and then compiles the C++ (so basically using the C++ compiler as a macro re-assembler / emulation recompiler in a very weird way). The Super Mario 64 decompilation (and OpenRCT2 and so on) instead uses the output of an actual decompiler, which attempts to reconstruct C from assembly, and then modifies that code accordingly (basically, converting the game's object code back into some semblance of its source code, which this approach does NOT do).
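
A tiny made-up example of the difference (the function and snippet are hypothetical, not from any real game): the recompiler path emits literal register operations against an emulated context, while the decompilation path aims to recover readable source that still compiles back to the same object code.

```cpp
// Recompiler output (literal; the C++ compiler is used as a re-assembler):
//     ctx.gpr[2] = static_cast<int32_t>(ctx.gpr[4] + ctx.gpr[5]); // addu $v0,$a0,$a1
//     ctx.pc     = static_cast<uint32_t>(ctx.gpr[31]);            // jr   $ra
//
// Decompilation output (reconstructed source a human could maintain and mod):
int add_scores(int base, int bonus) {
    return base + bonus;
}
```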

Even with tooling, isn't this kind of effort going to be enormously complicated? Because, as I understand it, so many aspects of the behaviour of 8-bit and console games relied on all sorts of clever techniques, timing quirks and other side effects.

Won't it be very difficult for the recompilation process or the dev to recognise when these are being relied on and to match the key behaviour?

Or is the idea to pull out the basics of the game structure in a form that runs on modern hardware, and then the dev fleshes out the missing parts?

  • From the last time I looked at these recompilation projects, they take the assembly and basically convert each opcode back into an LLVM instruction and then recompile from there. This comes with a lot of caveats, though: last time I looked, you still needed a function map of some kind to tell you where all the functions are, and you still needed to replace chunks of the generated code afterwards for things like rendering, sound, controller input, and just about anything that interacts with the world outside of the CPU.

    Edit: After some reading on the GitHub page, it seems they are translating it to C++ instead of using LLVM directly, but the original idea still holds. They aren't decompiling it to C++ that looks like original source code; it's more like they're converting it to C++ that emulates the processor and gets statically compiled.

    So it's not really just a drop-and-go replacement like it sounds like it'd be, but it has so far enabled the recompilation of multiple N64 games. This seems like an extension into the PS2 space.

    Side note: the PS2 is a 32-bit console with a 64-bit ALU (and some 128-bit SIMD) [1], so a lot of the weird tricks from the 8-bit days weren't really used here. Not that there aren't weird tricks; just that things like using undocumented instructions and racing the beam are less prevalent. I could be wrong here - I was growing up at this time, not making games. All of this is just from research I've done in the past. Someone feel free to correct me if I'm wrong.

    [1] https://en.wikipedia.org/wiki/PlayStation_2_technical_specif...

  • I suspect many PS2 games were mostly written in C or C++, which greatly simplifies things. Games that used Sony's interface libraries for e.g. managing input will be simpler to handle as well.

    One notable exception is the Jak and Daxter games, which were written in GOAL[1], but have their own recompilation project.

    1: https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp

I think these kinds of things will bring problems.

Because Nint€ndo or $ony (and other game companies) have a big problem: their old games are awesome, and if people can play those games, then people will be happy and will not need new games or new sagas.

The problem is not people playing old games; the real problem is that people will not pay for new games.

And we know that these companies have armies of lawyers (and "envelopes" to distribute among politicians) to change the laws and make illegal something that is not illegal.

  • I have very little sympathy for these companies. They make access to these titles difficult if not impossible. Preservation matters and they have no interest in anything but the bottom line today.

  • It’s interesting, because they could potentially make millions, if not billions by selling access to them as a subscription.

I have a samurai game, Kengo 3, that I really liked on PS2. I still have that disc at my parents'. Can anyone recommend a PS2 emulator?

One thing I never understood on these recompilation projects is how static recomp translates the graphics between radically different architectures. Even with the supporting libraries, this translation shouldn't "just work"

This sounds very cool, but I can practically hear the IP lawyers sharpening their buzz-axes...

  • They haven't been all that aggressive against the decompile/recompile projects, interestingly. They're sometimes/often set up so you need the original to grab assets etc., but that code is copyrighted too and I'd have to imagine a decompile that purposely compiles to an identical binary would be a derivative work.

    My best guess is that for them it's not worth the hassle or any possibility of a negative result in court as long as people have to jump through some hoops by providing an original, and for the projects that don't do that, you have very straightforward easy infringement cases without even getting into the decomp stuff. Though really even ROMs seem to be tacitly tolerated to some extent lately. Maybe there's an attitude that keeping people involved with the franchise is worth it, again so long as it doesn't become too easy.

  • Sony have actually been fairly chill about emulators etc. so I'd be surprised if lawyers got involved here.

    They actually used an open source Playstation emulator when they released the "Playstation Classic" in 2018.

Interesting. There was a similar project that did this for JVM bytecode - XMLVM, IIRC.

I’ve been meaning to start decompiling one of my favorite games of the era (Hulk Ultimate Destruction) after watching the decomp of other games. Perhaps this is a sign to start?

This is amazing for preservation. Being able to run these classics on modern hardware with native recompilation is a huge step forward.

My all time favorite console. I keep coming back to it. This to me is a fantastic way to preserve gaming history.

I've been working on decompiling Dance Central 3 with AI and it's been insane. It's an Xbox 360 game that leverages the Kinect to track your body as you dance. It's a great game, but even with an emulator it's still dependent on the Kinect hardware, which is proprietary and in limited supply.

Fortunately, a Debug build of this game was found on a dev unit (somehow), and that build does _not_ have the crazy optimizations in place (link-time optimization) that would make this feat impossible.

I am not somebody that is deep on low level assembly, but I love this game (and Rock Band 3 which uses the same engine), and I was curious to see how far I could get by building AI tools to help with this. A project of this magnitude is ... a gargantuan task. Maybe 50k hours of human effort? Could be 100k? Hard to say.

Anyway, I've been able to make significant progress by building tools for Claude Code to use and just letting Haiku rip. Honestly, it blows me away. Here is an example that is 100% decompiled now (they compile to the exact same code as in the binary the devs shipped).

https://github.com/freeqaz/dc3-decomp/blob/test-objdiff-work...

My branch has added over 1k functions now[0]. Some of it is slop, but I wrote a skill that's been able to get the code quite decent with another pass. I even implemented VMX128 (custom 360-specific CPU instructions) support in Ghidra and m2c to allow them to decompile more code. It blows my mind that this is possible with just hours of effort now!

Anybody else played with this?

0: https://github.com/freeqaz/dc3-decomp/tree/test-objdiff-work...

What makes this exciting isn't just "PS2 games on PC" - we already have that via PCSX2. The big deal is moving from emulation to reconstruction.

What's the best PS2 game of all time?

Side note: are we at the level where tech blogs and news sites can't even write <a href> links properly?

2 out of 4 links in the article are messed up, which is mind-boggling... on a tech blog!

Is that how deep we've sunk - leaving things broken just to assert it wasn't written by AI?

  • A more accurate version of the famous idiom:

    Those who can, do (and sometimes become teachers when they get older). Those who can’t become journalists.