Hi everyone,
I'm making a game engine that uses rollback netcode for its multiplayer architecture. As far as I can tell, no existing physics engine supports incremental rollback. This means the entire physics engine state has to be snapshotted every frame, which makes large worlds with rollback netcode basically infeasible. I've made a physics engine which only snapshots the changes, so now I think you can have large worlds, as long as most of the world is static. I think that's true in most cases: when you're walking around a big spaceship, for example, all the walls, tables, control panels etc. don't really move. I wrote up a post to describe some of the cool things I discovered while making my own physics engine.
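To make the idea concrete, here's a minimal sketch of change-only snapshotting (my own illustration, not Easel's actual code; all names are invented). Each frame records only the bodies that were written, so rolling back touches just those, not the whole world:

```typescript
// Sketch: instead of copying the whole world each frame, record only the
// bodies that changed, then restore by replaying those records newest-first.
type BodyState = { x: number; y: number; vx: number; vy: number };

class IncrementalHistory {
  // One map per frame: bodyId -> state *before* that frame modified it.
  private frames: Map<number, BodyState>[] = [];

  beginFrame(): void {
    this.frames.push(new Map());
  }

  // Call before mutating a body; only the first write per frame is recorded.
  recordWrite(id: number, prev: BodyState): void {
    const frame = this.frames[this.frames.length - 1];
    if (!frame.has(id)) frame.set(id, { ...prev });
  }

  // Undo the last n frames by restoring the saved pre-states.
  rollback(n: number, world: Map<number, BodyState>): void {
    for (let i = 0; i < n; i++) {
      const frame = this.frames.pop();
      if (!frame) break;
      for (const [id, prev] of frame) world.set(id, prev);
    }
  }
}
```

If 10 of 10,000 bodies move in a frame, that frame's history holds 10 entries instead of 10,000, which is why a mostly-static spaceship interior becomes cheap to roll back.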
How big is the state that you want to roll back?
How does it compare with deterministic physics engines, given that their appeal for multiplayer is that they can perform rollback?
This physics engine is a deterministic physics engine. It has to be to make sure that when it rolls back and resimulates forward, it gets the same answer on all machines.
The determinism is partly possible because WebAssembly is deterministic (except for a few known cases https://github.com/WebAssembly/design/blob/main/Nondetermini...), and partly because I’m making sure to use my own trigonometric functions, and the entire game simulation is executed single threaded with a known order of execution.
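For anyone curious what "my own trigonometric functions" could look like, here's a sketch (invented for illustration, not Easel's actual implementation). It uses only IEEE-754 add/mul/div, which WebAssembly specifies exactly, so every machine gets bit-identical results regardless of the host's libm:

```typescript
// Deterministic sine via a fixed-length Taylor series. Only uses operations
// whose results are exactly specified by IEEE 754, so the answer is
// bit-identical on every machine.
const TWO_PI = 6.283185307179586;

function detSin(x: number): number {
  // Range-reduce to [-pi, pi]; Math.floor is exactly specified too.
  x = x - Math.floor(x / TWO_PI + 0.5) * TWO_PI;
  // sin(x) = x - x^3/3! + x^5/5! - x^7/7! + x^9/9! - x^11/11!
  const x2 = x * x;
  let term = x;
  let sum = x;
  for (let n = 1; n <= 5; n++) {
    term = (-term * x2) / ((2 * n) * (2 * n + 1));
    sum += term;
  }
  return sum;
}
```

The same approach (fixed iteration count, no platform math functions) works for cos, atan2, sqrt and friends.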
If you meant to ask how it compares to non-deterministic physics engines: they might be faster on the physics itself, but they would be slower on the rollback, and I think in most reasonably-sized games the slow rollback would dominate, so they would be slower overall. But you wouldn’t make a rollback netcode game with a world that size anyway, at least maybe not until now, so it’s a bit of a false comparison. They’re each good at their own use cases.
Have you considered using delta compression on the snapshots? Like the internal state of the physics simulation, most of the game state itself doesn't change between frames. Using delta compression on the whole structure is doable.
I was also curious about this, and I don't think the other replies understood what was being suggested.
If I understood correctly, the aim of the engine is to lower the in-memory size of the history of game states, by only snapshotting the delta. I'm also curious what would happen if, instead, you'd just run any deterministic snapshottable physics engine, and delta compressed the history on the fly. I think this is how, for example, Braid works.
Might be that it doesn't work, that running the delta check on two big enough snapshots would be too slow, and that's what this engine fixes. But would love to hear if it was considered.
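For concreteness, the alternative being suggested is something like this (my sketch, assuming snapshots are fixed-size byte arrays): take full snapshots, but store each frame as a sparse diff against the previous one.

```typescript
// Sketch: store history as sparse byte diffs between consecutive snapshots.
type Diff = { offset: number; byte: number }[];

function diff(prev: Uint8Array, next: Uint8Array): Diff {
  const out: Diff = [];
  for (let i = 0; i < next.length; i++) {
    if (prev[i] !== next[i]) out.push({ offset: i, byte: next[i] });
  }
  return out;
}

function apply(base: Uint8Array, d: Diff): Uint8Array {
  const out = Uint8Array.from(base);
  for (const { offset, byte } of d) out[offset] = byte;
  return out;
}
```

Note the catch you mention: even when nothing changed, `diff` still scans the full snapshot every frame, O(world size), whereas tracking changes at write time (what this engine does) costs O(changes).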
The thing the author is trying to solve for here is reducing the amount of CPU used on the client when it rolls back the simulation and re-simulates to keep server authority.
He does this by rolling back and re-simulating only a subset of the world, greatly reducing the amount of CPU required. It's cool that he's approaching this by adding support for it in the physics engine itself, vs. making it something that each game has to do itself.
Delta compression is an unrelated technique which reduces the amount of bandwidth sent from server to client, by sending only the differences between the snapshot at a baseline frame n and the current snapshot at frame m on the server.
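A toy version of that bandwidth-side technique, to show how it differs (entity fields and names are invented here for illustration): the server diffs the current frame against the last snapshot the client acknowledged and sends only the entities that changed.

```typescript
// Sketch: server-side delta encoding against an acknowledged baseline frame.
// Simplified: entity deletion is not handled.
type Snapshot = Map<number, { x: number; y: number }>;
type Packet = {
  baseFrame: number; // frame n the client already has
  frame: number;     // current frame m on the server
  changed: [number, { x: number; y: number }][];
};

function encode(baseFrame: number, base: Snapshot, frame: number, cur: Snapshot): Packet {
  const changed: Packet["changed"] = [];
  for (const [id, s] of cur) {
    const b = base.get(id);
    if (!b || b.x !== s.x || b.y !== s.y) changed.push([id, s]);
  }
  return { baseFrame, frame, changed };
}

function decode(base: Snapshot, p: Packet): Snapshot {
  const out = new Map(base);
  for (const [id, s] of p.changed) out.set(id, { ...s });
  return out;
}
```

So it saves network bytes, while the article's technique saves client CPU during re-simulation; different resources entirely.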
Just want to clear this up for anybody trying to follow along. Bringing in delta compression is an unrelated thing (but somewhat similar conceptually). It might confuse people to talk about these things at the same time, if they're really just trying to understand what the author is doing in the article.
cheers
- Glenn
I think this is a cool idea. Well done!
Thank you so much!
As a long-time web/app developer getting into game dev, it feels like I'm entering "the big leagues" of software engineering. Tougher problems, more problems, more _interesting_ problems, and problems without prebuilt solutions. Much more fun than making yet another dashboard.
Combining player control, multiplayer, non-player control, and physics is one of the tougher problems. I got it handled (enough) for my project, but I'd be very interested to read the source if Easel's physics engine gets open-sourced.
I’m in the same boat. I’ve been seriously learning 3D game dev for the past few years, after dabbling for half a decade. I even released my first few tiny 3D games in the last 6 months, and they’ve made hundreds of dollars! That alone was a dream come true.
I picked game dev specifically because I wanted to build some things I had envisioned, and because I found it challenging. In the beginning, each new concept within 3D modeling, optimisation, shaders, physics, lighting, shadows and rendering felt intractable and unmasterable.
Now I have a basic working understanding of nearly everything that goes into traditional 3D game dev, except the very cutting-edge stuff, and I have mastered things I was struggling with 2 years ago.
And recently I felt something that scared me: the feeling that within 2 years, I’ll have lost the excitement and challenge that learning game dev has brought me these past few years. I could see how, to someone experienced, this could become as boring as full-stack web dev was for me.
Does it run Doom?
Fwiw, one case where I've wanted rollback has been input fusion over interface devices with diverse latencies. You might have 10 ms of latency for a keypress, 100 ms for optical tracking, and 1000 ms for speech. So given click+"the red one" (spoken), you might start running click+"the one in front" (the default), and almost a second later roll back and rerun with "the red one". Or for a real example, keypress event handling might branch on optical "pressed where on the keycap" and "by which finger" data, which won't become available for several frames.
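The mechanics of that look a lot like netcode rollback with the remote player swapped for a slow device. A minimal sketch (the "game" here is just a counter, and all names are invented): run ahead on a provisional input, then when the real input arrives for a past frame, rewind to it and resimulate.

```typescript
// Sketch: speculative execution on provisional inputs, with rollback when a
// slower device (e.g. speech) delivers the actual input for a past frame.
type Input = number;

// Stand-in for one frame of deterministic game logic.
function simulate(state: number, input: Input): number {
  return state + input;
}

class InputRollback {
  private inputs: Input[] = [];   // input used at each frame
  private states: number[] = [0]; // states[f] = state entering frame f

  // Advance one frame using a provisional (guessed/default) input.
  step(provisional: Input): number {
    const f = this.inputs.length;
    this.inputs.push(provisional);
    this.states.push(simulate(this.states[f], provisional));
    return this.states[f + 1];
  }

  // A late device corrects the input for a past frame; resimulate forward.
  correct(frame: number, actual: Input): number {
    this.inputs[frame] = actual;
    for (let f = frame; f < this.inputs.length; f++) {
      this.states[f + 1] = simulate(this.states[f], this.inputs[f]);
    }
    return this.states[this.states.length - 1];
  }
}
```

The hard part in practice is the same as in netcode: keeping `simulate` deterministic and cheap enough to rerun a second's worth of frames when the speech result lands.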
That is a cool idea! Could be a great application of rollback!
Is this a demo? A hobby? How long have you been at this?
I’ve been working on this for four years. I’m trying to make this my full time job!
Can the games only be played on the easel site, or can I upload to e.g. itch.io as well (probably with the limitation of being singleplayer only)?
You can upload to itch.io yes: https://easel.games/docs/learn/publishing/export
The export basically creates a page with an HTML iframe in it that embeds the hosted version of your game on easel.games, so that all the multiplayer and leaderboards continue to work.
Thanks for your interest!
What you're describing isn't uploading the game, then, but uploading a stub with a transclusion of the game. I'm not the same commenter, but surely that doesn't answer (what I see as) most of the implications of the original question?