These examples are amazing, though they're compute heavy and built with WebGL, which is a less-than-ideal fit for the task. This website and project have been around for a couple of years, and the web's graphics capabilities have grown since, bumped up significantly by the introduction of WebGPU. And since Firefox shipped support in version 141 on Windows and 145 on macOS (see the wiki tracking implementation status: https://github.com/gpuweb/gpuweb/wiki/Implementation-Status), it now also enjoys broad cross-platform support (Windows, macOS, Android, iOS, with Linux trailing a bit behind).
I've recently written about another compute heavy global illumination approach, which is all but impossible to pull off using WebGL: https://juretriglav.si/surfel-based-global-illumination-on-t...
After this experiment (and some that are in progress), and some very recent movement on raising the bound storage buffer limit in Chrome (https://issues.chromium.org/issues/366151398), I can't help but feel that we're on the cusp of an AAA-level experience built exclusively on the web. I'm super excited for the future of computer graphics right in your browser.
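For anyone who wants to check what their browser currently grants: the limit is queryable on the adapter and has to be explicitly requested at device creation, since the default grants are conservative. A minimal sketch using the standard WebGPU API (run in a browser module; nothing here is project-specific):

```ts
// Query the adapter's storage-buffer limit and opt in to the full value
// when creating the device; the default grant is lower than the hardware cap.
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) throw new Error("WebGPU not available");

console.log(
  "max storage buffers per shader stage:",
  adapter.limits.maxStorageBuffersPerShaderStage,
);

const device = await adapter.requestDevice({
  requiredLimits: {
    maxStorageBuffersPerShaderStage:
      adapter.limits.maxStorageBuffersPerShaderStage,
  },
});
```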
So nice to see someone else this enthusiastic about ray tracing! I haven't done a comparable amount of work in this field, but as a hobby it fascinates me a lot!
One common misconception is that ray tracing is computationally prohibitive. It used to be, but no longer; it's within our reach, especially when there's a GPU with hardware acceleration for ray casting.
Many games already use ray tracing for parts of the scene processing, and of course they all work in real time. My favourite example is Metro Exodus, whose ray-traced global illumination runs pretty well on last-gen graphics hardware. Not all games use the technology efficiently, but the trend is already obvious: with accessible real-time ray tracing, rendering a scene will become a much easier task.
P.S. I wrote "ray tracing" where, strictly speaking, I should have said "path tracing", but I prefer a single term that encompasses the whole technology with all its variants.
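For anyone curious about the difference: a ray caster stops at the first hit, while a path tracer keeps following randomly sampled bounces, Monte Carlo estimating the light transport. Here is a toy grayscale sketch of that core loop (a single diffuse sphere under a constant sky; purely illustrative, not anyone's production code):

```ts
// Toy grayscale path tracer core: one diffuse sphere under a constant sky.
type Vec3 = { x: number; y: number; z: number };
const v = (x: number, y: number, z: number): Vec3 => ({ x, y, z });
const add = (a: Vec3, b: Vec3): Vec3 => v(a.x + b.x, a.y + b.y, a.z + b.z);
const sub = (a: Vec3, b: Vec3): Vec3 => v(a.x - b.x, a.y - b.y, a.z - b.z);
const scale = (a: Vec3, s: number): Vec3 => v(a.x * s, a.y * s, a.z * s);
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const norm = (a: Vec3): Vec3 => scale(a, 1 / Math.sqrt(dot(a, a)));

const CENTER = v(0, 0, -3); // sphere position
const RADIUS = 1;

// Ray-sphere intersection: returns hit distance along dir, or Infinity.
function hitSphere(orig: Vec3, dir: Vec3): number {
  const oc = sub(orig, CENTER);
  const b = dot(oc, dir);
  const disc = b * b - (dot(oc, oc) - RADIUS * RADIUS);
  if (disc < 0) return Infinity;
  const t = -b - Math.sqrt(disc);
  return t > 1e-4 ? t : Infinity;
}

// Cosine-weighted direction in the hemisphere around normal n
// (normal plus a uniform point on the unit sphere).
function sampleHemisphere(n: Vec3): Vec3 {
  for (;;) {
    const p = v(Math.random() * 2 - 1, Math.random() * 2 - 1, Math.random() * 2 - 1);
    const len2 = dot(p, p);
    if (len2 < 1 && len2 > 1e-9) return norm(add(n, norm(p)));
  }
}

// The "path" part: instead of stopping at the first hit like a ray caster,
// keep bouncing along random diffuse directions, accumulating reflectance.
function radiance(orig: Vec3, dir: Vec3, depth: number): number {
  if (depth > 4) return 0;            // hard bounce cap
  const t = hitSphere(orig, dir);
  if (t === Infinity) return 1.0;     // constant "sky" emission
  const hit = add(orig, scale(dir, t));
  const n = norm(sub(hit, CENTER));
  const albedo = 0.7;                 // diffuse reflectance
  // Cosine-weighted sampling cancels the cosine/pdf terms for a Lambertian
  // surface, leaving just albedo * incoming radiance.
  return albedo * radiance(hit, sampleHemisphere(n), depth + 1);
}

// Averaging independent samples is what "samples per pixel" controls.
let sum = 0;
const SPP = 64;
for (let i = 0; i < SPP; i++) sum += radiance(v(0, 0, 0), v(0, 0, -1), 0);
console.log("pixel radiance ≈", (sum / SPP).toFixed(3));
```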
This is neat. In the demos I would suggest making mouse/finger drag orbit the camera around the scene instead of panning. Panning can be done by a 2D image transformation so it doesn't show off the 3D nature of the renderer.
Revisiting this and I have to say the sheer number of demos on this page is incredible. I love this one in particular as I remember this classic scene from Veach's thesis: https://erichlof.github.io/THREE.js-PathTracing-Renderer/Bi-...
If you have a good GPU the default parameters underutilize it. This demo lets you crank up the "samples per pixel" parameter to get good quality in real time: https://erichlof.github.io/THREE.js-PathTracing-Renderer/Mul...
I second the vote for orbit cam! Add double-click to choose the orbit point, and add a zoom control that is proportional to distance to orbit point, and it suddenly gets insanely easy to navigate the scene and find good views. It’s too hard to control using translate and look-around angles.
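For what it's worth, the state an orbit camera needs is tiny. A rough sketch (spherical coordinates around a pickable target, with wheel zoom scaling the radius so the speed stays proportional to the distance, as suggested above; all names here are illustrative, not the project's API):

```ts
// Minimal orbit-camera state: spherical coordinates around a target point.
interface OrbitState {
  target: [number, number, number]; // orbit point (e.g. set by double-click pick)
  radius: number;                   // distance from target
  theta: number;                    // azimuth angle (radians)
  phi: number;                      // polar angle (radians)
}

// Drag rotates around the target instead of translating the camera.
function orbit(s: OrbitState, dx: number, dy: number, speed = 0.005): void {
  s.theta -= dx * speed;
  // Clamp phi so the camera never flips over the poles.
  s.phi = Math.min(Math.PI - 0.01, Math.max(0.01, s.phi - dy * speed));
}

// Zoom proportional to distance: each wheel step scales the radius,
// so navigation stays controllable both near and far from the target.
function zoom(s: OrbitState, wheelSteps: number): void {
  s.radius *= Math.pow(1.1, wheelSteps);
}

// Convert spherical coordinates back to a camera position each frame.
function cameraPosition(s: OrbitState): [number, number, number] {
  const [tx, ty, tz] = s.target;
  return [
    tx + s.radius * Math.sin(s.phi) * Math.sin(s.theta),
    ty + s.radius * Math.cos(s.phi),
    tz + s.radius * Math.sin(s.phi) * Math.cos(s.theta),
  ];
}
```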
The demos I've tried so far have translate, not pan, and those are fully 3D…
I recently wrote one in WebGPU, too:
https://github.com/ivanjermakov/moonlight
Reminds me of the old POV-Ray stuff I did in the early 1990s. But... in real time and in my browser. WTF!
Nice historical recreations, but I can't believe there was no '1984' image from Thomas Porter there: https://graphics.pixar.com/library/DistributedRayTracing/ind... (and I can definitely remember a few more!)
This is impressive. Not quite fast or smooth enough for real-time lighting, but how possible would it be to use this in conjunction with traditional rasterisation for GI?
Pretty great demos, and they do indeed run well on my phone; I suspected it might be an AI thing because of the tautology in the title, but it seems hand-written.
Particularly cool is the recreation of that classic scene from Kajiya's rendering equation paper, with the glass spheres and caustics.
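For reference, the equation from that paper (in its common hemispherical form) is exactly what a path tracer estimates by random sampling:

$$L_o(x,\omega_o) = L_e(x,\omega_o) + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(\omega_i \cdot n)\,\mathrm{d}\omega_i$$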
There is another quite mature path tracer that will render most Three.js scenes:
https://github.com/gkjohnson/three-gpu-pathtracer
Demos here:
https://gkjohnson.github.io/three-gpu-pathtracer/example/bun...
This one is also an official Three.js example:
https://threejs.org/examples/?q=path#webgl_renderer_pathtrac...
Damn, that's really impressive.
It's very interesting, and I'm also impressed that most of the demos run on my potato phone.
Lots of cool demos.
Huh. I've seen Space/Shift-or-Ctrl, Z/X, and Q/E for up/down movement... but never Q/Z.