Physically Based Rendering: From Theory to Implementation

5 days ago (pbr-book.org)

Every time I see stuff like this it makes me think about optical design software.

There are applications (Zemax, for example) that are used to design optical systems (lens arrangements for cameras, etc). These applications are eye-wateringly expensive-- like similar in pricing to top-class EDA software licenses.

With the abundance of GPUs and modern UIs, I wonder how much work would be involved for someone to make optical design software that blows away the old tools. It would be ray-tracing, but with interesting complications like accounting for polarization, diffraction, scattering, fluorescence, media effects beyond refraction like birefringence, and electro-optic effects like Kerr and Pockels, etc.

  • This, very much this!

    I do research in a subfield of optics called nonimaging optics (optics for energy transfer, e.g. solar concentrators or lighting systems). We typically use these optical design applications, and your observations are absolutely correct. Make some optical design software that uses GPUs for raytracing and reverse-mode autodiff for optimization, sprinkle in some other modern techniques, and you may blow these older tools out of the water.
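
    To make the reverse-mode autodiff point concrete, here is a toy sketch of my own (not from any existing optics package): a minimal scalar autodiff tape that gradient-descends the curvature radius of a plano-convex thin lens toward a target focal length, using the lensmaker's equation f = R1 / (n - 1).

```python
class Var:
    """Scalar that records its computation graph for reverse-mode autodiff."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # tuples of (parent Var, local derivative)
        self.grad = 0.0

    def _wrap(self, other):
        return other if isinstance(other, Var) else Var(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __sub__(self, other):
        other = self._wrap(other)
        return Var(self.value - other.value, ((self, 1.0), (other, -1.0)))

    def __mul__(self, other):
        other = self._wrap(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self):
        # Topologically order the graph, then sweep gradients output-to-inputs.
        topo, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for parent, _ in v.parents:
                    visit(parent)
                topo.append(v)
        visit(self)
        self.grad = 1.0
        for node in reversed(topo):
            for parent, local in node.parents:
                parent.grad += local * node.grad

def focal_length(r1, n=1.5):
    # Lensmaker's equation for a plano-convex thin lens (R2 -> infinity):
    # 1/f = (n - 1) / R1, i.e. f = R1 / (n - 1).
    return r1 * (1.0 / (n - 1.0))

# Gradient-descend the curvature radius toward a 100 mm focal length.
r1 = 40.0
for _ in range(100):
    r = Var(r1)
    err = focal_length(r) - 100.0
    loss = err * err
    loss.backward()
    r1 -= 0.1 * r.grad
```

    A real tool would push millions of GPU-traced rays through this machinery instead of one algebraic formula, but the optimization loop looks the same.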

    I am hoping to get some projects going in this direction (feel free to reach out if anyone is interested).

    PS: I help organize an academic conference in my subfield of optics. We are running a design competition this year [1,2]. It would be super cool if someone submitted a design they made by drawing inspiration from modern computer graphics tools (maybe using Mitsuba 3, by one of the authors of this book?), instead of using our field's classical applications.

    [1] https://nonimaging-conference.org/competition-2025/upload/

  • This is one example of an area where economic incentives make it difficult to shift.

      - There aren't that many people willing to pay for such software, but those that do *really* need it, and will pay quite a bit (passing that cost on of course). 
     
      - The technical domain knowledge needed to do it properly is a barrier to many
     
      - It needs to be pretty robust
    

    As a result, you end up with a small handful of players who provide it. They have little incentive to modernize, and the opportunity cost for a new player is high enough to chase most of them off to other avenues.

    I think the main way this changes is when someone has already spent the money in an adjacent area and realizes, "huh, with a little effort here we could probably eat X's lunch."

    Beyond that you at most get toy systems from enthusiasts and grad students (same group?) ...

  • You’d be surprised! Everywhere I’ve worked, academic or industry, people typically write their own simulation software. Sometimes it’s entirely handwritten (i.e., end-to-end, preprocessing to simulation to evaluation), sometimes it leverages a pre-existing open source package. I imagine this will become more and more common if for no other reason than that you can’t back-propagate an OpticStudio project, and open source automatic differentiation packages are unbeatable.

  • I'd imagine there is a fairly wide gap between having a simulation engine core and a useful engineering application.

    From the academic side, I've found the work of Steinberg in this area extremely impressive. They are pushing the frontier to include more wave-optical phenomena in rendering. E.g. https://ssteinberg.xyz/2023/03/27/rtplt/

  • > eye-wateringly expensive

    For you. Anyone doing design and manufacturing of optics will not blink at paying for software.

  • Well, everyone who can build this niche software is already employed to build it.

    • I think you’re overthinking this, e.g., Zemax’s optimization isn’t that different than the ray-tracing presented in this book. The sophistication truly comes from the users.

    • Yeah, perhaps.

      But the heavy-hitters in this field all seem to have very old-timey UI's and out-of-this-world pricing.

      Meanwhile, raytracing for computer graphics on GPU's is soooo performant-- it makes me wonder how much work needs to be done to make the equivalent of KiCAD for optical design.


  • "I wonder how much work would be involved for someone to make optical design software that blows away the old tools"

    Depending on use case, it already exists for gem cutters. We can simulate everything from RI to fluorescence excitation.

  • I predict PBR is going to fall to neural rendering. Diffusion models have been shown to learn all of the rules of optics and shaders, and they're instructable and generalizable. It's god mode for physics and is intuitive for laypersons to manipulate.

    We're only at the leading edge of this, too.

    • Can you link the neural rendering animation you're talking about with some info on the total rendering times without any precomputation?

I love this book so much. The literate programming style (inspired, I think, by Knuth's CWEB), the great writing, a beautiful high-quality physical book worth buying, but also free access to the knowledge. The literate-programming style means you are almost immediately applying theory in a practical system. I keep having to take breaks to learn outside math/physics, though in principle it is self-contained, I think.

All right, off topic but I've seen this a bunch lately and the term just really irritates my brain for some reason. What's its origin? "[adverb] based" just feels so wrong to me. Shouldn't that be a noun: "Evidence-based medicine", "values-based", "faith-based", etc. Does "physically based" bother anyone else?

  • It is a bit of a silly term. It was coined mostly to contrast against the somewhat more ad hoc methods that preceded it. The new technique was parameterized more around physical properties, where the older ones were more about visual descriptions.

    This paper from Disney is what kicked off the movement https://disneyanimation.com/publications/physically-based-sh...

    • Note that the book is even older than that – I remember first reading it in 2009; apparently the 1st edition was in 2004!

  • But what alternative can you suggest which doesn’t break grammar or usage precedents like “physically based”?

    Physics-based? Reality-based? Physically-derived?

    • Physics-based sounds perfectly fine to me.

      "X-based" to me is equivalent with "based on X". Physics-based = based on physics, evidence-based = based on evidence, values-based = based on values; all perfectly fine.

      Physically based feels correct in a sentence like "Our company is physically based in New York but we operate world-wide". But what does the "physically based" in "physically based rendering" mean?

      But I'm not a native speaker, what do I know.

  • It bothers me too, but I’m French. I always assumed it was some corner of the language I don’t know.

This and https://hint.userweb.mwn.de/understanding_mp3/index.html are both amazingly fun examples of literate programming that I recommend all of the time.

I've mentioned it before, but this book is amazing in the way it covers both the theory in detail, as well as the implementation.

There are often a lot of details that matter when implementing something efficiently and well, which the theory either hides or which are a result of hardware limitations like floating-point numbers.

Almost all other programming books I've read either cover the theory in detail and gloss over the implementation details, or go into a lot of implementation stuff but only cover the easy parts and don't give you a good indication of how to deal with the advanced stuff.

So, are there any other books out there that you've read that are like PBR?

off topic: I live in this strange world where I can read the code and understand what it does, regardless of the language(!)

but the algorithm for the theory looks approximately like this to me

    (ijfio) $@ * foiew(qwdrg) awep2-2lahl [1]

Is there a good book that goes FROM programmer TO math-wizardry that you would recommend, without falling into the textbooks? Ideally I'd like to be able to read them and turn them into pseudocode, or I can't follow.

[1]: e.g. https://pbr-book.org/4ed/Introduction/Photorealistic_Renderi...

  • The text seemed to describe it quite well. They just use a bunch of physics jargon because the book approaches rendering from the physics side of things.

    Light leaving a point in a direction = light emitted from that point in that direction (zero if we aren't talking about a light bulb or something) plus all light reflected by that point in that direction.
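
    In code, that sentence is roughly this (a hypothetical Monte Carlo sketch of the idea, not the book's actual implementation):

```python
import math
import random

def outgoing_radiance(emitted, brdf, incoming, n_samples=20000, seed=0):
    """Estimate L_o = L_e + (integral over the sphere of f * L_i * |cos theta|)
    by uniformly sampling directions (pdf = 1 / (4*pi)); surface normal is +z."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # Uniform random direction on the unit sphere.
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        w = (r * math.cos(phi), r * math.sin(phi), z)
        # Integrand divided by the sampling pdf (i.e. times 4*pi).
        total += brdf(w) * incoming(w) * abs(z) * (4.0 * math.pi)
    return emitted + total / n_samples

# Lambertian surface (albedo 0.5, so f = 0.5/pi) under uniform unit lighting:
# analytically L_o = 0 + (0.5/pi) * 1 * 2*pi = 1.0.
L = outgoing_radiance(0.0, lambda w: 0.5 / math.pi, lambda w: 1.0)
```

    The big S in the book's integral is exactly the "for every incoming direction on the sphere" loop here, and the `|...|` is just the abs().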

      Sure, but I get lost at notations like the big S, `|`, and other things. Those notations seem to have many standards, or I just can't seem to follow.

      In pseudocode, or any programming language, I'm right there with you.


  • As a mathematician by training who does a lot of programming for a living: This is the biggest misconception about math I see coming from programmers. There's frequently a complaint about notation (be it that it is too compact, too obscure, too gatekept, whatever) and the difficulty in picking out what a given "line" (meaning equation or diagram or theorem, or whatever) means without context.

    Here's the thing though: Mathematics isn't code! The symbols we use to form, say, equations, are not the "code" of a paper or a proof. Unless you yourself are an expert at the domain covered by the paper, you're unlikely to be able to piece together what the paper wants to convey from the equations alone. That's because mathematics is written down by humans for humans, and is almost always conveyed as prose, and the equations (or diagrams or whatever) are merely part of that text. It is way easier to read source code for a computer program because, by definition, that is all the computer has to work with. The recipient of a mathematical text is a human with mathematical training and experience.

    Don't interpret this as gatekeeping. Just as a hint that math isn't code, and is not intended to be code (barring math written as actual code for proof assistants, of course, but that's a tiny minority).

      Fantastic read, thank you. It was never explained to me this way, nor had I imagined it this way.

      I'll try to keep an open mind and read more of the surrounding content.

      That said, there is a syntactic language, which many equations use, and it seems to vary or be presented differently. The big S is one, the meaning of `|` (which is probably not "OR"), etc. I wish there were a cheat sheet of sorts, but I feel it would be death by a thousand paper cuts with 300 standards of doing things(?)

    • Maybe you're right and maybe that's the culture in mathematics but we don't have to like it.

  • Screenshot and ask ChatGPT. Works pretty well.

    • +1 – often, for me, since a lot of the computations are estimated anyway, the biggest thing I need to do is ask ChatGPT to break down the notation and give me the precise terminology.

Is there a standard OpenGL (ES3) shader I can drop in a GPLed application that uses a decent (is there a standard?) BRDF similar to Schlick with red,green,blue, and roughness?

I've wanted to add this capability to Solvespace for a while. I did my own implementation with the Fresnel term and it looked OK, but I want something "standard" and correct.
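
For reference, the Schlick term I mean is just this (a generic sketch of the standard approximation, not Solvespace code; f0 would be per-channel for RGB):

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation to the Fresnel reflectance:
    F = F0 + (1 - F0) * (1 - cos_theta)^5,
    where cos_theta is between the view (or half) vector and the normal,
    and F0 is the reflectance at normal incidence (~0.04 for dielectrics)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

At grazing angles (cos_theta -> 0) this goes to 1.0 for any material, which is where the white-edge effect comes from.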

  • There is no standard.

    Filament is extremely well documented: https://google.github.io/filament/Filament.html

    glTF's PBR stuff is also very well documented and aimed at realtime usage: https://www.khronos.org/gltf/pbr/

    OpenPBR is a newer, much more expensive BRDF with a reference implementation written in MaterialX that iirc can compile to glsl https://academysoftwarefoundation.github.io/OpenPBR

    Pick one of the BRDFs, add IBL https://bruop.github.io/ibl, and you'll get decent results for visualization applications.

  • I think most realtime production uses something similar to Disney BRDF when they refer to PBR.

    https://media.disneyanimation.com/uploads/production/publica...

    I don't think there is a standard as such that would dominate the industry. It's all approximations to give artists parameters to tweak.

    IMHO - Being an ex professional CAD person, I think PBR is the wrong visual style for most CAD though. The main point of shading there is to improve the perception of shape and part isolation. Traditional airbrush-based engineering illustration styles are a much better reference. So something like Gooch with depth buffer unsharp masking would IMHO be much, much more appropriate.
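
    For anyone unfamiliar, the Gooch cool-to-warm model is simple to sketch (an illustrative version with common parameter values, not production code):

```python
def gooch_shade(n_dot_l, albedo, b=0.4, y=0.4, alpha=0.2, beta=0.6):
    """Gooch cool-to-warm shading: blend from a cool (bluish) tone on
    surfaces facing away from the light to a warm (yellowish) tone on
    surfaces facing it, so shadowed areas never go to black and shape
    stays readable. albedo is an (r, g, b) tuple; n_dot_l is N dot L."""
    cool = tuple(min(1.0, alpha * a + c) for a, c in zip(albedo, (0.0, 0.0, b)))
    warm = tuple(min(1.0, beta * a + c) for a, c in zip(albedo, (y, y, 0.0)))
    t = (1.0 + n_dot_l) / 2.0  # 0 = facing away, 1 = facing the light
    return tuple((1.0 - t) * c + t * w for c, w in zip(cool, warm))
```

    The whole dynamic range carries shape information instead of highlight information, which is exactly the engineering-illustration look.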

      >> Being an ex professional CAD person, I think PBR is the wrong visual style for most CAD though. The main point of shading there is to improve the perception of shape and part isolation. Traditional airbrush-based engineering illustration styles are a much better reference. So something like Gooch with depth buffer unsharp masking would IMHO be much, much more appropriate.

      That is some excellent feedback. I like that our current simple Snell's law shader doesn't add specular highlights and can do flat shading by turning off all but the ambient light. I had never heard of "Gooch" and I don't recommend searching for it without "Gooch in the context of rendering". That looks interesting, and we already have code to draw the silhouette edges.

      I do think adding a PBR-based Fresnel component with a backlight gives a similar but less sharp delineation of object edges, but it tends toward white right on the edge.

      While people like a good PBR render, I agree that in CAD the objective of the rendering is to clearly convey the geometry AND the sketching tools and entities the user is manipulating.

  • There's no standard, and even moving things from one 3D program to another can be a painful process if someone else didn't already build in some sort of conversion.

  • Even if you have a standard specular & diffuse model, indirect lighting is really important and you can't really just drop that into a shader.

    If you want a standard I'd go with OpenPBR; it is well documented and looks nice. Just skip the extra layers to begin with (fuzz/coat etc.).

Why don't they link to the physical book?