Comment by ehejsbbejsk

5 years ago

As a hobby photographer, this is simply amazing and the most intuitive article I’ve come across. This is a must read.

I am curious, however, why we still can't digitally reproduce bokeh. Apple is getting close. I thought LiDAR would theoretically solve that and could yield renders indistinguishable from what analog lenses produce. That would be a game changer in my view, and it's why I would like to see Apple develop a full-frame sensor coupled with their technology.

A large lens captures information over an area, and so to a certain extent can "see around" out of focus objects. A selective blur of a fully focused scene captured from a single viewpoint (i.e. a small lens) can only approximate this effect, because it simply doesn't have access to the same information. Even with a perfect depth map, you still don't know what's behind occluded objects.

  • If, instead of resolving points of light on the image sensor, you use a group of pixels to resolve an entire tiny image, you can effectively also see around things. You end up with a picture made of many small sections of the large image, each captured from a slightly different angle: the tiny image at the far left of the sensor sees the scene from a different angle than the one at the far right. This is exactly what the Lytro camera did, and it's why you could take the picture first and focus later (a rough shift-and-add sketch follows the link below). Of course, you sacrifice overall image resolution quite severely:

    * https://www.researchgate.net/figure/a-b-The-first-and-second...
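
    A minimal shift-and-add refocusing sketch, assuming the sub-aperture views have already been extracted from a Lytro-style capture (the array layout and the alpha parameterization are illustrative, not Lytro's actual format). Because each view looks past foreground objects from a slightly different angle, averaging shifted views also demonstrates the "seeing around things" point above:

        import numpy as np
        from scipy.ndimage import shift

        def refocus(subapertures, alpha):
            # subapertures: (U, V, H, W) -- one small grayscale image per
            # lenslet viewpoint. alpha picks the synthetic focal plane.
            U, V, H, W = subapertures.shape
            cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
            out = np.zeros((H, W))
            for u in range(U):
                for v in range(V):
                    # Views far from the aperture centre get the largest shift;
                    # points at the chosen depth line up across views, while
                    # everything else lands in different places and blurs.
                    dy, dx = alpha * (u - cu), alpha * (v - cv)
                    out += shift(subapertures[u, v], (dy, dx), order=1, mode='nearest')
            return out / (U * V)

        # refocused_near = refocus(lf, alpha=1.0)    # lf: e.g. a (9, 9, H, W) array
        # refocused_far  = refocus(lf, alpha=-1.0)   # change focus after capture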

One of the limiting factors for modeling bokeh and flare-like effects is dynamic range. You need extreme HDR capture to accurately reproduce these effects, as they matter most around bright, especially colored, light sources. I did work on flare simulation, and while many effects can be modeled by a rather simple convolution (in spectral space of course -- you cannot make a rainbow out of RGB straightforwardly), the problem is that the kernels (PSFs, point spread functions) for these convolutions have very long tails, and it's the shape of these tails that gives most of the 'natural' artistic feel.

The thing is, these tails become apparent only when you convolve with a very, very bright source -- which on a typical 12-bit linear raw image would amount to something like 10⁵-10⁶, i.e. needing 4-8 additional bits of HDR.

Here are some useful links on the topic of flare simulation; I believe bokeh has many similar aspects:

https://web.archive.org/web/20200119024053/http://simonwinde...

http://resources.mpi-inf.mpg.de/hdr/TemporalGlare/
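
A toy numerical sketch of the point about PSF tails and dynamic range (all values are invented for illustration; this is not code from the linked work): a long-tailed kernel only produces a visible halo if the true brightness of the source survives into the stored image, and a 12-bit clip throws most of that away.

    import numpy as np
    from scipy.signal import fftconvolve

    H = W = 256
    y, x = np.mgrid[-H//2:H//2, -W//2:W//2]
    r = np.hypot(x, y) + 1.0

    # PSF: sharp Gaussian core plus a weak 1/r^3 tail -- the long tail
    # that gives flare its look. Normalised to unit energy.
    psf = np.exp(-0.5 * r**2) + 1e-4 / r**3
    psf /= psf.sum()

    scene_hdr = np.zeros((H, W))
    scene_hdr[H//2, W//2] = 1.0e6              # true linear radiance of a bright light
    scene_12bit = np.clip(scene_hdr, 0, 4095)  # what a 12-bit raw can store

    halo_hdr = fftconvolve(scene_hdr, psf, mode='same')
    halo_12bit = fftconvolve(scene_12bit, psf, mode='same')

    # 80 px from the source only the PSF tail contributes; the clipped capture
    # underestimates it by roughly the clipping factor (~244x here), which is
    # why the streaks and halos around bright lights simply disappear.
    print(halo_hdr[H//2, W//2 + 80] / halo_12bit[H//2, W//2 + 80])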

Phone cameras have tiny sensors and very short focal lengths, so essentially everything is in focus and there is almost no natural bokeh. To add bokeh, you have to separate the subject from the background, and then also determine how far away different parts of the background are. This requires very advanced AI for non-trivial images (see the imperfections in Photoshop's "select subject" tool), which Apple is nevertheless doing (that's what portrait mode is). But if the separation isn't near-perfect, the result quickly becomes worthless. So in short: they are doing it, but only the most advanced companies can attempt it.
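
A rough sketch of that depth-based approach -- not Apple's actual pipeline -- assuming you already have an aligned depth map from LiDAR or a neural depth estimator; the function and parameter names are placeholders:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fake_bokeh(image, depth, focus_depth, strength=8.0, layers=8):
        """image: (H, W, 3) array; depth: (H, W), in the same units as
        focus_depth. Slices the scene into depth layers, blurs each layer in
        proportion to its distance from the focal plane, then blends the
        blurred layers back together with softened masks."""
        image = np.asarray(image, dtype=float)
        edges = np.linspace(depth.min(), depth.max(), layers + 1)
        idx = np.digitize(depth, edges[1:-1])     # per-pixel layer index, 0..layers-1
        depth_range = depth.max() - depth.min() + 1e-9
        out = np.zeros_like(image)
        wsum = np.zeros(depth.shape)
        for i in range(layers):
            mid = 0.5 * (edges[i] + edges[i + 1])
            # Blur radius grows with distance from the plane of focus.
            sigma = strength * abs(mid - focus_depth) / depth_range
            blurred = gaussian_filter(image, sigma=(sigma, sigma, 0))
            mask = gaussian_filter((idx == i).astype(float), sigma=max(sigma, 1.0))
            out += mask[..., None] * blurred
            wsum += mask
        # Nothing here recovers what was hidden behind the subject -- the
        # occlusion problem mentioned earlier in the thread.
        return out / np.maximum(wsum, 1e-6)[..., None]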

Not sure, but since film captures actual photons from the scene, some kind of spatial information is probably encoded that way.

Bokeh is a kind of spatial representation, similar to how you can basically "see" the sound stage, with the instruments properly separated, when someone has a really good sound system, or how dogs have a "5.1/7.1" sense of smell.

How one would encode that, I have no idea.

  • Look into light field photography tech. It is possible to capture a "volume of light", within which bokeh & more can be adjusted after the fact. The issue is the amount of data generated and the complexity of the tech, versus getting a "good enough for most situations" image via simpler means (a regular photo). Regular + depth images (Apple LiDAR etc.) with the help of AI can create something vaguely similar to actual bokeh, but they're missing a lot of source data.

    In the world of 3D rendering (content created from scratch), very advanced & realistic bokeh effects are possible; as an example, see http://lentil.xyz for the Arnold renderer (a minimal thin-lens sampling sketch is at the end of this thread).

    • Wow. I left the CG industry 3 years ago in a sad bout of defeat, involving both an inability to make a decent living and a realization that it would never meet my standards of creative engagement, which were set by my lifetime love of photography and film. But this project is very cool.
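
Regarding the rendered-bokeh comment above: a minimal sketch of thin-lens aperture sampling, the basic mechanism path tracers use to get physically plausible depth of field (the names and the circular aperture are assumptions, not lentil's actual camera model).

    import numpy as np

    rng = np.random.default_rng(0)

    def thin_lens_ray(pixel_dir, aperture_radius, focus_dist):
        """pixel_dir: unit direction of the pinhole ray through a pixel, in
        camera space looking down -z. Returns (origin, direction) of a ray that
        starts at a random point on the lens disk and passes through the point
        where the pinhole ray crosses the plane of sharp focus."""
        # Uniform sample on a circular lens aperture (a polygonal iris would
        # give polygonal bokeh instead).
        radius = aperture_radius * np.sqrt(rng.random())
        theta = 2.0 * np.pi * rng.random()
        lens_point = np.array([radius * np.cos(theta), radius * np.sin(theta), 0.0])

        # Where the original pinhole ray hits the focal plane at z = -focus_dist.
        t = focus_dist / -pixel_dir[2]
        focus_point = t * np.asarray(pixel_dir)

        direction = focus_point - lens_point
        return lens_point, direction / np.linalg.norm(direction)

    # Averaging many such rays per pixel keeps points on the focal plane sharp
    # and spreads everything else into the shape of the aperture -- which is
    # exactly what bokeh is.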