Comment by _ph_

5 years ago

You are right, this is a challenge. The wavelength of the light cannot be measured directly, only inferred from the intensity of the pixels with the different color filters. On the other hand, most reproductions of photos don't reproduce the original frequencies either. A computer screen has red, green and blue dots, which produce light at the corresponding wavelengths. So if you have orange light, you get a signal on the green and red pixels, and the green and red dots on your screen will light up, which will be detected by the sensors for red and green light in your eyes. Nowhere in the chain, not even in your eye, is there a sensor for "orange" directly; it is just the mixture of the red and green sensitivities.
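To make the "no orange sensor" point concrete, here is a minimal sketch in Python. The Gaussian sensitivity curves and their peak positions are made-up illustrative values, not real sensor data:

```python
# Minimal sketch (not real sensor data): model the R/G/B channel
# sensitivities as Gaussians and see how a single "orange" wavelength
# excites both the red and the green channel at once.
import math

def gaussian(wavelength_nm, peak_nm, width_nm):
    return math.exp(-((wavelength_nm - peak_nm) / width_nm) ** 2)

def channel_response(wavelength_nm):
    # Assumed peaks: R ~ 600 nm, G ~ 540 nm, B ~ 450 nm (illustrative).
    return {
        "R": gaussian(wavelength_nm, 600, 50),
        "G": gaussian(wavelength_nm, 540, 50),
        "B": gaussian(wavelength_nm, 450, 50),
    }

# Monochromatic orange light (~590 nm): there is no "orange" sensor,
# but the overlapping R and G curves both respond.
print(channel_response(590))
# -> roughly {'R': 0.96, 'G': 0.37, 'B': 0.0004}
```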

It is important to note that neither the sensor pixels nor your eyes have a completely separate reaction per wavelength; the sensitive ranges strongly overlap. So for hues of green with rather long wavelengths, you get some reaction on the red pixels, which gets stronger as you move towards orange, where both red and green pixels detect the light, until it gets more and more red and less green. The exact absorption curve of the sensor color filters matters here; that is one reason different manufacturers have slightly different color rendition. On top of that comes calibration: when converting the raw image into a proper RGB image, one can further balance the response. For that, color calibration targets are used, which have something like 24 patches of different colors. Taking a photo of this target, the calibration software can calibrate both for the light illuminating the target and for the color response of your camera.
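The calibration step can be sketched as a simple least-squares fit. Everything below is a simplified assumption: placeholder patch values and a purely linear 3x3 correction, whereas real raw converters also handle nonlinearities:

```python
# Hedged sketch of the calibration described above: given the raw camera
# RGB values of the 24 patches and their known reference values, fit a
# 3x3 color correction matrix by least squares. The arrays are
# placeholders, not real measurements.
import numpy as np

raw_patches = np.random.rand(24, 3)        # measured camera RGB, one row per patch
reference_patches = np.random.rand(24, 3)  # known target RGB of the chart

# Solve raw_patches @ M ~= reference_patches for the 3x3 matrix M.
M, residuals, rank, _ = np.linalg.lstsq(raw_patches, reference_patches, rcond=None)

def correct(rgb):
    """Apply the fitted correction to a raw RGB value."""
    return np.asarray(rgb) @ M

print(correct([0.5, 0.4, 0.3]))
```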

A common cause of red-green colorblindness is that the affected persons have the sensitivities of the red and green receptors overlapping too strongly, so they lose the ability to differentiate: a green light creates almost as strong a signal in the "red" cells as in the green ones. One way to improve color vision for those people is glasses which increase that separation by absorbing the frequencies between the red and green peaks.
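Here is a toy model of that overlap idea, with made-up cone peaks only roughly inspired by the real L (~565 nm) and M (~535 nm) cones: shifting the "green" peak towards the "red" one shrinks the red/green contrast for the same light:

```python
# Toy illustration (made-up numbers): shift the "green" cone peak toward
# the "red" one and watch the red/green signal difference shrink, which
# is roughly what happens in anomalous trichromacy.
import math

def cone(wavelength_nm, peak_nm, width_nm=45):
    return math.exp(-((wavelength_nm - peak_nm) / width_nm) ** 2)

def rg_contrast(wavelength_nm, red_peak=565, green_peak=535):
    r = cone(wavelength_nm, red_peak)
    g = cone(wavelength_nm, green_peak)
    return (r - g) / (r + g)  # how distinguishable the two signals are

for label, green_peak in [("normal", 535), ("anomalous (peak shifted)", 555)]:
    print(label, round(rg_contrast(560, green_peak=green_peak), 3))
# -> normal ~ 0.147, anomalous ~ 0.0: far less separation to work with.
```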

> The wavelength of the light cannot be measured directly, only inferred by the intensity of the pixels with the different color filters.

Well, it depends on what you consider "directly", but you can get pretty far with a spectrometer, i.e. a device that splits light and measures the intensity spatially to collect a spectrum. It's not impossible, though, to build a camera based on that principle; you would just need to sample the light in an array to make pixels.
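As a rough sketch of that idea: suppose each pixel recorded a sampled spectrum instead of three filtered intensities; you could then reduce it to RGB afterwards (or keep the full spectrum). The response curves here are illustrative Gaussians, not real CIE data:

```python
# Sketch of a spectral pixel: a sampled spectrum per pixel, collapsed to
# RGB by integrating against assumed channel sensitivities.
import numpy as np

wavelengths = np.arange(400, 701, 10)  # nm, the spectral samples per pixel

def gaussian(x, peak, width):
    return np.exp(-((x - peak) / width) ** 2)

# Assumed (illustrative) channel sensitivities.
R = gaussian(wavelengths, 600, 50)
G = gaussian(wavelengths, 540, 50)
B = gaussian(wavelengths, 450, 50)

def spectrum_to_rgb(spectrum):
    """Integrate a per-pixel spectrum against the response curves."""
    return np.array([np.dot(spectrum, R), np.dot(spectrum, G), np.dot(spectrum, B)])

# Example: a narrow emission line near 590 nm (orange-ish).
spectrum = gaussian(wavelengths, 590, 5)
print(spectrum_to_rgb(spectrum))
```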

  • I was talking here about typical photo cameras. Of course you can measure the wavelength of the light with other devices like spectrometers. I was specifically talking about camera sensors which have separate filters, usually in 3 colors, in front of the pixels. The sensors made by Sigma (formerly Foveon) use a different principle: they determine the wavelength by measuring how deep in the silicon the photons generate electrons, as the penetration depth depends on the wavelength of the light. However, it is more difficult to get a precise color response that way, as you cannot just use predefined color filters. A back-of-the-envelope sketch of that depth principle follows below.
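The sketch uses Beer-Lambert absorption; the silicon absorption coefficients are rough order-of-magnitude values, not Foveon's actual design:

```python
# Back-of-the-envelope sketch of the Foveon/Sigma principle: photon
# absorption depth in silicon grows with wavelength (Beer-Lambert law),
# so stacked photodiodes at different depths see different colors.
# Approximate absorption coefficients of silicon, in 1/um (illustrative).
alpha = {450: 2.5, 550: 0.7, 650: 0.3}  # blue, green, red

for wavelength_nm, a in alpha.items():
    mean_depth_um = 1 / a  # mean penetration depth under Beer-Lambert
    print(f"{wavelength_nm} nm: absorbed around {mean_depth_um:.1f} um deep")
# Blue light is absorbed near the surface, red light much deeper,
# which is exactly what the stacked layers exploit.
```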