Comment by justin_

5 days ago

> A camera does not take point sample snapshots, it integrates lightfall over little rectangular areas.

Integrates this information into what? :)

> A modern display does not reconstruct an image the way a DAC reconstructs sounds

Sure, but some software may resample the original signal, for example to upscale it. "Pixels as samples" makes more sense in that context.
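
To make that concrete, here's a minimal sketch (plain NumPy, all names mine) of the "pixels as samples" view during upscaling: bilinear interpolation treats each stored value as a point sample on a grid and reconstructs values in between.

```python
import numpy as np

def upscale_bilinear(img: np.ndarray, factor: int) -> np.ndarray:
    """Upscale a grayscale image, treating each pixel value as a point sample."""
    h, w = img.shape
    out_h, out_w = h * factor, w * factor
    # Coordinates of the output samples, mapped back into input sample space.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

small = np.array([[0.0, 1.0], [1.0, 0.0]])
print(upscale_bilinear(small, 4))  # a smooth ramp reconstructed between the four point samples
```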

> It is pretty reasonable in the modern day to say that an idealized pixel is a little square.

I do agree with this actually. A "pixel" in popular terminology is a rectangular subdivision of an image, leading us right back to TFA. The term "pixel art" makes sense with this definition.

Perhaps we need better names for these things. Is the "pixel" the name for the sample, or is it the name of the square-ish thing that you reconstruct from image data when you're ready to send to a display?

> Integrates this information into what? :)

Into electric charge? I don’t understand the question, and it sounds like the question is supposed to lead readers somewhere.

The camera integrates the incoming light falling on a tiny square into an electric charge and then reads out the charge (at least for a CCD), giving a brightness (and, with the Bayer filter in front of the sensor, a color) for the pixel. So it’s a measurement over the tiny square, not a point sample.
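
A toy way to see the difference, assuming a pixel that simply averages the light falling on its square footprint (numbers and names here are mine, not how any real sensor is specified):

```python
import numpy as np

OVERSAMPLE = 16  # sub-samples per pixel edge, used to approximate the integral

def integrate_over_square(light: np.ndarray) -> np.ndarray:
    """Each pixel value is the mean of the light landing on its square area."""
    h, w = light.shape
    ph, pw = h // OVERSAMPLE, w // OVERSAMPLE
    return light.reshape(ph, OVERSAMPLE, pw, OVERSAMPLE).mean(axis=(1, 3))

def point_sample(light: np.ndarray) -> np.ndarray:
    """Idealized point sample taken at each pixel center."""
    c = OVERSAMPLE // 2
    return light[c::OVERSAMPLE, c::OVERSAMPLE]

# A tightly focused spot of light landing near the corner of the first pixel.
light = np.zeros((4 * OVERSAMPLE, 4 * OVERSAMPLE))
light[1, 1] = 1.0

print(integrate_over_square(light)[0, 0])  # small but nonzero: the integrating pixel still responds
print(point_sample(light)[0, 0])           # 0.0: a point sample at the center misses it entirely
```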

  • > The camera integrates the incoming light falling on a tiny square [...] giving a brightness (and, with the Bayer filter in front of the sensor, a color) for the pixel

    This is where I was trying to go. The pixel, the result at the end of all that, is a single value (which may be a color with multiple components, sure). The physical reality of the sensor having an area and generating a charge is not relevant to the signal processing that happens afterwards. Smith's claim is that this sample is best understood as a point rather than a rectangle, which makes sense given that he was working on image processing in software, removed from displays and sensors.

    • It’s a single value, but it’s an integral over the square, not a point sample. If I shine a perfectly focused laser very close to the corner of one sensor pixel, I’ll still get a brightness value for that pixel. If it were a point sample, only the brightness at that single point would contribute to the output.

      And depending on your application, you absolutely need to account for sensor properties like pixel pitch and the color filter array. They affect moiré behavior and create certain artifacts.

      I’m not saying you can’t think of a pixel as a point sample, but correcting other people who say it’s a little square is just wrong.


    • It's never a point sample. Light is integrated over a finite area to form a single color sample. During Bayer demosaicking, contributions from neighbouring pixels are integrated to form samples of the complementary color channels (rough sketch at the end of this thread).

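For anyone curious what that neighbour-integration looks like, here is a deliberately naive bilinear demosaic sketch for an RGGB Bayer pattern (my own simplification; real pipelines are far more sophisticated):

```python
import numpy as np

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    """Naive bilinear demosaic of an RGGB Bayer mosaic: (h, w) -> (h, w, 3)."""
    h, w = raw.shape
    # Masks marking which sensor sites measured which channel.
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    def fill(mask: np.ndarray) -> np.ndarray:
        # Known values stay; each missing value becomes the average of the
        # known neighbours in the surrounding 3x3 window.
        known = np.where(mask, raw, 0.0)
        pk = np.pad(known, 1)
        pm = np.pad(mask.astype(float), 1)
        total = sum(pk[i:i + h, j:j + w] for i in range(3) for j in range(3))
        count = sum(pm[i:i + h, j:j + w] for i in range(3) for j in range(3))
        out = known.copy()
        out[~mask] = total[~mask] / np.maximum(count[~mask], 1e-9)
        return out

    return np.dstack([fill(r_mask), fill(g_mask), fill(b_mask)])

# Example: a flat grey scene recorded through the mosaic comes back grey in all three channels.
raw = np.full((6, 6), 0.5)
print(demosaic_rggb(raw)[2, 2])
```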