Comment by Sharlin

5 days ago

Pixel, used as a unit of horizontal or vertical resolution, typically implies the resolution of the other axis as well, at least for common aspect ratios. We used to say 640x480 or 1280x1024 – now we might say 1080p or 2.5K, but what we mean is 1920x1080 or 2560x1440, so "pixel" does appear to be a measure of area. Except of course it's not – it's a unit of a dimensionless quantity that measures the amount of something, like the mole. Still, a "quadratic count" is in some sense a quantity distinct from a "linear count", just as solid angles are distinct from angles even though both are dimensionless quantities.
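To make the linear-vs-quadratic distinction concrete, here's a trivial sketch mapping the marketing names above to the pixel dimensions they imply and the "quadratic count" that follows (the names and dimensions are just the common consumer ones):

```python
# Marketing name -> the linear pixel counts it conventionally implies.
resolutions = {
    "1080p": (1920, 1080),
    "2.5K":  (2560, 1440),
}

for name, (w, h) in resolutions.items():
    # The "quadratic count" is just the product of the two linear counts.
    print(f"{name}: {w}x{h} = {w * h / 1e6:.2f} MP")
```

So "1080p" names a single linear count, but what it really pins down (given the 16:9 aspect ratio) is an area-like count of about 2.07 megapixels.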

The issue is muddied by the fact that what people mostly care about is either the linear pixel count or the pixel pitch, the distance between two neighboring pixels (or perhaps rather its reciprocal, pixels per unit length). A further confounder is that, technically, resolution is a measure of angular separation, so to convert pixel pitch to resolution you need to know the viewing distance.
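The pitch-to-angular-resolution conversion is simple trigonometry; a minimal sketch (the 0.277 mm pitch figure is my own example for a typical 24" 1080p monitor, not from the discussion above):

```python
import math

def pixels_per_degree(pitch_mm: float, distance_mm: float) -> float:
    """Angular resolution implied by a pixel pitch at a given viewing distance.

    One pixel subtends atan(pitch / distance) radians; take the reciprocal
    in degrees to get pixels per degree. At these scales the small-angle
    approximation pitch/distance would do just as well.
    """
    angle_deg = math.degrees(math.atan(pitch_mm / distance_mm))
    return 1.0 / angle_deg

# A 24" 1920x1080 monitor has a pitch of about 0.277 mm; viewed from
# 600 mm that works out to roughly 38 pixels per degree.
print(pixels_per_degree(0.277, 600))
```

This is why the same panel can be "retina" on a desk and visibly pixelated when you lean in: the pitch hasn't changed, only the angular separation has.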

Digital camera manufacturers at some point started quoting megapixels (around the time sensor resolutions rose above 1 MP), presumably because big numbers make for better marketing. Then there's the fact that camera screen and electronic viewfinder resolutions are given in subpixels ("dots"), presumably again for marketing reasons.
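A quick sketch of how inflated the "dot" figures are, assuming the usual convention that one pixel comprises an RGB triplet of three subpixels (the "2.36M-dot" example is a common EVF spec, corresponding to a 1024x768 panel):

```python
def dots_to_pixels(dots: int) -> float:
    """Convert a marketing 'dot' (subpixel) count to a pixel count,
    assuming 3 subpixels (R, G, B) per pixel."""
    return dots / 3

# A "2.36M-dot" EVF is really only ~0.79 MP: 1024 x 768 x 3 = 2,359,296 dots.
print(dots_to_pixels(2_360_000))
```

So the dot count triples the headline number relative to the megapixel convention used for the sensor on the very same camera.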

Digital photography then takes us on to subpixels, Bayer filters (https://en.wikipedia.org/wiki/Color_filter_array) and so on. You can also separate out the luminance and colour components: most image and video compression puts more emphasis on the luminance profile and represents the colour more approximately. The subpixels on a digital camera sensor (or a display, for that matter) take advantage of this quirk of human vision.