
Comment by m132

10 hours ago

>Before HD, almost all video was non-square pixels

Correct. This came from ITU-R BT.601, one of the first digital video standards, whose authors chose to define digital video as a sampled analog signal. Analog video never had a concept of pixels; it operated on lines instead. The rate at which you sampled it could be arbitrary and affected only the horizontal resolution. The rate chosen by BT.601 was 13.5 MHz, which results in a 10/11 pixel aspect ratio for 4:3 NTSC video and 59/54 for 4:3 PAL.
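
If you want to see where 10/11 and 59/54 come from, here is a minimal sketch in Python. The "square pixel" sampling rates used below (135/11 MHz ≈ 12.27 MHz for 525-line systems, 14.75 MHz for 625-line systems) are my assumed figures, not stated in the comment above; the pixel aspect ratio then falls out as the ratio of the square-pixel rate to the actual 13.5 MHz rate.

    from fractions import Fraction

    # BT.601 luma sampling rate, shared by the 525- and 625-line systems
    f_bt601 = Fraction(27_000_000, 2)       # 13.5 MHz

    # Assumed "square pixel" sampling rates: the rates at which the 4:3
    # active picture would come out with square pixels
    f_sq_525 = Fraction(135_000_000, 11)    # ~12.27 MHz (480-line / NTSC)
    f_sq_625 = Fraction(59_000_000, 4)      # 14.75 MHz  (576-line / PAL)

    # Sampling faster than the square-pixel rate gives narrower-than-square
    # pixels; sampling slower gives wider-than-square pixels.
    print(f_sq_525 / f_bt601)               # 10/11
    print(f_sq_625 / f_bt601)               # 59/54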

>SD channels on cable TV systems are 528x480

I'm not actually sure about America, but here in Europe most digital cable and satellite SDTV is delivered as 720x576i 4:2:0 MPEG-2 Part 2. There are some outliers that use 544x576i, however.

Good post. For anyone wondering "why do we have these particular resolutions, sampling rates and frame rates, which seem quite random", allow me to expand and add some color to your post (pun intended). Similar to how modern railroad track widths can be traced back to the wheel widths of Roman chariots, modern digital video standards still reverberate with echoes of 1930s black-and-white television standards.

BT.601 is from 1982 and was the first widely adopted component digital video standard: it defines how to sample analog video into three color components (YUV) at 13.5 MHz. Prior to BT.601, the main digital video standard was SMPTE 244M, created by the Society of Motion Picture and Television Engineers, a composite standard which sampled the analog signal at 14.32 MHz. All else being equal, a higher sampling rate is generally better. The reason BT.601 settled on the lower 13.5 MHz was a compromise - equal parts technical and political.

Analog television was created in the 1930s as a black-and-white composite standard, and in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs. Politicians mandated this because they feared nerfing all the B&W sets owned by voters. But that hack came with significant technical compromises which complicated and degraded analog video for over 50 years. The composite sampling rate (14.32 MHz) is exactly 4x the NTSC color subcarrier frequency introduced by that hack, and the component rate (13.5 MHz) was chosen as a common multiple of the 525-line and 625-line horizontal scan rates, so both frequencies are inherited directly from analog television.

Those two frequencies in turn dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768) and even the original PC display resolutions (CGA, VGA, XGA, etc.). For example, 720 is tied to 13.5 MHz because 720 samples at 13.5 MHz span just over the ~53 µs active portion of a scanline (and 13.5 MHz sits comfortably above twice the luma bandwidth of either system, satisfying Nyquist). Similarly, 768 is tied to 14.32 MHz: 768 samples at that rate cover roughly the same active window. VGA's horizontal resolution of 640 comes from squeezing SD's non-square pixels back to square: the 704 samples that span the 4:3 picture times the 10/11 pixel aspect ratio give 640. It's kind of fascinating that all these modern digital resolutions can be traced back to decisions made in the 1930s, based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco) and the political sensitivities present at the time.
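
To make that arithmetic concrete, here is a rough sketch of how those horizontal sample counts fall out of the two rates. The ~52.9 µs active-line figure and the exact 4fsc fraction are my own assumptions rather than anything from the post; the standards then settled on the slightly wider 720 and 768 active samples.

    from fractions import Fraction

    MHZ = 1_000_000
    f_bt601 = Fraction(27 * MHZ, 2)          # 13.5 MHz component sampling rate
    f_4fsc  = 4 * Fraction(315 * MHZ, 88)    # ~14.318 MHz composite (4x NTSC subcarrier)

    # Assumed nominal active picture width of a 525-line scanline (~52.9 us);
    # both digital standards sample a slightly wider window than this.
    active_us = Fraction(529, 10)

    for name, rate in [("13.5 MHz", f_bt601), ("4fsc", f_4fsc)]:
        print(name, round(rate * active_us / MHZ))   # ~714 and ~757 samples

    # VGA's 640 is the 704 samples spanning the 4:3 picture, squeezed square
    # by the 10/11 pixel aspect ratio:
    print(704 * Fraction(10, 11))                    # 640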

  • > Similar to how modern railroad track widths can be traced back to the wheel widths of roman chariots

    This is repeated often and simply isn't true.

    • I based that on the BBC science TV series (and books) Connections by science historian James Burke. If that claim has since been debunked, then I stand corrected. Regardless of the specific example, my point was that modern standards are sometimes linked to long-outdated historical precedents for no currently relevant reason.

Here are some captures from my Comcast system here in Silicon Valley.

https://www.w6rz.net/528x480.ts

https://www.w6rz.net/528x480sp.ts

My DVCAM equipment definitely outputs 720x576i, although whether that's supposed to render to 768x576 for 4:3 material, or 1024x576 for 16:9 stuff, I'm never quite sure (see the sketch below).

It still looks surprisingly good, considering.
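
For anyone wanting to do that 4:3 vs 16:9 mapping themselves, here is a minimal sketch, assuming the usual rule of keeping the 576 active lines and rendering the width at the display aspect ratio with square pixels. The square_pixel_width helper is hypothetical, not anything from DV/DVCAM documentation.

    from fractions import Fraction

    def square_pixel_width(active_lines: int, display_aspect: Fraction) -> int:
        # Width at which a non-square-pixel frame renders with square pixels
        return int(active_lines * display_aspect)

    # 720x576 storage, rendered with square pixels:
    print(square_pixel_width(576, Fraction(4, 3)))    # 768 for 4:3
    print(square_pixel_width(576, Fraction(16, 9)))   # 1024 for 16:9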