
Comment by m132


This is indeed similar in effect, but completely different in cause, to the phenomenon referenced in the article (device pixel ratio vs. pixel aspect ratio).

What you're referring to stems from an assumption made a long time ago by Microsoft, later adopted as a de facto standard by most computer software. The assumption was that the pixel density of every display, unless otherwise specified, was 96 pixels per inch [1].

The value stuck and started being taken for granted, while the pixel density of displays grew well beyond it, a shift mostly popularized by Apple's Retina displays. A solution was needed to let new software take advantage of the increased detail of high-density displays while still accommodating legacy software written exclusively for 96 PPI. The result was the decoupling of "logical" pixels from "physical" pixels, with the logical resolution most commonly defined as "what the resolution of the display would be, given its physical size, at 96 PPI" [2], and the physical resolution representing the real number of pixels. The 100x100 and 200x200 values in your example are, respectively, the logical and physical resolutions of your screenshot.
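As a concrete illustration of that decoupling, here's a minimal sketch of how it surfaces on the web (assuming a browser environment; the function name is made up for this example). window.devicePixelRatio is the ratio described in [2]: on a "2x" display, a 100x100 logical-pixel canvas needs a 200x200 physical-pixel backing store to stay sharp.

    // TypeScript sketch: logical (CSS) pixels vs. physical pixels.
    function sizeCanvasForDisplay(
      canvas: HTMLCanvasElement,
      logicalWidth: number,
      logicalHeight: number,
    ): void {
      const dpr = window.devicePixelRatio || 1;

      // Logical size: what layout (and legacy software) sees.
      canvas.style.width = `${logicalWidth}px`;
      canvas.style.height = `${logicalHeight}px`;

      // Physical size: the real number of pixels in the backing store.
      canvas.width = Math.round(logicalWidth * dpr);
      canvas.height = Math.round(logicalHeight * dpr);

      // Draw in logical coordinates; the context maps them to physical pixels.
      canvas.getContext("2d")?.scale(dpr, dpr);
    }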

Different software vendors refer to these "logical" pixels differently, but the names you're most likely to encounter are points (Apple), density-independent pixels ("DPs", Google), and device-independent pixels ("DIPs", Microsoft). The value of 96, while the most common, is also not a standard per se: Android uses 160 PPI as its base, and Apple for a long time used 72.
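To make the relationship between those base densities and the scaling factor concrete, here's an illustrative sketch (the object and function names are made up, not any platform's API): the scale factor between logical and physical pixels is just the display's real pixel density divided by the platform's assumed base density.

    // TypeScript sketch, illustrative only.
    const BASE_PPI = {
      windows: 96,  // DIPs (and CSS pixels on the web, see [2])
      android: 160, // dp
      apple: 72,    // classic Apple points
    } as const;

    function scaleFactor(displayPpi: number, platform: keyof typeof BASE_PPI): number {
      return displayPpi / BASE_PPI[platform];
    }

    // scaleFactor(192, "windows") === 2  -> "200%" scaling on Windows
    // scaleFactor(320, "android") === 2  -> Android's "xhdpi" bucket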

[1]: https://learn.microsoft.com/en-us/archive/blogs/fontblog/whe...

[2]: https://developer.mozilla.org/en-US/docs/Web/API/Window/devi...

Why does the PPI matter at all? I thought we only cared about the scaling factor, so 2 in this 100-to-200 scenario. It's not like I'm trying to display a true-to-life gummy bear on my monitor; we just want sharp images.

I might be misunderstanding what you're saying, but I'm pretty sure print and the web were already more popular than anything Apple did. The need to be aware of output size and to scale pixels was not at all uncommon by the time Retina displays came out.

From what I recall, only Microsoft had problems with this, and specifically on Windows. You might be right about software that was exclusive to desktop Windows. I don't remember having scaling issues even on other Microsoft products such as Windows Mobile.

  • Print was always density-independent. That didn't translate into high-density displays, however. The web, at least as I remember it, was for the longest time "best viewed in Internet Explorer at 800x600", and later 1024x768, until vector-based Flash came along :)

    If my memory serves, it was Apple that popularized high pixel density in displays with the iPhone 4. They weren't the first to use such a display [1], but certainly the ones to start a chain reaction that resulted in phones adopting crazy resolutions all the way up to 4K.

    It's the desktop software that mostly had problems scaling. I'm not sure about Windows Mobile. Windows Phone and UWP have adopted an Android-like model.

    [1]: https://en.wikipedia.org/wiki/Retina_display#Competitors