Comment by adornKey

5 days ago

Microsoft did a lot of great work on fonts in the past. Recently it looked like they abandoned per-monitor subpixel rendering?! In which direction are they heading?

Pixel density continues to rise, but Microsoft might be engaged in… premature de-optimization?

  • But the angular resolution of the eye doesn't rise. For a desktop monitor, 100 ppi is already close to the practical limit. Anything beyond that is just an additional burden for the GPU and a waste of bandwidth. Sure, you can increase resolution just to make font rendering easier, but then you also pay the price in energy consumption or speed, without any visible improvement.

    • At the traditional 96 dpi, you have to sit 3 ft away before the display exceeds retinal density. Personally, I sit at half that distance, so something around 200 ppi would be more ideal. With laptops you might sit even closer.

      Mobile devices, unless you hold them really close, have matched retinal density for a while. Most people hold the device at about 8 inches, so 450 dpi is the value to hit.

      Edit: These measurements assume 20/20 vision, which is the average. Many people exceed that, so you'd need slightly higher values if you're feeling pedantic.

    • The difference between my 27" 4k and 1440p screens is still quite obvious, and I don't consider myself particularly sensitive to these things.

      For rendering text and video, even an underpowered integrated GPU can handle it fine; the only issue is using a bunch more RAM.

      For reference, my very underpowered desktop AMD iGPU, on a three-generations-old architecture (2 CUs of RDNA 2), only has trouble with the occasional overly heavy browser animation.

    • A few years ago, I replaced my 24" 1080p monitors (~96 ppi) with 27" 4k monitors (~157 ppi), and the increased pixel density was very noticeable, and I'd probably notice an increase over that. I sit about 3 feet away from them.

    • 300 ppi matches printed books, which looks nice. On notebook computers, a 3840x2160 panel might not be worth the reduced battery life.
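
The viewing-distance figures traded in this subthread can be sanity-checked with a few lines of arithmetic. This is my own sketch, assuming (as the edit above notes) that 20/20 vision resolves about 1 arcminute:

```python
import math

# A pixel becomes invisible roughly when it subtends less than
# 1 arcminute (the 20/20 acuity limit) at the viewing distance.
ARCMIN = math.radians(1 / 60)  # 1 arcminute in radians

def retinal_ppi(distance_inches: float) -> float:
    """PPI at which one pixel spans 1 arcminute from this distance."""
    pixel_pitch = distance_inches * math.tan(ARCMIN)  # pixel size in inches
    return 1 / pixel_pitch

for d in (36, 18, 8):  # 3 ft desktop, half that, phone held at 8 in
    print(f"{d:2d} in -> {retinal_ppi(d):3.0f} ppi")
# -> roughly 95, 191, and 430 ppi, matching the ~96, ~200,
#    and ~450 figures quoted above
```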

I hate subpixel rendering. It's impossible to turn off on displays that don't need it, and it looks absolutely awful. I wish it had never been invented.

  • Hate seems a bit strong for an increase in perceived horizontal resolution on low DPI displays, but to each their own. That said, I'm not sure what you mean by it being impossible to turn off. On Windows you can just disable ClearType per monitor, and on Linux it's configurable either through your DE, fontconfig, or sometimes at the application level.

    macOS went the other direction and removed subpixel rendering entirely, which is partly why low-DPI external displays tend to look worse there.
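
    For the fontconfig route, a minimal per-user sketch that forces grayscale antialiasing (e.g. in `~/.config/fontconfig/fonts.conf`; applications that bypass fontconfig may still ignore it):

    ```xml
    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- Disable subpixel (RGB) rendering; keep plain antialiasing -->
      <match target="font">
        <edit name="rgba" mode="assign">
          <const>none</const>
        </edit>
      </match>
    </fontconfig>
    ```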

    • > That said, I'm not sure what you mean by it being impossible to turn off.

      You can try to configure it to be off, and while that almost works, many applications will still simply not respect the setting. This is particularly apparent (and infuriating) with apps that don't render in high-resolution mode, because their rendering then no longer has anything to do with actual subpixels.

      I imagine this behavior came from ClearType having been a special case: non-native widget toolkits were explicitly programmed to render with it on Windows, forgetting that the user should be able to turn it off!

      > macOS went the other direction and removed subpixel rendering entirely, which is partly why low-DPI external displays tend to look worse there.

      Subpixel antialiasing is a compromise. Once every Mac shipped with a Retina display, there was no need to retain that compromise, because you already get high resolution so you may as well get color accuracy too.

      I will note that macOS still enables by default a feature called "stem darkening" (incorrectly called "font smoothing" in macOS Settings) that also looks fairly awful to my eye, and seems itself to be a legacy of the low-DPI days.
