Comment by vecplane
2 days ago
Subpixel font rendering is critical for readability but, as the author points out, it's a tragedy that we can't get pixel layout specs from the existing display standards.
Only on standard resolution displays. And it's not even "critical" then, it's just a nice-to-have.
But the world has increasingly moved to Retina-type displays, and there's very little reason for subpixel rendering there.
Plus it just has so many headaches, like screenshots get tied to one subpixel layout, you can't scale bitmaps, etc.
It was a temporary innovation for the LCD era between CRT and Retina, but at this point it's backwards-looking. There's a good reason Apple removed it from macOS years ago.
Even on standard resolution displays with standard subpixel layout, I see color fringing with subpixel rendering. I don't actually have hidpi displays anywhere but my phone, but I still don't want subpixel text rendering. People act like it's a panacea, but honestly the history of how we ended up with it is pretty specific and kind of weird.
> ...I see color fringing with subpixel rendering.
Have you tried adjusting your display gamma for each RGB subchannel? Subpixel antialiasing relies on accurate color space information, even more than other types of anti-aliased rendering.
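To see why gamma matters here: text antialiasing blends a coverage value between foreground and background, and doing that blend directly on gamma-encoded pixel values produces the wrong on-screen intensity. A minimal sketch (the single gamma of 2.2 is an assumption for illustration; real subpixel tuning measures each RGB channel separately):

```python
# Sketch: why gamma handling matters for antialiased/subpixel text.
# Blending coverage on gamma-encoded values (naive) vs. in linear
# light (correct). GAMMA = 2.2 is an assumed display gamma; subpixel
# AA would apply this per RGB subchannel with measured values.

GAMMA = 2.2

def naive_blend(coverage, fg, bg):
    # Blend directly on gamma-encoded values (0..1) -- a common bug
    # that makes black-on-white text look too heavy or too thin.
    return coverage * fg + (1 - coverage) * bg

def linear_blend(coverage, fg, bg, gamma=GAMMA):
    # Decode to linear light, blend there, re-encode for the display.
    fg_lin = fg ** gamma
    bg_lin = bg ** gamma
    out_lin = coverage * fg_lin + (1 - coverage) * bg_lin
    return out_lin ** (1 / gamma)

# 50% coverage of black text on a white background:
print(naive_blend(0.5, 0.0, 1.0))   # 0.5 -- displays darker than half
print(linear_blend(0.5, 0.0, 1.0))  # ~0.73 -- perceptually a mid blend
```

With subpixel rendering the same mismatch shows up as color fringing rather than just weight error, because each subchannel's blend lands at the wrong intensity independently.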
> the world has increasingly moved to Retina-type displays
Not my world. Even the display hooked up to the crispy work MacBook is still 1080p (which looks really funky on macOS for some reason).
Even in tech circles, almost everyone I know still has a 1080p laptop. Maybe some funky 1200p resolution to make the screen a bit bigger, but the world is not as retina as you may think it is.
For some reason, there's actually quite a price jump from 1080p to 4k unless you're buying a television. I know the panels are more expensive, but I doubt the manufacturers are actually paying twice the price for them.
My desktop monitor is a 47” display … also running at 4k. It’s essentially a TV, adapted into a computer monitor. It takes up the whole width of my desk.
It’s an utterly glorious display for programming. I can have 3 full width columns of code side by side. Or 2 columns and a terminal window.
But the pixels are still the “normal” size. Text looks noticeably sharper with sub-pixel rendering. I get that subpixel rendering is complex and difficult to implement correctly, but it’s good tech. It’s still much cheaper to have a low resolution display with subpixel font rendering than render 4x as many pixels. To get the same clean text rendering at this size, I’d need an 8k display. Not only would that cost way more money, but rendering an 8k image would bring just about any computer to its knees.
It’s too early to kill sub pixel font rendering. It’s good. We still need it.
Reading this message on a 4k (3840x2160 UHD) monitor I bought ten (10) years ago for $250usd.
Still bemoaning the loss of the basically impossible (50"? I can't remember precisely) 4k TV we bought that same year for $800usd when every other 4k model that existed at the time was $3.3k and up.
Its black point was "when rendering a black frame, the set appears to be 100% unpowered" and the white point was "congratulations, this is what it looks like to stare into baseball stadium floodlights". We kept it at 10% brightness as a matter of course, and even then, playing arbitrary content obviated the need for any other form of lighting in our living room and dining room combined at night.
It was too pure for this world and got destroyed by one of the kids throwing something about in the living room. :(
macOS looks garbage on non-retina displays largely because they don't do any subpixel AA for text.
1 reply →
Because Apple controls all their hardware and can assume that everyone has a particular set of features, and not care about those without. The rest of the industry doesn't have that luxury.
Apple could easily have ensured screens across their whole ecosystem had a specific subpixel alignment - yet they still nixed the feature.
2 replies →
But the world has done nothing of the sort: what's your assessment of what % of *all* displays in use are of the retina type?
The funny thing is that in some ways it's true. Modern phones are all retina (at phone sizes, even 1080p is indistinguishable from a display with no visible pixels). Tablets, even cheap ones, have impressive screen resolutions. I think the highest-res device I own may be my Galaxy Tab S7 FE at 1600x2500.
Computers, on the other hand, have stuck with 1080p, unless you're spending a fortune.
I can only attribute it to penny pinching by the large computer manufacturers, because with the high-res tablets coming to market for Chromebook prices, I doubt they're unable to put a similarly high-res display in a similarly sized laptop without bumping the price up by 500 euros like I've seen them do.
> like screenshots get tied to one subpixel layout
we could do with a better image format for screenshots - something that preserves vectors and text instead of rasterizing. HDR screenshots on Windows are busted for similar reasons.
It looks like the DisplayID standard (the modern successor to EDID) is at least intended to allow for this, per https://en.wikipedia.org/wiki/DisplayID#0x0C_Display_device_... . Do display manufacturers not implement this? Either way, it's information that could be easily derived and stored in a hardware-info database, at least for the most common display models.
I don't think any OS exposes an API for this. There's a Linux tool I sometimes use to control the brightness of my screen that works by basically talking directly to the hardware over the GPU.
Unfortunately, EDID isn't always reliable, either: you need to know the screen's orientation as well or rotated screens are going to look awful. You're probably going to need administrator access on computers to even access the hardware to get the necessary data, which can also be a problem for security and ease-of-use reasons.
Plus, some vendors just seem to lie in the EDID. Like with other information tables (ACPI comes to mind), it looks almost like they just copy the config from another product and adjust whatever metadata they remember to update before shipping.
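On Linux, the raw EDID blob is exposed read-only through the DRM subsystem, so you can at least inspect what the monitor reports without talking to the hardware directly. A minimal sketch (paths and availability vary by driver; this only validates the EDID magic header and counts extension blocks — actually locating a DisplayID display-device data block would require parsing those extensions per the DisplayID spec):

```python
# Sketch: reading raw EDID blobs exposed via Linux DRM sysfs.
# /sys/class/drm/card*-*/edid is an assumption about the typical
# layout; the file is empty when no display is connected.

from pathlib import Path

# Every valid EDID base block starts with this 8-byte magic header.
EDID_MAGIC = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid(blob: bytes):
    """Return basic info for a valid EDID base block, else None."""
    if len(blob) < 128 or blob[:8] != EDID_MAGIC:
        return None  # empty, truncated, or not an EDID base block
    # Byte 126 of the base block: number of 128-byte extension
    # blocks that follow (DisplayID data would live in one of them).
    return {"extensions": blob[126]}

def scan_connectors():
    for edid_path in Path("/sys/class/drm").glob("card*-*/edid"):
        info = parse_edid(edid_path.read_bytes())
        if info:
            print(edid_path.parent.name, info)

if __name__ == "__main__":
    scan_connectors()
```

Note this doesn't need root — the sysfs `edid` files are world-readable on typical setups — which sidesteps part of the access problem, though it does nothing about vendors lying in the blob.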
I don't understand why; this has been a thing for decades :(. The article is excellent and links to this "subpixel zoo" highlighting the variety: https://geometrian.com/resources/subpixelzoo/
“Tragedy” is a bit overstating it. Each OS could provide the equivalent of Windows’ former ClearType tuner for that purpose, and remember the results per screen or monitor model. You’d also want that in the inevitable case where monitors report the wrong layout.
Subpixel rendering isn't necessary in most languages. Bitmap fonts or hinted vector fonts without antialiasing give excellent readability. Only if the language uses characters with very intricate details such as Chinese or Japanese is subpixel rendering important.
Ah so only 20% of the global population? Nbd