Comment by crazygringo
2 days ago
Only on standard resolution displays. And it's not even "critical" then, it's just a nice-to-have.
But the world has increasingly moved to Retina-type displays, and there's very little reason for subpixel rendering there.
Plus it just has so many headaches, like screenshots get tied to one subpixel layout, you can't scale bitmaps, etc.
It was a temporary innovation for the LCD era between CRT and Retina, but at this point it's backwards-looking. There's a good reason Apple removed it from macOS years ago.
Even on standard resolution displays with a standard subpixel layout, I see color fringing with subpixel rendering. I don't actually have HiDPI displays anywhere but my phone, but I still don't want subpixel text rendering. People act like it's a panacea, but honestly the history of how we ended up with it is pretty specific and kind of weird.
> ...I see color fringing with subpixel rendering.
Have you tried adjusting your display gamma for each RGB subchannel? Subpixel antialiasing relies on accurate color space information, even more than other types of anti-aliased rendering.
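To illustrate why the gamma adjustment matters, here's a minimal sketch in Python (not any particular rasterizer's actual code). With subpixel AA each R/G/B stripe gets its own coverage value, and if the blend happens directly on gamma-encoded sRGB values instead of in linear light, the mid-coverage stripes come out too dark or too light relative to each other, which reads as colored fringing. The function names are made up for the example; the transfer functions are the standard sRGB ones.

    def srgb_to_linear(c):
        """sRGB component in [0, 1] -> linear light."""
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        """Linear light in [0, 1] -> sRGB component."""
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    def blend_gamma_correct(fg, bg, coverage):
        """Blend fg over bg per subchannel, doing the math in linear light."""
        out = []
        for f, b, a in zip(fg, bg, coverage):
            lin = srgb_to_linear(f) * a + srgb_to_linear(b) * (1 - a)
            out.append(linear_to_srgb(lin))
        return tuple(out)

    def blend_naive(fg, bg, coverage):
        """Same blend done directly on sRGB values -- the fringe-prone way."""
        return tuple(f * a + b * (1 - a) for f, b, a in zip(fg, bg, coverage))

    # A black-on-white glyph edge covering the R stripe fully, G halfway, B not at all:
    edge_coverage = (1.0, 0.5, 0.0)
    print(blend_gamma_correct((0, 0, 0), (1, 1, 1), edge_coverage))  # ~(0.00, 0.74, 1.00)
    print(blend_naive((0, 0, 0), (1, 1, 1), edge_coverage))          # (0.0, 0.5, 1.0): G too dark, visible fringe

A display whose per-channel gamma is off has a similar effect to the naive blend: the renderer's assumptions about how coverage maps to brightness stop holding, channel by channel.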
> the world has increasingly moved to Retina-type displays
Not my world. Even the display hooked up to the crispy work MacBook is still 1080p (which looks really funky on macOS for some reason).
Even in tech circles, almost everyone I know still has a 1080p laptop. Maybe some funky 1200p resolution to make the screen a bit bigger, but the world is not as retina as you may think it is.
For some reason, there's actually quite a price jump from 1080p to 4K unless you're buying a television. I know the panels are more expensive, but I doubt the manufacturers are really paying twice as much for them.
My desktop monitor is a 47” display … also running at 4k. It’s essentially a TV, adapted into a computer monitor. It takes up the whole width of my desk.
It’s an utterly glorious display for programming. I can have 3 full width columns of code side by side. Or 2 columns and a terminal window.
But the pixels are still the “normal” size. Text looks noticeably sharper with sub-pixel rendering. I get that subpixel rendering is complex and difficult to implement correctly, but it’s good tech. It’s still much cheaper to have a low resolution display with subpixel font rendering than render 4x as many pixels. To get the same clean text rendering at this size, I’d need an 8k display. Not only would that cost way more money, but rendering an 8k image would bring just about any computer to its knees.
It’s too early to kill subpixel font rendering. It’s good. We still need it.
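Rough arithmetic behind the pixel-count point above (actual rendering cost depends on the GPU and compositor, but fill-rate scales roughly with pixel count):

    # An 8K panel has 4x the pixels of a 4K panel, and 16x a 1080p one.
    resolutions = {
        "1080p": (1920, 1080),
        "4K UHD": (3840, 2160),
        "8K UHD": (7680, 4320),
    }

    base = resolutions["4K UHD"][0] * resolutions["4K UHD"][1]
    for name, (w, h) in resolutions.items():
        px = w * h
        print(f"{name:<7} {px / 1e6:5.1f} Mpx  ({px / base:.2f}x a 4K frame)")

    # 1080p     2.1 Mpx  (0.25x a 4K frame)
    # 4K UHD    8.3 Mpx  (1.00x a 4K frame)
    # 8K UHD   33.2 Mpx  (4.00x a 4K frame)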
Reading this message on a 4K (3840x2160 UHD) monitor I bought ten years ago for $250 USD.
Still bemoaning the loss of the basically impossible 4K TV (50"? I can't remember precisely) we bought that same year for $800 USD, when every other 4K model that existed at the time was $3,300 and up.
Its black point was "when rendering a black frame, the set appears 100% unpowered" and its white point was "congratulations, this is what it looks like to stare into baseball stadium floodlights". We kept it at 10% brightness as a matter of course, and even then, playing arbitrary content obviated the need for any other form of lighting in our living room and dining room combined at night.
It was too pure for this world and got destroyed by one of the kids throwing something about in the living room. :(
macOS looks like garbage on non-Retina displays largely because it doesn't do any subpixel AA for text.
AFAIK macOS looks like garbage on standard resolution displays mainly because it doesn't do any grid-fitting.
I (on Linux) don't use any subpixel AA (just regular grayscale AA), but I use aggressive grid-fitting and I have nice, sharp text.
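For reference, here's roughly what that difference looks like at the FreeType level, using the freetype-py bindings (pip install freetype-py). This is just a sketch: the font path is a placeholder for whatever TTF you have installed, and grayscale AA with the hinter left enabled is only an approximation of the "aggressive grid-fitting, no subpixel" setup described above. FT_LOAD_TARGET_LCD is the subpixel path, which hands back a bitmap three times as wide because each pixel's R/G/B stripes get separate coverage values.

    import freetype

    # Placeholder path -- point this at any TrueType font on your system.
    face = freetype.Face("/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf")
    face.set_char_size(12 * 64)  # 12 pt, in FreeType's 26.6 fixed-point units

    # Grayscale AA with grid-fitting (hinting) left enabled:
    face.load_char("g", freetype.FT_LOAD_RENDER | freetype.FT_LOAD_TARGET_NORMAL)
    gray = face.glyph.bitmap
    print("grayscale:", gray.width, "x", gray.rows, "(one coverage value per pixel)")

    # Subpixel (LCD) rendering: bitmap width is 3x the pixel width:
    face.load_char("g", freetype.FT_LOAD_RENDER | freetype.FT_LOAD_TARGET_LCD)
    lcd = face.glyph.bitmap
    print("lcd:", lcd.width, "x", lcd.rows, "(separate value per R/G/B stripe)")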
Because Apple controls all their own hardware and can assume that everyone has a particular set of features, without having to care about those who don't. The rest of the industry doesn't have that luxury.
Apple could easily have ensured screens across their whole ecosystem had a specific subpixel alignment - yet they still nixed the feature.
The artifacts created by subpixel AA are dumb and unnecessary when the pixel density is high enough for grayscale AA to look good. Plus, subpixel AA creates further artifacts under display scaling. (Not that display scaling itself is artifact-free - I can't tolerate the scaling artifacts on the iPad, for example.)
But the world has done nothing of the sort: what's your assessment of what percentage of *all* displays in use are of the Retina type?
The funny thing is that in some ways it's true. Modern phones are all Retina (because even 1080p at phone size is indistinguishable from pixel-less). Tablets, even cheap ones, have impressive screen resolutions. I think the highest-resolution device I own may be my Galaxy Tab S7 FE at 1600x2500.
Computers, on the other hand, have stuck with 1080p, unless you're spending a fortune.
I can only attribute it to penny-pinching by the large computer manufacturers: with high-res tablets coming to market at Chromebook prices, I find it hard to believe they couldn't put a similarly high-res display in a similarly sized laptop without bumping the price up by 500 euros, like I've seen them do.
> like screenshots get tied to one subpixel layout
We could do with a better image format for screenshots - something that preserves vectors and text instead of rasterizing everything. HDR screenshots on Windows are busted for similar reasons.