Comment by badc0ffee

7 hours ago

I would love to see examples of this. I have an MBP and a 24" 4K Dell monitor connected via HDMI. I use all kinds of scaled resolutions and I've never noticed anything being jagged or blurry.

Meanwhile in Linux the scaling is generally good, but occasionally I'll run into some UI element that doesn't scale properly, or some application that has a tiny mouse cursor.

And then Windows has serious problems with old apps - blurry as hell with a high DPI display.

Subpixel antialiasing isn't something I miss on macOS because it seems pointless at these resolutions [0]. And I don't think it would work with OLED anyway, because the subpixels are arranged differently from those on a conventional LCD.

[0] I remember being excited by ClearType on Windows back in the day, and I did notice a difference. But there's no way I'd be able to discern it on a high DPI display; the conventional antialiasing macOS does is enough.

I'm more surprised that you're using a 24" display at any resolution. Of course, everyone has different preferences, but that just seems ridiculously small considering how readily available larger displays are, probably at the same ppi and refresh rate.

I'm personally on the old 30" 16:10 2560x1600 form factor, and it's wildly better visually than the 27" 1440p screen from the same brand (both Dell) I use at the office.
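For reference, ppi is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch, plugging in the display sizes mentioned in this thread (the 24" 4K figure assumes a 3840x2160 panel):

```python
import math

def ppi(diagonal_inches, width_px, height_px):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Displays mentioned in this thread:
print(round(ppi(30, 2560, 1600)))  # 30" 16:10  -> ~101 ppi
print(round(ppi(27, 2560, 1440)))  # 27" 1440p  -> ~109 ppi
print(round(ppi(24, 3840, 2160)))  # 24" 4K     -> ~184 ppi
```

The 30" and 27" panels land within about 10% of each other in density, so the difference there is mostly size and aspect ratio rather than sharpness; the 24" 4K is in a different (HiDPI) class entirely.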

I have a MacBook Pro and a Linux machine attached to my dual 4K monitors.

Fonts on Linux (KDE Plasma on Wayland) look noticeably sharper than the Mac. I don't use subpixel rendering either. I hate that I have to use the Mac for work.