Comment by MBCook

15 hours ago

It amazes me it’s so hard to find monitors around 210+ PPI. Glad this is one.

That said, I would be scared to buy this. I've heard so many horror stories about the LG UltraFine 5K: ports breaking, then having to send the monitor in for lengthy repairs.

At this point I don’t trust their build quality for monitors.

In general, though, I am so glad that big high-DPI monitors finally have more than one or two options.

> It amazes me it’s so hard to find monitors around 210+ PPI.

You're in luck; several 5120 × 2880, 600 mm × 340 mm (~217 PPI at 27") monitors with high refresh rates were announced at CES a couple of weeks ago.

  MSI MPG 271KRAW16
  LG 27GM950B 
  Acer Nitro XV270X
  HKC M9 Pro
  Hisense 27GX-Pro

And many more.

  • I have waited so long for 220 DPI and a high refresh rate! Love to see it. I'll ditch my Apple Studio Display for one.

I've found it much easier to just increase viewing distance for an equivalent effect. All else being equal, this has the added benefit of reduced eye strain from a decrease in parallax.

For example, I've settled on ~160 PPI viewed at 100 cm as my optimal desktop setup. Its perceived pixel density is nearly identical to ~220 PPI viewed at 75 cm.

Use a PPD (pixels per degree) calculator to find a setup that suits your needs: https://qasimk.io/screen-ppd/
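
As a rough sanity check on those numbers, here's a sketch in Python (`ppd` is just a helper name I made up; the formula is the standard visual-angle one):

  import math

  # Pixels per degree: how many pixels fit in one degree of visual angle.
  # One degree at distance d subtends 2 * d * tan(0.5 deg).
  def ppd(ppi: float, distance_cm: float) -> float:
      distance_in = distance_cm / 2.54
      one_degree_in = 2 * distance_in * math.tan(math.radians(0.5))
      return ppi * one_degree_in

  print(round(ppd(160, 100), 1))  # ~109.9 PPD for 160 PPI at 100 cm
  print(round(ppd(220, 75), 1))   # ~113.4 PPD for 220 PPI at 75 cm

The two setups land within a few percent of each other, which is why they look equivalent in practice.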

  • I was forced to do this as my eyes have aged: I can't focus on 5K 27" at a reasonable distance, and I can't read the text when I sit far enough back to focus. Hence 4K 27" (~160 PPI) has become perfect for me.

    It would be nice if Apple supported non-integer scaling so I could just dynamically resize everything, without the performance hit and blurriness of the current technique of rendering at 2x and then downscaling.

I bought a launch LG UltraFine 5K that was in the batch of defective units, but I was too lazy to return it. Somehow, it's held up just fine a decade later; color bleeding is the only issue.

  • I think the main problem with the LG is if you charge your laptop from it. Doing that heats up the connector and pulls it from the main board.

I am still using an LG UltraFine 5K from launch. I experienced flickering in the first month and had the monitor replaced by the supplier, and it's been amazing ever since! Also, this DPI is perfect for having both crisp text and correctly sized elements on screen (in macOS).

32" 6K is very tempting!

For a long time, the only very high (>200) DPI monitors on the market were Apple's first-party ones and the LG UltraFine, the former being stupidly overpriced and the latter having, as you say, reliability horror stories. I assume the dearth of other options was because macOS doesn't do fractional scaling, only 2x, so only Apple users really needed 5K-at-27" or 6K-at-32" whereas Windows/Linux users can be ok with 150% scaling.

But that's finally changing: several high-DPI monitors came out last year, and even more are coming this year, which should force manufacturers to do better re: both price and reliability. Last year I got a pair of the Asus ProArt 5K monitors, plus a CalDigit Thunderbolt hub, and have been very happy with this setup.

  • > I assume the dearth of other options was because macOS doesn't do fractional scaling

    Except it does? I have a 14" MBP with a 3024x1964 display. By default, it uses a doubling for an effective 1512x982, but I can also select 1800x1169, 1352x878, 1147x745, or 1024x665. So it certainly does have fractional scaling options.

    If you connect a 4K (2160p) monitor, you can go down or up from the default 2x 1080p look (https://www.howtogeek.com/why-your-mac-shows-the-wrong-resol...). If you select 2560x1440 for a 4K screen, that's 150% scaling rather than 2x (https://appleinsider.com/inside/macos/tips/what-is-display-s..., see the image where it compares "native 2x scaling" to "appears like 2560x1440").

    • macOS fakes fractional scaling by rendering a larger image at 2x and then downscaling it. For example, 1800x1169 renders a 3600x2338 image at 2x, then downscales it to the panel's native 3024x1964. This is slower and looks worse than true fractional scaling would, but it makes the implementation a lot easier, and in practice it's hard to tell the difference. It'd look pretty awful if the native PPI weren't so high.
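
      A toy illustration of that pipeline (a sketch; the resolutions are the ones from this thread, and the helper function is hypothetical):

        # Hypothetical sketch of the "render at 2x, then downscale" scheme
        # described above, using the 14" MBP numbers from this thread.
        def scaled_mode(looks_like, panel):
            backing = (looks_like[0] * 2, looks_like[1] * 2)  # 2x backing store
            downscale = panel[0] / backing[0]                 # resample to panel
            return backing, downscale

        # "Looks like" 1800x1169 -> renders 3600x2338, then downscales
        # by 0.84 to fit the 3024x1964 panel.
        print(scaled_mode((1800, 1169), (3024, 1964)))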

    • I believe it was 2x only early on. But as you said, it's fractional now and has been for a long time.

      The instant Apple wanted to use a panel that wasn’t 2x, the feature appeared.


  • As a Linux user, I am confused when I hear other people talking about "scaling" and even more when they talk about being able to use only a restricted set of values for "scaling".

    For much more than a decade, I have not used any monitor with a resolution lower than 4K with Linux. I have never used any kind of "scaling", and I would not want to, because that by definition means lower image quality than you could otherwise have.

    In X Window System, and in any other decent graphic interface system, the sizes of graphic elements, e.g. the size of fonts or of document pages, should be specified in length units, e.g. typographic points, millimeters or inches.

    The graphic system knows the dots-per-inch value of the monitor (using either a value configured by the user or the value read from the monitor's EDID when the monitor is initialized). When graphic elements, such as letters, are rasterized, the algorithm uses the dimensions in length units together with the DPI value to generate the corresponding bitmap.

    "Scaling" normally refers to the scaling of a bitmap into another bitmap with a greater resolution, which can be done either by pixel interpolation or by pixel duplication. This is the wrong place for increasing the size of an image that has been generated by the rasterization of fonts and of vector graphics. The right place for dimension control is during the rasterization process, because only there this can be done without image quality loss.

    Thus there should be no "scaling"; one should just take care that the monitor DPI is configured correctly, in which case the size of the graphic elements on the screen will be independent of the resolution of the connected monitor. Using a monitor with a higher resolution should result in more beautiful letters, not smaller letters.

    Windows got this wrong, with its scaling factor for fonts, but at least in Linux XFCE this is done right, so I can set whatever DPI value I want, e.g. 137 dpi, 179 dpi, or any other value.

    If you configure the exact DPI value of your monitor, then the dimensions of a text or picture on the screen will be equal to those of the same text or picture when printed on paper.

    One may want bigger text on screen than on paper, because you normally sit at a greater distance from the monitor than the distance at which you would hold a sheet of paper or a book in your hand.

    For this, you must set a bigger DPI value than the real one, so that the rasterizer believes your screen is smaller and draws bigger letters to compensate.

    For instance, I set 216 dpi for a Dell 27-inch 4K monitor, which magnifies images on screen by about 4/3 compared with their printed size. This has nothing to do with "scaling": the rasterizer just uses the 216 dpi value, for example when rasterizing a 12-point font, such that the computed bitmap has the desired size, which is greater than its printed size by the factor I chose.
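
    To make that arithmetic concrete (a sketch; the panel numbers are for the 27-inch 4K monitor mentioned above):

      import math

      # Real pixel density of a 27" 16:9 4K panel.
      real_dpi = math.hypot(3840, 2160) / 27   # ~163 dpi
      configured_dpi = 216                     # the value set in XFCE

      print(configured_dpi / real_dpi)         # ~1.32, i.e. roughly 4/3

      # The rasterizer converts point sizes to pixels using the DPI value;
      # a 12 pt glyph is 12/72 inch tall.
      print(12 / 72 * configured_dpi)          # 36 px on screen
      print(12 / 72 * real_dpi)                # ~27 px at the panel's true dpi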

    • It's probably called scaling because that's what other OSes do.

      For example macOS just renders at 200% and then scales down to the desired level.

      Linux is indeed way better at this.


    • Doesn't this depend on the application? For example, Electron applications dgaf about this system, render to a bitmap, and then look terrible as a result.