
Comment by DiabloD3

1 day ago

I can almost see this complaint with poorly designed (ie, common and cheap) polarizers in LCDs and/or non-IPS/IPS-like LCDs at normal DPI.

However, that doesn't really work with OLEDs or MicroLEDs at any DPI, or any HighDPI IPS/IPS-likes.

Also, ultrawides are pretty rare. Multiple monitors have a lot more use, are a lot cheaper per pixel, and back to gaming again, a lot of games simply do not support anything but their native aspect ratio and will blackbox the viewport to prevent bugs and cheaters.

It's not a matter of viewing angles. It's a matter of distances.

Try calculating the distances from your eyes to the edges of the screen, especially for ultrawides, for which curved panels are pretty common. The standard recommended distance from head to display is 50cm (20").

  • Optimal viewing angle for most media (including text found in programs and websites made from the "Vista" and "Retina OSX" era to today, but also all movies and TV shows cut for modern 16:9 displays) is between 30 (SMPTE rec.) and 40 (THX rec.) degrees.

    THX's recommendation is based purely on the viewing angle of the fovea (the central part of the retina that is "high res"), and tries to optimize full coverage of it (ie, pixels on the screen should not fall outside of the fovea).

    Microsoft, during Vista, and Apple, during the evolution of OSX, both standardized font sizes a little larger to make all text sizes comfortable in the 30 to 40 degree range. 30 degrees works out to a viewing distance of about 1.6x the diagonal size, 40 degrees to about 1.2x.

    So, if you have a standard 24" 1080p monitor, that is 28.8 to 38.4 inches, not 20.

    SMPTE and THX did not change their recommendations for 4k, as the viewing angle of visual media (ie, the focal length in movies/TV shows) did not change, and text doubles in pixel size (but not apparent size) to accommodate it; ergo, do the math as if you're on a 100% DPI display. Ergo, a 24" 4k works out the same.

    Also, for completeness' sake: 27" 1440p monitors are rarer, but a bit more common among gamers, and their math works out to between 32.4 and 43.2 inches. This assumes you adjusted your DPI to 133% and/or you're purely focused on movies/TV and games; if you consume only text, stay at 100%, and never view media, you may prefer 24.3 to 32.4 inches instead.
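
For anyone who wants to check the 1.2x/1.6x shorthand, here's a quick sketch (my own arithmetic, assuming a flat 16:9 panel and measuring the horizontal viewing angle; the exact trig lands slightly off the rounded multipliers):

```python
import math

def viewing_distance(diagonal_in, angle_deg, aspect=(16, 9)):
    """Distance at which a screen with the given diagonal spans the
    given horizontal viewing angle (flat 16:9 panel assumed)."""
    w, h = aspect
    width = diagonal_in * w / math.hypot(w, h)  # horizontal screen width
    return width / (2 * math.tan(math.radians(angle_deg) / 2))

# 24" panel: 40 degrees (THX) is the near bound, 30 degrees (SMPTE) the far one
print(round(viewing_distance(24, 40), 1))  # 28.7 (close to 24 * 1.2 = 28.8)
print(round(viewing_distance(24, 30), 1))  # 39.0 (close to 24 * 1.6 = 38.4)
```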

    • I'm not sure I understand your thought process. I'm talking about distance errors between the center of the screen and the edges of the screen, not a single distance. Frankly, you can just paste this into ChatGPT; it's correct enough for this type of topic.

      > let distance between head to display 20 inches, screen width 20 inches as well, tell me how to calculate distance errors between center of screen to edges of screen

      ^ this yields a figure of 2.36 inches amid a few kB of padding data

      > does that mean the eyes need to be refocused when there's distance error of 2.36 inches, ok to go step by step for this

      ^ this yields a figure of 30 um amid a few kB of padding data

      > does that mean the eyes need to be refocused when there's distance error of 2.36 inches, ok to go step by step for this

      ^ this yields such elaborate responses as "Estimated DoF is about ±1 to 2 inches around the focus point", "Yes, the 2.36-inch distance error is right at or slightly beyond the typical depth of field of the human eye at 20 inches.", "The effect is subtle, but in precision tasks or long durations, your eyes may notice the strain"

      All of which are more than correct enough. BTW, the math is mostly just the basic Pythagorean theorem, so it's not hard to follow.