
Comment by stephenr

14 hours ago

Conversely, if you only use a ~110 DPI display, you won't know how bad it looks on a ~220 DPI display.

The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.

Yeah sure, as long as you have a lot of resources for testing widely.

Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.

  • > if you were to make an analogy, you should target a few devices that represent the "average"

    For Macs, 220 DPI absolutely is the average.

    • Sure, but Macs are around 10% of general desktop computing, so to a first approximation they don't count, though user communities vary widely. If you target Macs, then a high-DPI screen is a must for testing. Otherwise, I dunno; ~100 DPI screens are way less expensive than ~200 DPI screens, so I'd expect the installed base is significantly higher for standard DPI. But there are probably enough high-DPI users that it's worth giving it a look.

      To address a question elsewhere: personally, I don't see the benefit of pushing 4x the pixels when ~100 DPI works fine for me. My eyes aren't what they were 20 years ago, and it's just extra expense at every level.

    • I'm honestly not sure where all these Hacker News commenters with low-DPI displays are coming from: my former employers equipped all the software engineers with dual-4K displays nearly a decade ago.

      One is hard-pressed to buy a developer-grade laptop with a sub-2K display these days, even in the Windows world, and >2K desktop displays have been cheap for a really long time.

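To make the "wide device testing" point above a bit more concrete: you don't need a desk full of monitors to catch high-DPI rendering problems, since a headless browser can render the same page at several device pixel ratios. A minimal sketch using Playwright in TypeScript; the URL and output paths are placeholders, not anything from the thread:

```typescript
// Minimal sketch (not from the thread): render the same page at several
// device pixel ratios with Playwright and save screenshots for comparison.
import { chromium } from "playwright";

async function captureAtScaleFactors(url: string, scaleFactors: number[]) {
  const browser = await chromium.launch();
  for (const deviceScaleFactor of scaleFactors) {
    const context = await browser.newContext({
      viewport: { width: 1280, height: 800 },
      deviceScaleFactor, // 1 ≈ ~100 DPI desktop, 2 ≈ "retina" / ~220 DPI
    });
    const page = await context.newPage();
    await page.goto(url);
    await page.screenshot({ path: `shot-${deviceScaleFactor}x.png`, fullPage: true });
    await context.close();
  }
  await browser.close();
}

// Hypothetical usage: compare the 1x and 2x renders side by side.
captureAtScaleFactors("https://example.com", [1, 2]).catch(console.error);
```

Comparing the 1x and 2x screenshots is usually enough to spot blurry logos or mis-scaled raster assets without owning a high-DPI monitor.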

I can’t tell you how often I see this: brand-new designs or logos from 2024 or 2025 that look abysmal on a retina monitor because no one bothered to check.

Stands out like a sore thumb.
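One cheap guard against exactly this problem is to check the device pixel ratio at runtime, or better, let the browser pick the right asset via srcset. A minimal TypeScript sketch, assuming a hypothetical element id and asset paths:

```typescript
// Minimal sketch (not from the thread): swap in a 2x logo on high-DPI
// ("retina") displays. The element id and asset paths are hypothetical.
const logo = document.querySelector<HTMLImageElement>("#logo");

if (logo && window.devicePixelRatio >= 2) {
  // ~220 DPI screens report a devicePixelRatio of 2, so serve the
  // double-resolution bitmap instead of letting the 1x image upscale.
  logo.src = "/assets/logo@2x.png";
}

// Declarative alternative: let the browser choose via srcset, no script needed.
// <img src="/assets/logo.png"
//      srcset="/assets/logo.png 1x, /assets/logo@2x.png 2x"
//      alt="Logo">
```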