Comment by piker

14 hours ago

The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off. For example, it's clear that a lot of the Rust UI framework developers have been working on Macs for the last few years. The font rendering in many of those frameworks looks bad once you plug them into a more normal-DPI monitor. If they hadn't been using Macs with Retina displays, they would have noticed.

This is more widespread than we like to admit.

Developers writing software on 64GB M4 Macs often don't realize the performance bottlenecks of the software they write.

Developers working over 1gbps Internet connections often don't realize the data gluttony of the software they write.

Developers writing services on unlimited cloud budgets often don't realize the resource waste their software incurs.

And to extend this to society in general.

Rich people with nice things often alienate themselves from the reality of the majority of people in the world.

  • You can nerf network performance in the browser devtools or underprovision a VM relatively easily on these machines. People sometimes choose not to, and others are ignorant. Most of the time, though, it's just that they're juggling too many vaguely defined things, which makes it difficult to prioritize the seemingly less important ones.

    A number of times, around some customer complaint, I've had to have a framing discussion with a dev that eventually gets to me asking, "What kind of computer do your (grand)parents use? How might X perform there?" Other times, I've heard devs comment negatively after the holidays when they've tried their product on a family computer.

    • > Other times, I've heard devs comment negatively after the holidays when they've tried their product on a family computer.

      I worked for a popular company and went to visit family during the winter holidays. I couldn't believe how many commercials there were for said company's hot consumer product (I haven't had cable or over-the-air television since well before streaming was a thing, so this was a new experience in the previous five years).

      I concluded that if I had cable and didn't work for the company, I'd hate them due to the bajillion loud ads. My family didn't seem to notice. They tuned out all the commercials, as did a friend when I was at his place around a similar time.

      All it takes is a change in perspective to see something in an entirely new light.

    • More to the point: colour and font rendering are typically "perception" questions and very hard to measure in a deployed system without introducing a significant out-of-band element.

      Network performance can be trivially measured across your users, and most latency/performance/bandwidth issues can be identified clearly.

    • Chrome devtools allow you to simulate low network and CPU performance, but I'm not aware of any setting that gives you pixelated text and washed-out colors. Maybe that will make a useful plugin, if you can accurately reproduce what Microsoft ClearType does at 96dpi!
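
      For what it's worth, the network and CPU part of this can also be scripted instead of toggled by hand in the DevTools UI. A minimal sketch, assuming Puppeteer and a placeholder URL (the throttling numbers are illustrative; the low-DPI rendering part still has no equivalent):

        import puppeteer from 'puppeteer';

        const browser = await puppeteer.launch();
        const page = await browser.newPage();
        const cdp = await page.createCDPSession();

        // Roughly "slow 3G": ~400 kbit/s down, ~300 kbit/s up, 400 ms latency.
        await cdp.send('Network.enable');
        await cdp.send('Network.emulateNetworkConditions', {
          offline: false,
          latency: 400,                         // ms
          downloadThroughput: (400 * 1000) / 8, // bytes per second
          uploadThroughput: (300 * 1000) / 8,   // bytes per second
        });

        // Slow the CPU down 6x, closer to a low-end phone or an old laptop.
        await cdp.send('Emulation.setCPUThrottlingRate', { rate: 6 });

        await page.goto('https://example.com'); // placeholder URL
        // ...click around / measure load and interaction timings here...
        await browser.close();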

  • > Developers working over 1gbps Internet connections often don't realize the data gluttony of the software they write.

    As a developer and AirBnB owner, what I’ve also noticed is the gluttony of the toolchain. I’ve had complaints about a 500/30 connection from remote-working devs (very clear from the details they give), which is the fastest you can get in much of the metro I am in.

    At home I can get up to 5/5 Gbps on fiber because we’re in a special permitting corridor and AT&T can basically do whatever they want with their fiber, using an old discontinued sewer run as their conduit.

    I stick to the 1/1 Gbps tier and get 1.25 for “free” since we’re so over-provisioned. The fastest Xfinity provides in the same area as my AirBnB is an unreliable 230/20 Mbps, which means my “free” excess bandwidth is higher than what many people near me can pay for.

    I expect that as a result of all this, developers on very fast connections end up with enough layers of corporate VPN, poorly optimized pipelines, and dependencies on external servers that by the time you’re connected to work, your 1/1 Gbps connection is down to about 300/300 Mbps (at least mine is). So the expectation is silently set that very fast internet will exist for on-corp survival, and that the off-corp experience is what others have.

    • OT, but leaving the zeros on those gigabit numbers would make this a lot less work to understand; at first I thought maybe you were in Mbps throughout.

    • Not only bandwidth but also latency can vary dramatically depending on where you are. Some of your guests might have been trying to connect to a VPN that tunnels all their traffic halfway around the world. That's much, much worse than getting a few hundred Mbps less bandwidth.

  • I wish we could have this as a permanent sticky for this website. It's out of control, especially with web stuff.

    Spotify's webapp, for example, won't even work on my old computer, whereas YouTube and other things that you'd think would be more resource intensive work without any issue whatsoever.

  • I tend to use older hardware and feel like I’m constantly fighting this battle. It’s amazing the hardware we have, and yet I have to wait dozens of seconds to start an app or load a web page.

    • Sometimes I run "old software" on the latest hardware it can support (think Windows 2000 on 2010s machines) and it is amazing how fast it flies.

    • I would like to ask why even fight the battle?

      Philosophically I am with you, e-waste and consumerism are bad, but pragmatically it is not worth punishing yourself from a dollars and cents standpoint.

      You can jump on Amazon and buy a mini PC in the $300 range that’s got an 8 core 16 thread AMD 6800H CPU, 16GB RAM, 500GB SSD, basically a well above-average machine, with upgradable RAM and storage. $240 if you buy it on AliExpress.

      You can grab a MacBook Air M2 for around $500.

      Why suffer with slow hardware? Assuming that using a computer is at least somewhat important to your workflow.

  • At a "rich world" company that wants to make money, it's completely rational to not give a shit about "poor world" people that won't make you much money (relatively speaking) anyways. It basically only makes sense to milk the top leg of the K-shaped economy.

    Conversely, it opens up a niche for "poor world" people to develop local solutions for local challenges, like mobile payments in India and some of Africa.

  • You need two "classes" of developers, which may be the exact same people: those who are on the fastest, biggest hardware money can buy, but also those who spend some time running on nearly the worst hardware you can find.

  • I agree, but developers don't have freedom over the product. Product managers are the ones who have a say, and even then, they are in a strict hierarchy, often ending at "shareholders". So, many of the wrongs come from the system itself. It's either systemic change (at least an upgrade), or no meaningful change.

> The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off.

That’s not the problem of using this monitor for creating the work; that’s the problem of not also using a more typical monitor (or, better, an array covering the common use cases, though which is practical depends on whether you are talking about a solo creator or a bigger team) for validating the work.

Just as with software, developers benefit from a more powerful machine for developing, but the product benefits from also being tested on machines more like the typical end-user setup.

Yes! I’m glad to see this pointed out - when working on UIs, I regularly move them between 3 monitors with varying resolution and DPI: 4K at 200%, 2K at 125%, and 2K at 100%. This reveals not only design issues but also application-stack issues with DPI support.

As a designer, one should keep a couple of cheap, low-res monitors reset to the factory defaults for proofing what many users are going to see.

  • This is probably one of the few things I think works better in an office environment. There was older equipment hanging around, with space to set it up in a corner so people could sit down and just go. When mobile came along, there would be a sustainable lending program for devices.

    With more people being remote, this either doesn't happen, or is much more limited. Support teams have to repro issues or walk through scenarios across web, iOS, and Android. Sometimes they only have their own device. Better places will have some kind of program to get them refurb devices. Most times though people have to move the customer to someone who has an iPhone or whatever.

  • I must confess I felt a lot of lust looking at the self color calibration feature.

    It is extremely useful if your work ends up on paper. For photography (edit: film and broadcast, too) it would be great.

    My use cases are comics and illustration, so a self-color-correcting Cintiq or tablet would be great for me.

    • I like having a color calibrated monitor but at the end of the day it’s about trusting my scopes too. Audio unfortunately has this perception element that for some reason doesn’t seem as big of an issue with video. We have dB/loudness standards for a reason, but different stuff just sounds louder or softer no matter what.

      If it looks good on a Mac laptop screen/iMac and the scopes look right, it’s good for 99%+ of viewers. You can basically just edit visually off any Mac laptop from the last 10 years and you’ll probably be happy tbh.

  • This exactly. Same as people do for sound: listen in the car, over shitty headphones, etc. - that's just quality control, not the fault of any piece of equipment.

    • Yes this is universal in pro mixing setups, having filters or even actual physical hardware to provide the sound of stock earbuds, a crappy Bluetooth speaker, sound system in a minivan, etc.

    • Well, of course it's a good idea to double-check with various output methods. But if a mix sounds good on studio monitors with the flattest possible frequency response (preferably even calibrated with an internal DSP) in an acoustically treated room, there's a very high probability it will sound good on almost anything out there. At least that's my experience.

I make a point of keeping my secondary monitor a "normal" DPI 2560x1440 display precisely to avoid this kind of problem. The loss of legibility has little impact on secondary monitor use cases, and I can easily spot-check my UI and UI graphics work by simply dragging the window over.

High quality normal DPI monitors are so cheap these days that even if multi-monitor isn't one's cup of tea there's not really a good reason to not have one (except maybe space restrictions, in which case a cheap ~16" 1080p/1200p portable monitor from Amazon will serve the purpose nicely).

Also, it's not only about the screen resolution. Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

I've reported many issues where, to reproduce them, they needed to enable 10x throttling in the browser. Or use a Windows machine.

  • > Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

    Part of what QA testing should be about: performance regressions.

    • Usually it is not a priority, especially for enterprise software. It's OK if the UI is lagging, the page takes 10 seconds to finish loading, etc., simply because there is usually no other choice: people have to use whatever is bought/developed.

This is exactly how sound studios do mixing. They don't just use top-end monitors -- they generally also listen on low-end speakers that color sound in a way that's representative of what people have at home (hello, Yamaha NS-10).

  • People used to buy NS-10s because they knew professional studios used them. They were then underwhelmed when they sounded worse than the hifi speakers they had at home.

    Many audio engineers live by the mantra "if it sounds good on NS-10s, it'll sound good on anything".

    We need such a touchstone for software engineers.

    • It'd be a moving touchstone, that's the problem; speakers in the consumer space don't evolve as fast as computing tech in the user space.

      You could get somewhat close by looking at what was a middle of the road consumer laptop from Dell/HP/Lenovo 5 years ago and buying one of those though.

Conversely if you only use a ~110 DPI display you won't know how bad it looks on a ~220 DPI display.

The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.

  • Yeah sure, as long as you have a lot of resources for testing widely.

    Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.

  • I can’t tell you how often I see this. Brand new designs or logos in 2024 or 2025 that look abysmal on a retina monitor because no one bothered to check.

    Stands out like a sore thumb.

This is just as valid for mobile app and website development.

When all you use for testing is Browserstack, local emulators and whatnot, and only the latest iPhone and Samsung S-series flagship, your Thing will be unusable for large parts of the population.

Always, always use at the very least the oldest iPhone Apple still supports, the cheapest and oldest (!) Samsung A-series models still being sold in retail stores as "new", and at least one Huawei and Xiaomi device. And then, don't test your Thing only on wifi backed by your Gbit Wifi 7 router and uplink. Disable wifi and limit mobile data to 2G or whatever is the lowest your phone provider supports.

And then, have someone from QA visit the countryside with long stretches of no service at all or serious degradation (think packet loss rates of 60% or more, latencies of 2 seconds+). If your app survives this with minimal loss of functionality, you did good.

A bunch of issues will only crop up in real-world testing. Stuff like using fresh-from-scratch SSL connections for each interaction instead of keeping a single socket to the mothership open is the main bummer... latency really, really eats such bottlenecks alive. Or forgotten async handling leading to non-responsiveness of the main application. You won't catch that, not even with Chrome's network inspector - you won't feel the sheer rage of an end user who has a pressing need and is let down by your Thing. Even if you're not responsible for their shitty phone service, they will associate the bad service with your app.
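
To make the first of those concrete, here is a minimal sketch in Node/TypeScript (the host and path are hypothetical) of the difference between paying a fresh TCP + TLS handshake on every interaction and reusing one keep-alive socket. On a high-latency link, the former adds several round trips to every single call.

    import https from 'node:https';

    // One agent that keeps the socket (and its TLS session) alive across calls.
    const keepAlive = new https.Agent({ keepAlive: true, maxSockets: 1 });

    function getStatus(agent: https.Agent | false): Promise<number> {
      return new Promise((resolve, reject) => {
        https
          .get(
            { host: 'api.example.com', path: '/status', agent }, // hypothetical endpoint
            (res) => {
              res.resume(); // drain the body
              res.on('end', () => resolve(res.statusCode ?? 0));
            },
          )
          .on('error', reject);
      });
    }

    // agent: false forces a brand-new connection per request, so every call pays
    // DNS + TCP + TLS setup again; at 2-second latencies that is ruinous.
    // await getStatus(false);

    // The keep-alive agent performs the handshake once; later calls reuse the socket.
    // await getStatus(keepAlive);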

Oh, and also test getting interrupted while using your Thing on the cheap-ass phones. WhatsApp and FB Messenger calls, for example - these gobble so much RAM that your app or browser will get killed by the OOM killer or battery saver, and when the user's interruption is over, if you didn't do it right, your Thing's local state will have been corrupted or removed, leaving the user to start from scratch!
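
On that last point, for the web-app flavor of the problem, a minimal sketch of the "do it right" part (the storage key and Draft shape are made up): persist the in-progress state on every meaningful change and restore it at startup, so an OOM or battery-saver kill doesn't cost the user their work.

    // Hypothetical shape of whatever in-progress state your Thing tracks.
    interface Draft {
      text: string;
      updatedAt: number;
    }

    const DRAFT_KEY = 'myapp.draft'; // made-up storage key

    function saveDraft(draft: Draft): void {
      // Write on every meaningful change, not only on "submit":
      // the process can be killed at any moment.
      localStorage.setItem(DRAFT_KEY, JSON.stringify(draft));
    }

    function loadDraft(): Draft | null {
      try {
        const raw = localStorage.getItem(DRAFT_KEY);
        return raw ? (JSON.parse(raw) as Draft) : null;
      } catch {
        // Corrupted or partial write: better to start clean than to crash.
        localStorage.removeItem(DRAFT_KEY);
        return null;
      }
    }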

> The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off.

Really? It's not a problem for photo retouchers, for whom a monitor like this is basically designed.