
Comment by _aavaa_

3 days ago

I'd wager that a 2021 MacBook, like the one I have, is stronger than the laptop used by the majority of people in the world.

Life on an entry-level or even mid-level Windows laptop is a very different world.

Yep. Developers make programs run well enough on the hardware sitting on our desks. So long as we’re well paid (and have decent computers ourselves), we have no idea what the average computing experience is for people still running 10-year-old computers that were slow even for their day. And that keeps the treadmill going: we make everyone need to upgrade every few years.

A few years ago I accidentally left my laptop at work on a Friday afternoon. Instead of going into the office, I pulled out a first-generation Raspberry Pi and got everything set up on that. Needless to say, our Node.js app started pretty slowly. Not for any good reason - there were a couple of modules which pulled in huge amounts of code we didn’t use anyway. A couple of hours’ work made the whole app start 5x faster and use half the RAM. I would never have noticed there was a problem on my snappy desktop.
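
For flavor, here's a minimal sketch of the kind of change involved - not the actual code, and "pdf-renderer" is just a hypothetical stand-in for a heavy dependency - deferring the expensive import until it's needed instead of paying for it at startup:

    // Before: the heavy module is parsed and initialized at startup,
    // even if nothing in this process ever ends up calling it.
    // import { renderPdf } from "pdf-renderer"; // hypothetical heavy dependency

    // After: load it lazily on first use. Node caches the dynamic import,
    // so later calls don't pay the cost again.
    export async function exportReport(data: unknown): Promise<Buffer> {
      const { renderPdf } = await import("pdf-renderer");
      return renderPdf(data);
    }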

  • > Yep. Developers make programs run well enough on the hardware sitting on our desks. So long as we’re well paid (and have decent computers ourselves), we have no idea what the average computing experience is for people still running 10-year-old computers that were slow even for their day. And that keeps the treadmill going: we make everyone need to upgrade every few years.

    Same thing happens with UI and website design. When the designers and front-end devs all have top-spec MacBooks with 4K+ displays, they design for what looks good in that environment.

    Then you ship to the rest of the world, which is still for the most part on 16:9 1920x1080 (or, god forbid, 1366x768) low-spec Windows laptops, and the UI looks like shit and is borderline unstable.

    Now I don't necessarily think things should be designed for the lowest common denominator, but at the very least we should take into consideration that the majority of users probably don't have super high-end machines or displays. Even today you can buy a brand-new "budget" Windows laptop that'll come with 8 GB of RAM and a tiny 1920x1080 display with poor color reproduction and crazy low brightness - and that's what the majority of people are using, if they're using a computer at all and not a phone or tablet.

I've found so many performance issues at work by booting up a really old laptop or working remotely from another continent. It's pretty straightforward to simulate either poor network conditions or generally low-performance hardware, but we just don't generally bother to chase down those issues.
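
As a rough sketch of what I mean - assuming a Puppeteer-driven Chrome - the DevTools protocol lets you throttle both the network and the CPU before loading a page (the numbers here are illustrative, not a standard profile):

    import puppeteer from "puppeteer";

    // Throttle network and CPU via the Chrome DevTools Protocol so the app
    // gets exercised under roughly "slow connection, weak machine" conditions.
    async function runThrottled(url: string): Promise<void> {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      const cdp = await page.createCDPSession();

      await cdp.send("Network.emulateNetworkConditions", {
        offline: false,
        latency: 400,                  // ms of added round-trip latency
        downloadThroughput: 50 * 1024, // bytes per second (~400 kbit/s)
        uploadThroughput: 20 * 1024,
      });

      // Make the CPU behave roughly 6x slower than the dev machine.
      await cdp.send("Emulation.setCPUThrottlingRate", { rate: 6 });

      const start = Date.now();
      await page.goto(url, { waitUntil: "load" });
      console.log(`Loaded ${url} in ${Date.now() - start} ms under throttling`);

      await browser.close();
    }

    runThrottled("https://example.com").catch(console.error);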

  • Oh yeah, I didn't even touch on devs being used to working on super fast internet.

    If you're on a Mac, go install Network Link Conditioner and crank the download and upload speeds way down. (Xcode > Open Developer Tools > More Developer Tools... > "Additional Tools for Xcode {Version}").

When I bought my current laptop, it was the cheapest one Costco had with 8 gigs of memory, which was plenty at the time for all but specialized uses. I've since upgraded it to 16, which feels like the current standard.

But...why? Why on earth do I need 16 gigs of memory for web browsing and basic application use? I'm not even playing games on this thing. But there was an immediate, massive spike in performance when I upgraded the memory. It's bizarre.

  • Most cheap laptops these days ship with only one stick of RAM, and thus operate only in single-channel mode. By adding another memory module, you can operate in dual-channel mode, which can increase performance a lot. You can see the difference by running a full memory test in single-channel vs. dual-channel mode with a program like MemTest86 or Memtest86+ or similar tools. A rough back-of-the-envelope calculation is sketched below.
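
    Back-of-the-envelope, assuming DDR4-3200 (the real-world gain is smaller than the theoretical doubling, but it's often very noticeable):

        // Theoretical peak bandwidth of one DDR4-3200 channel:
        // 3200 million transfers/s * 8 bytes per transfer (64-bit bus).
        const transfersPerSecond = 3200e6;
        const bytesPerTransfer = 8;

        const singleChannel = transfersPerSecond * bytesPerTransfer; // ~25.6 GB/s
        const dualChannel = 2 * singleChannel;                       // ~51.2 GB/s

        console.log(`single-channel: ${singleChannel / 1e9} GB/s`);
        console.log(`dual-channel:   ${dualChannel / 1e9} GB/s`);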