Comment by spijdar
19 hours ago
I actually think about this a lot, and I could argue both sides of it. On the one hand, you could read your list of examples as obvious modern innovations/improvements that enrich our lives. On the other, you could take it as a facetious list that proves GP's point, as one other commenter apparently already has.
I often think about how stupid video call meetings are. Teams video calls are one of the few things that make every computer I own, including my M1 MBP, run its fans at full tilt. I've had my phone give me overheat warnings just from showing the tile board of bored faces staring blankly at me. And yeah, honestly, it feels like a solution looking for a problem. I understand that it's not, and that some people are obsessed for various reasons (some more legitimate than others) with recreating the conference room vibe, but still.
And with monitors? This is a far more "spicy" take, but I think 1280x1024 is actually fine. Even 1024x768. Now, I have a 4K monitor at home, so don't get me wrong: I like my high DPI monitor.
But I think past 1024x768, the actual productivity gains from higher resolutions begin to rapidly dwindle. 1920x1080, especially on "small" displays (under 20 inches), can look pretty visually stunning. 4K is definitely nicer, but do we really need it?
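For scale, here's the trivial arithmetic behind that claim: raw pixel counts for the resolutions mentioned, relative to a 1024x768 baseline (nothing assumed here beyond the resolutions themselves).

```python
# Pixel counts for the resolutions discussed above,
# expressed as multiples of the 1024x768 baseline.
resolutions = {
    "1024x768": (1024, 768),
    "1280x1024": (1280, 1024),
    "1920x1080": (1920, 1080),
    "3840x2160 (4K)": (3840, 2160),
}

base = 1024 * 768  # 786,432 pixels

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} px ({px / base:.1f}x baseline)")
```

4K pushes roughly 10x the pixels of 1024x768, which is exactly the kind of quantitative-but-not-qualitative jump the rest of this comment is talking about.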
I'm not trying to get existential with this, because what do we really "need"? But I think that, objectively, computing divides into two very broad eras. The first era, ending around the mid 2000s, was marked by year-after-year innovation where every 2-4 years brought new features that solved _real problems_, as in, features that gave users new qualitative capabilities. Think 24-bit color vs 8-bit color, or 64-bit vs 32-bit (or even 32-bit vs 16-bit). Having a webcam. Having 5+ hours of battery life on a laptop, with a real backlit AMLCD display. Having more than a few gigabytes of internal storage. Having a generic peripheral bus (USB/FireWire). Having PCM audio. Having 3D hardware acceleration...
I'm not prepared to vigorously defend this thesis ;-) but it seems at about 2005-ish, the PC space had reached most of these "core qualitative features". After that, everything became better and faster, quantitatively superior versions of the same thing.
And sometimes yeah, it can feel both like it's all gone to waste on ludicrously inefficient software (Teams...), and sometimes, like modern computing did become a solution in search of a problem, in order to keep selling new hardware and software.
> But I think past 1024x768, the actual productivity gains from higher resolutions begin to rapidly dwindle.
Idk man, I do like seeing multiple windows at once. Browser, terminal, ...
My only counterpoint to your resolution argument is that 1440p is where I’m happy, because of two words: real estate. Also 120Hz for sure. Above that, meh.
I edit video for a tech startup. High high high volume. I need 2-3 27"+ 1440p screens to really feel like I’ve got the desktop layout I need. I’m running an NLE (which ideally has 2 monitors on its own, but I can live with 1), Slack, several browser windows with HubSpot and Trello et al., system monitoring, maybe a DAW or Audacity, several drive/file windows open, a text editor for note taking, a PDF/email window with notes for an edit, terminal, the list goes on.
At home I can’t live without my 3440x1440 ultrawide + 1440p second monitor for gaming and Discord + whatever else I’ve got going. It’s ridiculous, but one monitor, especially 1080p, is so confining. I had this wonderful 900p Gateway I held on to until about 2 years ago. It was basically a TV screen, which was nice, but it became unnecessary once I got yet another free 1080p IPS monitor from someone doing spring cleaning. I couldn’t go back. It was so cramped!
This is a bit extreme, but our livestream computer runs 3 monitors, plus technically a 4th: a 70” TV we use for multiview out of OBS.
I need space lol