
Comment by pjc50

1 day ago

Operating systems of that era were designed based on UX research to help people use the unfamiliar operating system.

Subsequent ones were designed by UI designers, and opinionated senior managers, who already knew how to use them, and took out usability features to make them "look nicer". This sort of worked when the opinionated manager was Steve Jobs. Most managers are not Steve Jobs.

> in some applications they seem to have taken extra steps to make it difficult to find the line to grab

Pet peeve of mine in Windows, where the grab line is at most one pixel wide now. They also took away the coloured distinction between the active window's title bar and the rest, so you don't know where keystrokes are going to go.

> Operating systems of that era were designed based on UX research

Too many developers nowadays don't know this. On any HN discussion of UIs, I've been noticing a growing number of younger devs insisting that usability is entirely subjective (their words, not mine). It's not just that they don't know about cleverly thought-out things such as safe triangles in nested menus or all the affordances/signifiers espoused by Don Norman et al. The bigger problem is that they don't know what they don't know, and they come across as being unwilling to learn.

It does make UX discussions frustrating and meaningless when they could, and should, be interesting and a learning experience for us all.
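For readers who haven't met the "safe triangle" trick: when the pointer leaves a parent menu item heading toward its submenu, the submenu stays open as long as the cursor remains inside the triangle spanned by the point of departure and the submenu's near corners. A minimal sketch of the hit test (function names and coordinates are illustrative, not from any real toolkit):

```python
def _cross(o, a, b):
    """2D cross product of vectors OA and OB (sign gives side of line)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_safe_triangle(cursor, anchor, submenu_top, submenu_bottom):
    """Return True if the cursor is inside the triangle formed by the point
    where the pointer left the parent item (anchor) and the near corners of
    the submenu.  While True, the submenu stays open even though the
    pointer is no longer over the parent menu item."""
    pts = [anchor, submenu_top, submenu_bottom]
    # The cursor is inside iff it is on the same side of all three edges.
    signs = [_cross(pts[i], pts[(i + 1) % 3], cursor) for i in range(3)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)
```

Without this check (or an equivalent grace timer), any diagonal shortcut across a neighbouring item dismisses the submenu, which is exactly the "menu-rage" behaviour complained about further down the thread.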

  • > Too many developers nowadays don't know this.

    Guess they've never been on the phone with an elderly relative in tears because she can't figure out basic tasks on an iPad anymore after years of learning how.

    That's when you realize you, as a highly-skilled technical person, can't either, because they've moved, hidden, or otherwise obfuscated them.

    Yesterday I learned there are two icons in the Files app called "..."

    Yes, two.

    Incidentally I was looking for how to delete a file, which is now deliberately missing from the object's context menu, and intentionally hidden under one of these.

    • A few weeks ago I was co-hosting a live coding session (in front of a crowd, it was pretty collaborative, back-and-forth).

      I had to authorize something with Firebase, for which I had to auth with Google, for which I had to do a MFA with my (Pixel) phone.

      Usually it's "are you trying to auth" and finger-to-the-scanner, but around that time this particular way didn't work. It also didn't want to send me a text or a call to auth me.

      No, I had to find an OTP code. Easy, right? Wrong. The instructions, and the docs, don't match where it was in that particular version of Android, and there were a bunch of blind alleys that were named basically the same.

      It took me like 10 minutes, on stage, browsing my phone (thankfully, not cast to the screen) to find the friggin' option. Thankfully the cohost was doing the presenting at that time, but it was pretty lousy.

      And this is using Google's OS on a Google phone doing a Google auth flow for a Google property. And I'm a techie who's been using Android for 15+ years now. And I did the exact same dance a few weeks before that - also so roundabout I had no idea how I stumbled on the correct page.

      User experience my ass.

      PS. The regular "are you trying to sign in?" flow works again. No idea what happened - wasn't me.

    • Even with screen sharing, I've said "click the three dots" and then "no, not those, not that one, wait there's another one, no that's the wrong one ..."


  • > safe triangles in nested menus

    I did not know about this, but I did notice my own menu-rage every time a submenu disappears!

    • I was trying to use Orca Slicer (which itself is intractable) and it had a combo button whose menu was disconnected from the button. The menu would disappear as soon as the cursor left the button boundary, but because it was disconnected, there was no way to get to the menu without leaving the button boundary, traveling a void, and then getting to the menu. I’m unsure what incantation allowed me to finally choose the right command, but forget how it looks: it was as if no one even tried to see whether it works.


  • There is still UX research. It's just that the collective "we" has changed and we can build on some existing design decisions.

    You are always designing something with a target audience in mind, and the next device, e.g. a mobile phone, will very likely be used by someone who has interacted briefly with a similar device, so you may re-use some already-learnt patterns.

    The very early UXs built heavily on desktop metaphors (like folders), but at this point many (and an increasing number of) people are more familiar with OS UI n-1 than with a typical office setting.

    So I don't think jumping to this conclusion is correct - there is well-designed software; it has just become much, much cheaper to create new software, so the average quality has necessarily gone down.

> This sort of worked when the opinionated manager was Steve Jobs.

Steve indirectly had a hand in this, by emphasizing the humanities. That, unfortunately, backfired as a sort of positive feedback loop.

Someone hired a few underemployed artists onto the team, and the artists invited all their friends and soon took over the department.

People that in an alternate timeline would be smoking weed whilst sculpting wood in a derelict loft somewhere are now the lead designers, using our software as the canvas of a perpetual avant-garde art piece.

They also need to look productive to justify their jobs, so the need to change things is constant.

That's why in 2026 you could have a PhD in CS and still need to watch a YouTube video to learn how to change the volume.

Can anyone name a single substantive UI improvement in the last 20 years? They're simply hiding or moving stuff around at this point while no one has even touched accessibility.

  • You are so very spot on with this. All of it. Literally nothing is better in the UI world in the past 20 years. Zero. We already had multitouch scrolling on laptops back then.

    I don’t think it’s a stretch to say that most of the problem can be traced back to the transition to Mobile first design. The motivations were arguably pretty innocent in general. If there were no downsides, it’s nice that there isn’t a separate code base and an entirely separate set of capabilities for desktop and little 5-inch phone screens. However, the way that we have achieved that - nearly across the board - is by lobotomizing the experience everywhere.

    And because of fashion (those artists who control the UX can’t resist it), even in places where that doesn’t even make any sense because there is no mobile version (say, B2B SaaS products that only get used on a desktop), they still feel the need to cosplay as a mobile app by using all the same stupid design elements (the ••• and “hamburger” menus, the giant grids of “tiles” that should have been a table, etc.).

    • > And because of fashion...

      That's basically the curse. Fucking fashion. If that human concept didn't exist, UIs today would be way, way better. But no, we have to keep changing it forever, and with each iteration it gets worse and worse. UI enshittification at its pinnacle.

  • > a single substantive UI improvement in the last 20 years?

    On the desktop? No.

    In human-computer interaction? The multitouch UI using a capacitive touchscreen, as used in the iPhone (2007, so 19 years ago) and iPad (2010).

    This redefined how UIs work, so yeah, it's vastly significant.

    The trouble is that now there's a whole generation of developers and designers who literally grew up with it and its imitations, and they're trying to apply its "simplicity" to desktop WIMP GUIs. In the process they are removing things like, you know, the "M" of WIMP (whether it's "mouse" or "menu") because they don't see it as important.

  • > Can anyone name a single substantive UI improvement in the last 20 years?

    That thing Windows has where you can drag a window to the top of the desktop and it pops up a few quick options for resizing. I would love it if KDE Plasma had this.

  • idk, i think you're underestimating the ubiquity and resources behind stuff like A/B and usability testing nowadays. Certainly a much more sophisticated way of determining whether people are able to find what they need.
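For what it's worth, the quantitative core of an A/B findability test is just the textbook two-proportion z-test: compare the task-completion rates of two UI variants and ask whether the difference could be chance. A sketch (function name and numbers are made up for illustration; real pipelines add guardrails this omits):

```python
from math import erf, sqrt

def ab_z_test(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: did variant B's task-completion rate
    differ from variant A's?  Returns (z, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled rate under the null hypothesis that the variants are equal.
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. 120/200 users found the control in variant A vs 150/200 in B
z, p = ab_z_test(120, 200, 150, 200)
```

Note that this measures whether users in the test cohort did find something, not whether the design is learnable, which is why it complements rather than replaces the observational studies described elsewhere in the thread.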

> Operating systems of that era were designed based on UX research to help people use the unfamiliar operating system.

I have a lot of thoughts on things like PC usability today. You're right that UX research would have heavily contributed to the design of these older systems. As computers moved from the warehouse to the living room they had to be easier to use and understand for people without CS degrees. I think it is fair to assume *some* things about what people these days are familiar with when it comes to the desktop GUI, but usability should receive more focus now even if it slightly hinders aesthetics. A friend of mine has been teaching a college program for video editing and she has students who needed her to explain what files and folders are. This is not the first time I've heard of things like this.

Smartphones and tablets have obfuscated so many basic functions and features that it is actively harming people's understanding of how to use a computer. Things like window sizing, executables, how apps know where things are, and how programs are installed. Android does allow users to peek behind the curtain more than iOS but Google has been going down the path of locking down Android. I haven't been in an elementary school classroom for like 17 years but I remember having computer lab time where we would learn how to use Windows 95/98. I think what has benefited my friends and others my age (~30) is that we grew up when computers were in the home and were usable enough for us to log in and intuit our way around but there was enough friction that made it so we would have to figure things out on our own.

If you haven't tried it already, I've found it useful to get Windows to use the accent colour in the title bar and window borders: https://support.microsoft.com/en-us/windows/personalize-your...

  • Thank you for posting this. I have enabled them at home and work. I am really tired of having to look for a window's shadow to resize it. Which is problematic when you run a black desktop.

    Having to resize a window by grabbing it just outside the visible border is so wrong.

  • Absolutely, but there are many programs that don't use that accent color, making it less useful than it should be.

"Subsequent ones were designed by UI designers, and opinionated senior managers, who already knew how to use them, and took out usability features to make them "look nicer"."

With desktop OS I feel a lot of designers don't know how to use them. They grew up with phones and never use a desktop OS outside of work.

Chesterton's fence! Don't delete something unless you know why it's there in the first place.

> based on UX research to help people use the unfamiliar operating system.

It's worthwhile to note that this was not just research in a vacuum, but a lot of user studies where they literally watched and studied people using the software and how they were confused, found or didn't find functionality, etc. Lots of interviews, talking to people, boiling things down to how actual people struggled with the software.

One example of a UI being the result of research seems to be Windows 98. Much of the surface is gray and a lot of the text is black. It might look boring, but that is how you get to use a little colour for the things that need accent, and it makes a difference. In a factory, likewise, the walls are gray but the fire extinguisher is red, so you can't miss it.

> opinionated senior managers, who already knew how to use them

The latest design of interfaces is designed by people who have barely used a desktop computer and have no idea of the conventions or advanced usage. They create terrible UIs because they have no idea what a good UI is and they often don't even use the product they create.

‘Took out usability features to make them "look nicer"’ is exactly how Steve Jobs gave us the double-click, undiscoverable and timing-sensitive.

  • Double-click came out of Xerox PARC. Apple might have been the first to put it into a popular desktop PC solution, but it wasn't their design any more than the rest of the system they copied. There are arguments that a second mouse button was a much better idea, but that would still not be immediately discoverable, and even with many buttons in modern solutions we _still_ have double-clicking.

    • Minor correction - Xerox knew they could not commercialise their invention so they wanted someone to take it off their hands. So Apple didn't copy - they paid for it (in stock, not cash) - and if you've ever used a Smalltalk environment you'll know that what Apple actually shipped (in the Lisa and then the Mac) is a _lot_ of work done over the top of what Xerox had.

    • It may have been Tajo (XDE) though a quick search doesn't find any documentation of multi-click _older_ than Apple's.

      Star definitely didn't have multi-click.

    • I've always heard it came from Xerox PARC, but even if it originated at Apple, it would have been one of their OS devs. It's nuts how the cult of Steve Jobs leads some to label him as the inventor of everything.

  • And something my older relatives have trouble with to this day, no matter how much I adjust their double-click timing settings...
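The "timing-sensitive" part is concrete: whether two presses count as one double-click or two single clicks depends entirely on an invisible inter-click threshold (500 ms by default on Windows, adjustable per user). A toy classifier, purely illustrative, shows why slower clickers get misclassified:

```python
DOUBLE_CLICK_MS = 500  # Windows' default double-click window

def classify_clicks(timestamps_ms):
    """Group a sequence of click timestamps (ms) into 'single' and
    'double' events: a click arriving within the threshold of the
    previous unpaired click pairs with it into a double-click."""
    events = []
    pending = None  # timestamp of an unpaired click, if any
    for t in timestamps_ms:
        if pending is not None and t - pending <= DOUBLE_CLICK_MS:
            events.append("double")
            pending = None
        else:
            if pending is not None:
                events.append("single")
            pending = t
    if pending is not None:
        events.append("single")
    return events
```

Two clicks 800 ms apart land as two singles, often opening something twice or triggering a rename instead. Raising the threshold helps a user with slow clicks but makes deliberate click-pause-click harder, so no single setting fixes it for everyone.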

My pet peeve is spacing. My usual resolution is 1920x1080 (scaled or not) and it feels like I could cram more information into an old 1024x768 desktop. You have to maximize most windows to get them to show enough information.

    This drives me crazy. Even looking at these old screenshots you just know that these systems were outputting a display resolution lower than 1024x768.

      When I was checking out the MacBook Neo a while back I was disappointed that the resolution is not natively x2 scaled. It uses fractional scaling, which macOS handles quite poorly. I've set the resolution on my M1 MBP to 1280x800 so it was x2 scaled, and clarity improved significantly. But I also sacrificed usable space, because apps don't adjust; everything is just made larger.

  • >> You have to maximize most windows to get it to show enough information.

    At work I use 1 or 2 monitors plus the laptop screen (on Windows). At home I just use a single 55" 4K TV for my monitor and place apps center, left, right, and up top for rarely used stuff (on Linux). The desktop metaphor always wanted a big display but you're right - most Windows apps expect a full 1920x1080 for themselves.

  • Same here. The Teams meeting page layout pisses me off on a regular basis, with way too much useless space around everything, tons of unhideable icons and crap filling half the screen, and all the actual content crammed into a little box. I'm sitting here with a 4K 27" monitor and all that space and resolution is just wasted. Yeah you can work around it, but what a PITA.

For the brief time I used Windows 11, the number of times I placed a window over another and then clicked on the wrong window, because I couldn't tell at first glance where one started and the other ended, was absolutely ridiculous.

I'm afraid that the core of the problem is something far more simple and fundamental.

The people designing desktop apps today simply never learned the conventions that make desktop applications good. They grew up with smartphone apps, web apps, electron apps, games, etc.

In fact, you can observe from things like JavaFX, Flutter, WPF, etc., that the trend has long been toward the ability to easily create custom widgets like you could with JavaScript (or Flash), rather than the convenience of having a library of widgets that look and feel exactly the same as every other widget in the system.

  • "I couldn't tell at first glance where one started and the other ended "

    Sometimes I'm starting to feel like my dad did many years ago, when I tried to teach him how to use Windows. He simply couldn't see the window borders. With the latest designs I am reaching this point too. I struggle to move and resize windows because I can't tell where the border is.

  • > couldn't tell at first glance where one started and the other ended

    This was even worse in an RDP session. No drop shadows. I'm not sure who thought "everything should be flat and white" was a good idea.

  • > look and feel exactly the same as every other widget in the system

    Which is what? Windows natively has like 4 official looks. You can click around the 2 (!) settings programs and pop open windows from basically every framework Windows has created (and deprecated) in the last 2 decades.