Comment by lp4vn
2 years ago
My 80-year-old grandmother uses Ubuntu to browse the web and play casual games.
When my aunt bought her a new computer with Windows, she couldn't use it properly and complained that she wanted a computer "like the old one". She was equally unable to use macOS on my laptop. Nowadays, when people complain about Linux's usability, I know they are usually overstating things or speaking from prejudice, because from a practical point of view the usability of any modern distro is in no way significantly behind that of Windows or macOS; I would even say it's the opposite.
There is a huge blind spot, I've noticed, where people mistake familiarity for ease of use (and other qualities too). Of course familiar things are easy and comfortable, but this thought doesn't seem to occur to most people.
Software usability discussions are particularly prone to this bias.
EDIT: another fun one is internet discussions about metric vs imperial systems, with one side or the other swearing that one is inherently more "intuitive" for a particular use. Due to some extraordinary coincidence it's always the one the writer grew up with...
Yep. This was on clear display when Linus Tech Tips did their Linux challenge a year or two ago. A lot of their complaints were just like "the wallpaper settings button isn't in the same place as in Windows", with the base assumption that there was something inherently correct about Windows' choice.
And there's the belief that "tech skills" = "knowing where the Windows buttons are". OK, that is true in a limited sense, in that "x competency" = "familiarity with x", but the point is you can be a very skilled sysadmin or programmer who still makes "noob" mistakes trying to configure Windows just because you're more familiar with macOS or Linux.
This also extends beyond UI to platform concepts in general. I'll take Unix- or DOS-like hierarchical filesystems as an example. Even here I've seen people equate knowledge of a hierarchical FS structure with inherent technical ability, when discussing those "teens can't use computers" articles. It's certainly correlated, since all major operating systems do use them, but it's still a mistake to think that there's something inherently correct about choosing that storage model over, say, how a mainframe or Multics did things.
> And there's the belief that "tech skills" = "knowing where the Windows buttons are". OK, that is true in a limited sense, in that "x competency" = "familiarity with x", but the point is you can be a very skilled sysadmin or programmer who still makes "noob" mistakes trying to configure Windows just because you're more familiar with macOS or Linux.
To be fair, the Linux world makes the same mistake, labeling a user as "not tech competent", "needing hand-holding", or "afraid of the command line" just because they don't have the time and patience to put up with the amount of bullshit that (insert whatever distro here) throws at them.
> There is a huge blind spot, I've noticed, where people mistake familiarity for ease of use (and other qualities too).
The "legacy" Windows design isn't just so beloved because of familiarity, but because it actually provides visual cues to users [1], and the backstory on how it was designed is also interesting [2].
I know that there was an even more detailed article floating around here on HN but I can't find it offhand.
[1] https://twitter.com/tuomassalo/status/978717292023500805
[2] https://socket3.wordpress.com/2018/02/03/designing-windows-9...
Yeah but come on, the metric system is obviously superior to the imperial one.
IMHO usability on Linux is good for advanced users who can more or less understand how packages work and can use the command line to some extent, and also for people on the other end who are fine with a "static" system, use a very limited number of apps, have fixed workflows, and don't need to change or install anything themselves.
In between there is a giant pit: hard or impossible to solve cryptic errors (or no errors at all, just silent crashes on launch, unless you try to open the same app in the terminal), confusing and half-baked documentation (because there are dozens of different ways to accomplish the same thing depending on your distro and config, and good luck figuring out which is the right one for you), etc. etc.
A lot of these issues are not really "bugs" but just a natural outcome of the decentralized nature of Linux (non-kernel) development. They can be solved by power users, but not by people who are used to much more user-friendly workflows on macOS (again IMHO) and, to a somewhat lesser degree (but still more so than on Linux), on Windows.
I install Fedora everywhere for the same reasons. As long as the users aren't Windows users who believe they know computers, they will be happy with plain Linux. It updates itself, it upgrades itself, and they're happy not being forced into updates. GNOME can be criticized for missing options, but it features a simple and neat interface, and its keyboard-centric usage makes it a blessing.
I myself have been using Arch for more than 14 years now, and it is a perfect fit for professionals and enthusiasts. But for an average user: Fedora.
What do I mean by Windows users? People who believe they need to install "drivers" themselves. Who argue against Linux anyway because it doesn't support the weird "desktop metaphor" from Windows 95. And who usually argue that weird hardware like 3D shutter glasses or some kind of HDR-something (insert whatever hot new thing here) isn't supported. The broad majority of users don't want that and don't need it. What matters is HiDPI scaling (good, with the exception of the awful thing named Electron) and an easy-to-configure sound system (PipeWire nailed it). And unification, which we achieved through Linux, libc/libstdc++, coreutils, and finally systemd and Flatpak. The point here is the chain of parts building upon each other.
Recommendations: Stay away from Nvidia. Use old ThinkPads if you have no special requirements. Use printers with AirPrint (IPP Everywhere).
Teaching: I would be happy if people started teaching users to read the interface (like a book or an infographic), think, and then act. Input, process, output. TUIs foster that, and I think that is why users accustomed to them love them - and dislike most GUIs and nearly all websites.
What have computer courses been doing for decades? Not teaching users to use the interface. They just drill them to click on a specific icon (once, or twice, or with the wrong mouse button). Just see them happy when they type "Email" and Linux offers Evolution or Thunderbird. And if they can't find it again two months later? They type "Email" again.
> Recommendations: Stay away from Nvidia
Unless you need to use the GPU for actual work and not gaming; then you need CUDA etc.
Nvidia seems fine though, as long as you use the installer they provide (as the other comment suggested); for some reason all(?) distro developers can't come up with functional workflows for installing it any other way.
> People who believe they need to install "drivers" themselves
Do you think that's something people need to do often these days on Windows 10/11?
> support the weird "desktop metaphor"
What's weird is/was GNOME trying to appeal to mobile users for no reason (at least MS had some justification with Windows 8). Of course, there is KDE if you need a normal desktop.
> with the exception of the awful thing named Electron
Perhaps it's awful. But it's something a lot of potential non-power users actually need and care about (unlike libc/libstdc++ or coreutils).
> Use old ThinkPads if you have no special requirements
Like a semi-decent screen? Also, why bring up HiDPI scaling then?
> teaching users to read the interface (like a book or an infographic),
It would be nice if GUIs were at least semi-consistent across most Linux apps (of course Windows is also terrible at this, and Apple is the only one who managed to get it right).
Honestly, I don't see Linux progressing that much as long as mindsets like this (blaming the users for not using their computers in the right way and telling them what they "actually" need) remain widespread.
For Nvidia users, just install the drivers via the runfile provided by Nvidia. Never had an issue this way.
Granted, I install dkms and set up an auto-hook to reinstall the modules on kernel updates, which I wager is a bit much for most.
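For the curious, the hook's action boils down to something like this. A minimal sketch, not my exact setup: the hook wiring itself is distro-specific (e.g. a pacman hook on Arch or a /etc/kernel/postinst.d script on Debian), and it assumes the Nvidia module was registered with DKMS (the runfile's --dkms option) and that dkms is on the PATH:

    #!/usr/bin/env python3
    # Hypothetical post-kernel-update action: rebuild all DKMS-registered
    # modules (e.g. the Nvidia one) for every kernel installed on the
    # system, so the next boot isn't left without a working GPU driver.
    import pathlib
    import subprocess
    import sys

    def rebuild_all_kernels() -> int:
        status = 0
        # Each directory under /lib/modules corresponds to an installed kernel.
        for kernel in sorted(pathlib.Path("/lib/modules").iterdir()):
            # "dkms autoinstall -k VERSION" builds and installs every
            # registered module against that kernel's headers.
            result = subprocess.run(["dkms", "autoinstall", "-k", kernel.name])
            status |= result.returncode
        return status

    if __name__ == "__main__":
        sys.exit(rebuild_all_kernels())

Run as root after a kernel package lands; the point is just that the rebuild happens automatically instead of you having to remember it before the next reboot.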
This is exactly the opposite of my recommendation.
I want users to be able to run upgrades without having to care about "details" like Nvidia's formerly bad or missing Wayland support, modified APIs, or restrictions imposed by EXPORT_SYMBOL_GPL. And a game developer probably wants to ensure compatibility only with Mesa.
A German article about how Nvidia accepted their situation and how the code will be built into Linux: https://www.heise.de/news/Linux-Kernel-Entwickler-druecken-f...
Basically, Linux won. It is not a perfect victory, because a lot of code now goes into firmware, which creates another set of issues, and it will take a long time to get on par with AMD or Intel. Linux and GNU won because they remained stubborn, and consumers and industry supported that. Nobody wants a PlayStation or Steam Deck with closed-source modules; neither does the machine industry or car manufacturers. Yes, benchmarks attract customers in the short term, but in the long term it must stay reliable for years and decades. Imagination Technologies and ARM recently also changed their minds. It is sad that all this could have happened 15 years ago. Maybe people learn?
I've set up several folks with Linux on their laptops over the years, usually after they had ended up with malware or other oddities on their Windows computers multiple times. I would always ask them what they do on their computers; usually it was 99% web. I would then never hear from these people until they wanted to buy a new computer, whereas before I felt like I was looking at some issue every six months or so. All the Linux desktops are pretty nice now, and actually have been for quite a while, if all the person is doing is web stuff. Firefox or Chrome looks the same to the average person no matter what operating system they are using.
The other thing worth mentioning is that, well, the computers will almost always perform better with a Linux distribution than with a bloated Windows install.
> The other thing worth mentioning is that, well, the computers will almost always perform better with a Linux distribution than with a bloated Windows install.
Poor battery life clearly indicates it's the other way around (yes, RAM usage might be worse, but who cares, it's cheap). Also, removing all the bloatware shipped by the OEM on Windows is not that much harder than installing a Linux distro.
I've used Linux primarily since the 90s. I even had a Slackware install back in the day.
When Ubuntu started getting popular I had this same opinion. A lot of my friends would tell me that they gave Linux a try but gave up because of "random things not working", whereas, allegedly, on Windows everything "just works". This didn't sit well with me, because I had the opposite experience with Windows.
Until the last couple of years, unfortunately. It's almost always kernel updates: I had the sound suddenly stop working on a Kubuntu install after a kernel update, and I've had a few cases where the new kernel wouldn't even boot and I had to drop back to the old one.
These types of things are hard to quantify. Maybe there's more random nonsense across all users with Windows than there is with Linux. But kernel updates have started to make me nervous again, and that's a step in the wrong direction.