Comment by olivierestsage
7 days ago
It really has gotten to the point where Linux offers the best option for a sane desktop experience. Watching Windows and macOS implode while KDE and Gnome slowly get better and better has really been something. Not quite at the point I'd recommend them for grandma and grandpa, but not that far off, either.
I've been using a Mac basically full time for years now, due to work. It's easily the worst UX and it's sort of shocking, after decades of hearing "it just works" or whatever. Hidden windows, hidden desktops, obscure keyboard shortcuts, etc.
I actually don't even know how to use the mac for the most part, I've learned to live in the terminal. I contrast this with Linux where I can just... idk, browse files? Where windows don't suddenly "escape" into some other, hidden environment, where I can just use a computer in a very sane way, and if I want keyboard shortcuts they largely align with expectations.
I was extremely frustrated while on a call using a mac. I made the video call full screen, which then placed it onto essentially a "virtual monitor" (ie: completely hidden). I had no way to alt tab back to it, for whatever reason, and I had no way to actually recover the window through any of the usual "window switching" means. I knew there was a totally undiscoverable gesture to see those things, but I was docked so didn't have access to the trackpad.
I figured out if you go to the hidden dock at the bottom and select Chrome, as I recall, you can then get swapped back over to that virtual desktop, "un full screen" the window, and it returns to sanity.
Mac UX seems to go against literally every single guideline I can imagine. Invisible corners, heavy reliance on gestures, asymmetric user experiences (ie: I can press a button to trigger something, but there isn't a way to 'un-trigger' it using the same sequence / reverse sequence / 'shift' sequence), ridiculous failure modes, etc.
I can't believe that people live like this. I think they don't know how bad they've got it. I routinely see mac users avoiding the use of 'full screen', something that I myself have had to learn to avoid on a mac, despite decades of never having given it a second thought.
MacOS definitely has its issues but this just makes it sound like you have different expectations of how an OS should work. Different isn't always bad. Hiding applications is a pretty key concept in MacOS. Shortcuts are pretty straightforward? Cmd+H to hide, Cmd+Q to quit. Spaces aren't hidden; there are lots of ways to access them, but it seems you haven't bothered to learn them. In your example, pressing ctrl+right would have switched to the first full screen space. You could also have right clicked the Chrome icon in the dock for a list of windows.
BTW the dock doesn't have to be hidden, and idk if it was a typo but alt+tab isn't a default shortcut. Command is the key used for system shortcuts, so maybe you should have tried that? Like yeah it's different but that doesn't make it bad. If you've been using it for 10 years without figuring that out…
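For what it's worth, dock auto-hiding is just a checkbox in System Settings, and you can also toggle it from Terminal; a minimal sketch:

    # turn off dock auto-hiding, then restart the Dock so it takes effect
    defaults write com.apple.dock autohide -bool false
    killall Dock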
---
I’m with you on the 1st party apps though, and the stupid corners on Tahoe.
I call it "alt tab" because that's how my brain maps the keyboard. The reality is simple - I struggled going from Windows to Ubuntu about 20 years ago but ultimately made it to the other side knowing how to use both well. With macs, I didn't. 10 years later and all of my adaptations are to avoid the operating system. In 10 years the main thing I've learned is how to get myself out of a jam and stick to the parts of the OS that don't feel like shit. I mean, it's not like I haven't learned these things, I know how to gesture, I know how to exit full screen, etc, it's not like I didn't ever learn, I'm explaining that the experience was dog shit.
Anyone is free to claim that I just didn't try, or didn't give it a fair shake, or perhaps I'm just some idiot who doesn't know computers or whatever.
Maybe I just think an OS should work differently, but okay? I've never said that I have some sort of access to a platonic ideal of objective operating systems and that macs don't meet it. I'm saying that I think it's bad and I gave examples of why. And I think I can easily appeal to my experiences seeing others use the OS - I don't think they find anything you're talking about appealing either.
> Hiding applications is a pretty key concept in MacOS. Shortcuts are pretty straightforward? Cmd+H to hide, Cmd+Q to quit. Spaces aren't hidden; there are lots of ways to access them, but it seems you haven't bothered to learn them.
They're not talking about Cmd+H hiding or virtual desktops - those exist on Windows too. The issue is how macOS handles window placement with zero visual feedback.
For example, when you open a new window from a fullscreen app, it just silently appears on another space. No indicator, no notification. You're left guessing whether it even opened and where it went. The placement depends on arcane rules about space layout, fullscreen ordering, and external displays - and it's basically random half the time. You either memorize the exact behavior or manually search through all your spaces.
Years ago, they changed the behavior of the green button to be "fullscreen into a separate space." As someone who never uses spaces, this is never what I want.
You can escape it by moving your cursor to the top edge of the screen and clicking the green button on the titlebar that appears to exit fullscreen.
> Years ago, they changed the behavior of the green button to be "fullscreen into a separate space."
Not quite. It has the old behavior (grow to as large a window as supported) if the app does not support full-screen. For instance, the Settings app cannot grow wider, so it grows to full screen height.
The icon that appears when you hover over the green button reflects whether it is full screen or zoom behavior. If you hold option, you will always get zoom behavior IIRC. However, because the green button has been overridden to be a menu in Tahoe, the button icon may or may not reflect zoom/full screen behavior when you press/release option, and may instead show the option modifier on the items in the pop-up menu.
I do not believe there is a way to disable full screen behavior completely, nor spaces. However, I don't think I'd be able to survive working on a Mac without both so I haven't done a lot of investigation there.
In this case, because I had docked my laptop, the entire window moved to a virtual desktop that didn't actually map to a real desktop. Meaning that the video call continued in a virtual desktop that I literally could not see, that I could not mouse over. I don't know if that's just a multiple-monitor bug or whatever but the behavior is stupid even without that failure mode.
Here, return to sane behavior: https://blazingtools.com/right_zoom_mac.html
You're making multiple desktops sound very confusing when it's really not. Every desktop OS has them and macOS' implementation is quite good. You want bad virtual desktops, try Windows.
Maybe you're better suited for an iPad.
I've used multiple desktops before. I love virtual desktops. They really shouldn't be confusing. It's a testament to the bad UX of macs that they are.
The fact that a full screen window creates a whole new virtual desktop is hilarious and I dare you to justify it.
Appeals to "Windows is bad" or whatever mean nothing to me. Stupid comments like "get good" mean nothing to me.
Love Linux, been using Manjaro with Gnome for the last 10 years, but I need to use a Mac at my current job, so I tried to approach this constructively and work around the rough edges:
* Rectangle Pro for window management
* Better Display for a better picture on a non-4k display
* a couple more similar tools
* retraining muscle memory from Ctrl to Cmd and Emacs-y instead of Windows-y shortcuts
Feels okay now. Plus native ms365 apps, smooth sleep mode, great hardware and great battery life -- mac has its sweet spots as well.
> I figured out
Or you could maybe learn how to use the OS; in Linux lingo, RTFM. I don't want to be rude, but the critique was very flippant, the arguments vague, all about expectations based on years using a different OS; doesn't seem like you want to give it a fair chance.
This is pretty funny.
> the arguments vague
I gave both generalized and highly specific cases where I felt the UX failed. I referenced principles of UX as well as literal "here is what my experience was in a concrete story".
> all about expectations based on years using a different OS
No? I mean, again, funny. I explained how I've been using MacOS for years. Actually a decade, now that I count it out.
> doesn't seem like you want to give it a fair chance.
a decade lol
And if you bring up these points to an Apple fanboy, they'll tell you that "you just don't get it" or "forget all the 'bad Windows habits' and just learn the Apple way of things. It's soooo intuitive!!".
> "forget all the 'bad Windows habits' and just learn the Apple way of things
I mean I'd be willing to say I don't get it, because I sure as fuck do not get it. But I think I'd absolutely reject the "forget all the other stuff, learn this". It's been literally years on a Mac. I remember the frustration of going from Windows to Linux, I look back at that adjustment and laugh, it's hilarious to me that that felt frustrating when I contrast to my Mac adjustment. At least the Linux adjustment was tractable, the Mac adjustment is a total joke.
I actually suspect that people don't "adjust" in the sense of learning how to do things with a mac, but instead adjust to not doing things with a mac. Many mac users I know outright say they just don't use full screen mode because it's confusing.
Personally replaced Windows 10 with Linux Mint on my very computer-illiterate mother-in-law's laptop a few months back. Haven't heard any complaints so far.
Linux is ready for prime time for anyone not bound to Windows/MacOS software.
Personally, I'm still on MacOS for work, but all my personal devices run some form of Linux. It's been liberating to say the least.
I set up windows 11 on a laptop for my dad so he can read emails and browse the web. Came back 3 months later when he told me he couldn't see the PDF files anymore. Turns out he had installed THREE different PDF viewers that he randomly found on Google; they installed tons of bloatware/spyware, replaced browser toolbars and searches, etc., to the point where I decided to just restore from a recovery point. Told him not to download weird stuff (again) and to ask me when he needs help.
At that point I questioned myself: I really should have installed linux for him.
> replaced browser toolbars
This is still a thing? Browsers still have toolbars???
My go-to for family is giving them no install rights, and adding a remote desktop app for me to connect to them when they need something installed.
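On the Windows side, the "no install rights" part is just making their account a standard user; a sketch from an elevated prompt (the account name here is hypothetical):

    rem remove the account from the local Administrators group
    net localgroup Administrators grandma /delete

After that, anything needing elevation prompts for an admin password instead of silently installing.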
I don't get called very often anymore, and when I do, it's for their work computer or something, to which I say, talk to your IT department, I can't fix that.
ChromeOS is a really great option for "just want to read emails and browse the web".
Browsers today can view PDFs and do limited editing. No need for a dedicated reader. You do need a dedicated authoring tool if you want to create PDFs from scratch. Most OSes support print-to-PDF as well if you only need conversion.
My daughter did this for her boyfriend's grandma, except she used Kinoite. The immutable aspect of it makes it very difficult to break.
She was over there recently and the downloads folder was littered with malware .exe files, so the grandma is trying her hardest to break it.
uBlock Origin will fix most of that problem.
> Linux is ready for prime time for anyone not bound to Windows/MacOS software.
I suspect in order for this to be true we'd need a PR campaign that can shift culture on the scale of civil rights.
I'm not trying to be hyperbolic or deride Linux or anything—I agree that technologically it's probably ready. Overall UX I'm slightly skeptical. But the far bigger problem is culture.
There's already been a shift away from "PCs" among younger people. The majority of my kids friends have never touched a "regular computer." I've heard an unsettling number of reports of new hires who have never heard of a spreadsheet.
I'm bringing this up because if kids aren't using PCs as much in the first place and quite literally don't know what an operating system is (and please challenge this assumption; I'm going off of anecdata) it's going to be even harder to try to create cultural awareness and acceptance of linux.
But even disregarding that, there would need to be a massive, massive coordinated campaign to create a real culture shift. I'm talking Super Bowl ads.
Again, not trying to be pessimistic, I'm trying to say that "ready for prime time" at this point has little to do with engineering or even design and far more to do with PR. Once I started launching my own products I quickly discovered (as everyone does) that making the thing is like 5% of the job and the remaining 95% is marketing.
The frustrating thing is that developers are some of the most reluctant to change. I'm sick of fighting docker on my Mac among the many other problems. But if we can't break away nobody else is going to either.
I mean yeah, Chrome and Firefox both run on Linux. And that covers 99% of what most "normies" need.
It's funny when people say Linux is difficult for their grandparents or siblings, when that's the use case it covers best. And it keeps them from calling you about random adware/spyware/viruses they accidentally installed.
It's prosumers and professionals that have more issues with Linux, because they tend to rely on proprietary software that's problematic to install/use.
Before she passed, I had one of my grandmothers on Ubuntu for about a decade... I had to set it up for her, and I ran updates every few months for her, but she really didn't have an issue... Her Windows 9x-era games even ran correctly under Wine when they wouldn't load on Windows (7, I think).
Email, browser and a few games... she was pretty happy with it.
I was so close to getting my parents to switch to Ubuntu in the late 2000s. It stuck until my dad needed some piece of software on the home PC for work that only worked with Windows. Today, they have iPhones and they think it will be more convenient to have a Mac to "sync things". Oh well...
> Today, they have iPhones and they think it will be more convenient to have a Mac to "sync things". Oh well...
And for a very long time they would have been right. But it seems that all the commercial desktop OSes are in the maximize-money-extraction phase now.
Gnome Shell in particular offers ridiculously coherent, sane window management. Nobody agrees with all the choices the Gnome team made to get here, but it sure is nice having one way of doing everything that makes sense contextually.
I don't even know if Gnome and Gnome Shell are the same thing. One thing I do know is the default install of Gnome on Debian 13 leaves you without a dock, without a system tray, and without minimize/maximize buttons. They purposely remove the three most important tools the average user relies on for navigation.
It's like trying to make a car without any round edges because "square edges are better". Good luck with the wheels!
I can fix that somewhat with extensions, but every normal person I know will take one look at the defaults and abandon it. That's a reasonable choice in my opinion. Why use something where the first interaction gives you a clear indication you're going to be fighting against developer ideology?
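To be fair, the minimize/maximize buttons at least can come back without an extension; a sketch, assuming stock Gnome with gsettings available:

    # restore minimize and maximize buttons on window titlebars
    gsettings set org.gnome.desktop.wm.preferences button-layout 'appmenu:minimize,maximize,close'

The dock and system tray, though, still need extensions (Dash to Dock, some AppIndicator extension), which rather proves the point.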
I agree.
If you want to customize your DE a lot - Gnome isn't for you.
If you just want a clean and productive environment by default... Gnome is great.
Once you stop fighting it, sigh, and go with the flow... modern Gnome is genuinely pleasant in that I spend almost zero time thinking about it, and shit just works.
I still run other DEs for some specific purposes where "general use" isn't the goal, but I can reliably hand non-technical family members a machine with Gnome and they don't have to come ask me a bunch of questions.
My problem with GNOME (after having used it as my main desktop on my Linux systems for many years) is that it removes some really useful features and they are not just expert features, but also features that non-technical users are used to, such as system tray icons and menu bars. You can bring them back with GNOME Extensions, but for instance, the system tray icon extensions are very buggy.
KDE on the other hand just has these and is also great out-of-the-box (I pretty much run stock KDE).
Even gnome tries to be too modern imo. KDE is perfect. I used to feel like KDE was too much like a toy. Now by comparison it looks utilitarian.
I've been using KDE for a decade and I completely agree. It used to be only better than GNOME because I could remove features from it and now I run completely stock KDE and it's solid compared to anything else.
I bought an SBC that booted into Gnome on the official disk image, and it didn't recognize my mouse. It was entirely unusable. In applications that were part of Gnome itself, like the settings menu, it was impossible to navigate using tab and arrow keys.
> settings menu, it was impossible to navigate using tab and arrow keys.
Huh? All you need is tab and the arrow keys to navigate the GNOME Settings app. I'm literally doing that right now. Maybe it was a later addition but it works perfectly fine in GNOME 49.
I think you can absolutely set up a Linux box for grandma / grandpa.
Anyone who lives in the browser really. My mom and my kids all are on Ubuntu these days.
Anyone who lived in a browser was fine a decade ago.
At this point... it's basically anyone who doesn't want to play competitive multiplayer games with poorly implemented anti-cheat, or who doesn't have niche legacy hardware (e.g. inverters, CNCs, oscopes, etc.).
Steam tackling the gaming side of things has basically unlocked the entire Windows consumer software ecosystem for Linux. It's incredibly easy to spin up Windows-only applications using nothing but GUI tools on most distros at this point.
Crazy how much better a system with a modern linux kernel and Gnome or KDE is than Windows 11. I'm at the point where I also prefer it to macOS... which is funny since I think Gnome was basically playing "copy apple" for a bit there 5 years ago, but now has really just become the simpler, easier to use DE.
In the past few years, I’ve started to develop a form of “upgrade dread” when it comes to OS upgrades. What are they going to enshittify now? What are they going to drop support for now?
This somehow excluded Linux and its DEs, and I eagerly read any news, changelogs, and announcements in this space. They’re still not perfect in every aspect, but at least I see things improving instead of public turf wars between departments trying to improve their KPIs.
Why is there an extra URL handler for MS Edge that bypasses the default browser config? Why is the search bar so wide in the default taskbar config instead of a simple button? Why are local searches always sent to Bing, with no easy way to switch that off or change the search provider?
> I’ve started to develop a form of “upgrade dread” when it comes to OS upgrades.
I've been going the other way on Linux.
I used to think it might be wise to postpone updates if you were traveling, especially using a rolling distro. Today, I would be quite confident running the updates 10 minutes before leaving.
Granted, this is also because I'm more confident than ever that I could fix most breakages, and worst case the smartphone is there, but I've also not seen big breakages for years.
I have a somewhat opposite experience. I also use a rolling distro, and in the past six months, I've seen wine break, and I've also seen Citrix Workspace break due to a dependency problem (perhaps Mesa?). Granted, these two cases are somewhat unusual because Citrix Workspace is closed source and the software I'm running with wine is also closed source. I rarely experience breakages of open source software other than GNOME extensions.
Yep. I run NixOS unstable-small on my ThinkPad and there is rarely breakage in daily updates. If it ever happens while on the go, I can just boot into a previous generation. The immutable OSTree/bootc distros are similar, as well as openSUSE, which uses btrfs snapshots on updates.
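Recovery is pleasantly boring; a sketch, assuming a standard NixOS setup:

    # roll back to the previous system generation
    # (or just pick an older generation from the bootloader menu)
    sudo nixos-rebuild switch --rollback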
In all fairness I wouldn't recommend macOS to my grandparent either.
Given that a lot of things happen in the browser, I think it wouldn't be too crazy. There are even distros that look like Windows if you're after that. What part of it do you think isn't ready for this scenario? (honestly curious)
I wouldn't know what to recommend for "just works" photo syncing from the phone à la iCloud.
There are no good options for grandma these days. I've been helping my 85-yr-old mother with her computer stuff (she has an iMac) and there's so much user-hostile, broken stuff--not just on the Mac itself, but in many of the internet-based services she has to use--it makes you want to take a baseball bat to the whole affair.
I set up Elementary OS for my 79 yr old mother. No issues.
Similar experience here: I set up Debian stable for my 76 yo mother, and for a 79 yo friend. Works like a charm, and the 2-year release schedule is perfect for people who don't care about bleeding edge and would rather have stability.
Unattended security upgrades keep it secure, and in my experience a bit of initial "locking things down and simplifying" is valuable, but after that it's smooth sailing compared to the other older folks I help with Windows systems, where MS is constantly throwing insane bugs, complete UX changes, ads, or Copilot everywhere at them.
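On Debian, the unattended part is two commands; a sketch, using the stock packages and the default (security-only) configuration:

    sudo apt install unattended-upgrades
    sudo dpkg-reconfigure -plow unattended-upgrades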
If you've never tried Haiku, you're missing out on a remarkable desktop experience.
If you want to compare on the basis of micro-issues like this one, then note that KDE Plasma has exactly the same issue with the resizing area of rounded-corner windows as the one pointed out by TFA.
On the other hand, it does have Alt+right click & drag as a mechanism that doesn't require any manual dexterity to hit arbitrary edges.
Oh yes. Alt+right click + drag. How intuitive. (not)
This has to be sarcasm. Either that or you have never used KDE or Gnome even once in your life. No DE for Linux is anywhere near as polished as the DE in Mac OS. You have to spend hours customizing KDE or XFCE to get them to function even halfway near what an average user would expect. Gnome is okay but so bloated, and even more opinionated than MacOS or Windows.
This is definitely not the case, and I invite anyone reading this comment to install a Linux distribution themselves in a VM or something to find out via direct experience. Fedora is a good place to start in my opinion.
Window management mostly works fine, but app design is years behind.
KDE Dolphin has a static toolbar like Finder, with its config menu being two lists like some Microsoft toolbars, and the available items list is sorted alphabetically.
The flat view switcher is multiple separate items, named directly after their corresponding view type, one called list, another called icons and so on.
So if you want a Finder style view switcher, you first need to know it exists beforehand because the naming is confusing, then you need to know how many views are available beforehand because they're separate items, and finally you need to hunt them down because the list is alphabetical.
This is pretty much the quality you can expect when using KDE software.
Another example is breadcrumbs: the current folder doesn't have an arrow, so you can't browse deeper with it without perhaps expanding folders, unlike on Windows 7. Sidebar favourites also replace the top folder, so if you browse the home folder with it you'll often find yourself suddenly unable to use it.
The main problem is that Apple wants to be opinionated. Linux is the polar opposite of that. People used to say the latter is bad, but it turns out the former is way worse (many hackers of course already knew this).
> Not quite at the point I'd recommend them for grandma and grandpa, but not that far off, either.
But at this point grandma and grandpa are the only ones I'd recommend to use Apple devices.
Opinionated design was great back when Apple's Human Interface Guidelines were based on concrete user testing and accessibility principles. The farther we get from the Steve Jobs era, the more their UI design is based on whatever they think looks pretty, with usability concerns taking a back seat.
It was good because it was both Opinionated (in other words, the path to write software that follows the design was easy, and the paths to write software that violated the design were hard), and also well-researched by human interface experts.
Now what we appear to have is "someone's opinion" design. A bunch of artists decided their portfolios were a little light and they needed to get their paintbrushes out to do something. I don't work at Apple, but my guess is that their HI area slowly morphed from actual HCI experts into an art department, yet retained their power as experts in machine interaction.
So here we are, we still have Opinionated design, but it might just be based on some VP's vibes rather than research.
And ironically, it has also gotten far less pretty. Mac OS X 10.4 Tiger was beautiful. Tahoe is flat and generic looking.
Opinionation (heh, opinionatedness?)'s value is entirely different depending on the user category.
Hackers by and large don't want opinionated, because they're willing to spend the time configuring & customizing AND have the knowledge to do so.
Just about everyone else (as far as I can tell) very specifically does not want this, and for those who do, the amount of customizability e.g. MacOS offers is enough. Having an immediately usable computer (recent problems notwithstanding) is of much greater value.
So when you say "The main problem is that Apple wants to be opinionated" I can only conclude that you're coming at this from the 'hacker' POV. But I may be misunderstanding your comment.
I think the problem is that opinionatedness assumes that the average user exists and represents the majority of your users.
But every user is in many ways non-average.
Thus if you create a system tailored at the average user, then none of your users will be happy.
Of course I've been using Cosmic for most of the past year now... It's getting better, but there are still some rough edges... the launch bar still doesn't feel quite right, and there are still times where keyboard navigation doesn't work quite right/smoothly.
It's speedy though.
> "...while KDE and Gnome slowly get better and better"
These projects have been around for literally decades and really haven't changed much during that time. I think what you're noticing is that Linux desktops are as good as they always have been, but since Apple and Microsoft keep messing with theirs for marketing reasons, in comparison it seems that Linux GUIs are improving.
Gnome has improved significantly since the difficult Gnome 3 launch, and KDE Plasma was a massive upgrade that keeps getting better all the time.
This feels untrue. Granted I haven't tracked it closely, but the Adwaita design system and the GNOME HIG feel like relatively recent developments.
This is just not true at all. Yes Gnome and KDE are old, but they've changed SIGNIFICANTLY.
Gnome 2 => 3 was a bigger and more ambitious transition than anything Microsoft has done. Except maybe DOS => NT. Same thing with KDE 3 => 4.
KDE gets new features on a very regular basis and they're not just, like, little checkboxes added here or there. No. They're huge changes. New system resource monitor, new notification center, new widget editor, new panel editor, window tiling... the list goes on. And that's just, like, the past 2-ish years.
Linux GUIs are improving, and rapidly. Before, they were close. But the gap keeps widening. At this point, KDE is so unbelievably far ahead of windows in terms of UI, UX, usability, performance, and feature set that it doesn't seem fair. I don't know if Microsoft can catch up. And, if they could, it would take multiple versions of windows.
I disagree.
I've actually bought a Mac Mini which I use for media consumption and run it beside my Linux (CachyOS) gaming PC. I have a Jellyfin server, but the media client for Linux is totally broken.
And when you use an nvidia card, you really have to do a deep dive on which settings and which render client you want to run. I now have a stable solution running KDE Plasma via Wayland that allows games to run smoothly. It took me a while to figure that out.
The Linux community also, quite frankly, sucks. When you need to figure something out, you really have to make a study of it, and only if you know the correct jargon are you deemed worthy of help. Otherwise you're bombarded with RTFM comments.
Actually hunting for an i9 MacBook in good shape to switch to Linux after decades on Mac.
As long as you stay far away from Wayland, flatpaks, and nVidia drivers
Wayland and flatpaks work perfectly fine. nVidia drivers on the other hand...
My mother (age 70, non technical) uses Gnome with no issues.
Gnome is just perfect for non-techies :)
My mother and younger sister both prefer it over the default Windows 10/11 design. Mum says, "feels similar to my phone [pure Android 12] yet I can do so much more".
Given that my sister only really needs Steam Big Picture and everything my mother uses is already on Flathub or defined in a Nix flake, they didn't experience any ecosystem issues.
I don't love all the new Tahoe stuff, and I do wish I could roll back, but this hand-wringing around Apple is way overblown IMHO. What he is reporting is real, but in my actual usage I haven't noticed this at all. In other words, if this wasn't called out, I am not sure I would ever have realized it.
Tbh I have always found window management on Macs to be annoying and something to be avoided. Rectangle or something similar is one of the first things I install, and I try to use the shortcuts to just put windows in either a quarter or half of the screen.
That said, I use Macbooks for the hardware, if for whatever reason I had to switch to Linux I would just shrug and not care one bit. It took me a few years to realize, but MSFT just disappeared from my life one day and I didn't even notice.
Also, for some reason KDE renders everything super-fast/smoothly on my 120Hz 4k display, whereas macOS on Apple Silicon is often stuttering (no, it's not the Electron bug). The tables really turned, when I first switched to macOS on the desktop in 2007, the GPU-based rendering was insanely good compared to... pretty much everyone else.
Rather than evolutionary improvements we get Liquid Glass and ads in iWork applications. The enshittification has started I guess.
Sorry but you clearly haven't used macOS. Linux on the desktop is still about 15 years behind, and I tried it recently. It's such an inconsistent experience it's almost hilarious.
Speaking as a Tahoe user by the way who is not experiencing any issues to speak of (on 26.0.1 - and I can't reproduce the resizing inconsistency either). I've been using macOS since 2003 (back when it was called Mac OS X) and before that I was a Linux desktop user since 1996.
I used macOS as my daily driver from Tiger to last year, actually. I don’t know what the inconsistencies you’re referring to are, but I certainly prefer them to cloud account nagging and constant attempts to monetize user behavior, which is the modern macOS experience.
Which desktop experience did you try?
I'm a daily Mac OS X user (for a long time) and I think KDE Plasma is better.
I'd be curious to hear more specifics regarding the "15 years behind" and "inconsistent experience."
Inconsistent experience maybe, but does this inconsistency really get in the way of actual work?
If you're a developer or sys admin, sure. Or nowadays, if you're a gamer.
If your computer work is anything else, Macs are still decades ahead, with the highest-quality software available for any task at cheap prices.
I can't work with a sub-par e-mail client or calendar, no good invoicing app, no photo editing, etc.
And web apps do not cut it if working with these things is your job.
As for grandma and grandpa, iPad is their solution. With all the faults of the devices.