Comment by packetlost
10 months ago
I think it has more to do with a gradual industry-wide race to the bottom in terms of quality. Reliability, attention to detail, and correctness occupy a tiny fraction of the "budget" compared to security, slopping out features, and beating the competition to market. I suspect the biggest factors are startup culture being the crucible where a large portion of engineers learned their chops, and the massive amount of new blood in the industry who are primarily there for the money.
I concur. To add, I wonder how much of the “old guard” is still at Apple? Apple used to be perfectionistic when it came to software, even during the 1985-1996 interregnum when Steve Jobs was absent. Besides Steve Jobs, Apple also had people like Bruce Tognazzini and Don Norman who cared deeply about usability. When Apple purchased NeXT and built Mac OS X, Apple’s usability focus was married to reliable, stable infrastructure, culminating with Mac OS X Snow Leopard, which I believe was the pinnacle of the Mac experience. (Though I’m partial to the classic Mac OS from a UI point of view, Mac OS X had a better UX due to its stability.)
I suspect a lot of Apple’s decisions regarding software in the past decade are due to an increasing number of Apple employees who are not familiar with the philosophies of 1970s-era Xerox PARC, the classic Mac, NeXT, and Jobs-era Mac OS X. Granted, it’s possible to be too introspective, too focused on the past. Unfortunately, Apple’s software is losing its perfectionistic qualities, which have long been the selling point of the Mac compared to Windows and Linux.
I think you have rose-colored glasses on. System 7-8 at least were crash-prone disasters, and the 68K emulator was so bad on the first-gen PPC computers that you basically had to use SpeedDoubler - a much better third-party emulator.
Half the OS was still running under emulation.
> I think you have rose-colored glasses on. System 7-8 at least were crash-prone disasters
I think that is what was said:
> (Though I’m partial to the classic Mac OS from a UI point of view, Mac OS X had a better UX due to its stability.)
Linux seems like the opposite to me: a slow marathon toward perfection. With pipewire, systemd, and wayland there's less cruft than ever, and you get the best out-of-the-box experience since its inception.
Woah now, saying something positive about systemd will bring a bunch of crusty greybeards out of the woodwork who want their Linux to be as close to BSD4.4 as possible.
Jokes aside, I'm in agreement. Audio was still slightly buggy for me using an Elgato XLR USB interface, but it consistently worked, albeit with annoying workarounds. Linux is in a very good place for even normal consumers these days. I'm hoping Valve ends up making SteamOS a generalized gaming platform that pulls more market share away from Windows in that specific niche. I'm so ready.
I don't get the systemd hate; as a user I find it quite nice. There's a centralized place where all services live, and I can see all the stuff I use and need. Good CLI for inspecting services and getting logs.
But like, I don't manage Linux servers and stuff, so I am sure it sucks in certain very specific ways for people who need to deal with it day in and day out.
I remember my young days of using Slackware with init.d. That was hell.
Did pipewire actually build in their pulseaudio and JACK emulation, or is it still acting as a shim between already-running pulseaudio and JACK daemons?
Also (FWIW), I've had a fine time with JACK2, OpenRC, and Xorg. I had to do some manual work to tell JACK which sound card to use and to set up the pulseaudio backfill for software that doesn't know how to speak to JACK, [0] but everything else just works.
[0] The "tricky" part was disabling all pulseaudio backend modules but the JACK backend. This was -of course- not tricky at all.
It's a reimplementation of a pulseaudio server and a reimplementation of the jack client library. See https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/FAQ...
I have the occasional annoyance like "VLC has choppy audio for a few seconds after I seek," and "Gnome has gone full douchebag with notifications for everything and removing all the settings."
Other than that, though, Ubuntu on any old laptop (expensive ThinkPads are my favorite) is my go-to daily driver. Except at work, where I'm learning to deal with a (new, shiny, powerful) MacBook that I will use to... connect to a Linux VM, because that's the only way to work on our software. Seriously, a whole fleet of zillion-dollar MacBooks so we can all ssh into beefy VMs to build/test/deploy on Linux.
IT onboarding made a point that if you want to get a Windows laptop and wipe it for Linux, you need permission and a "good reason." How about "this is stupid, just let me work on stuff"? Of course it's about tech support and security, which is fair enough, but I feel like they have it backwards. Support Linux, and then require special permission for the $4000 ssh client...
After spending a couple of days with homebrew and building some things natively on aarch64, though, I might make a hobby out of moving stuff local. It really is a beautiful machine.
IMO the main argument for devs to use Linux is that Docker can run without a VM (and without Docker Desktop, which is now paid). If you do Docker stuff with any sort of frequency, it will save you years' worth of time.
> a gradual industry-wide race to the bottom in terms of quality
I'm going to disagree. This is a false nostalgia.
15 years ago the market for consumer laptops that were not MacBooks straight up sucked. If you walk into a Best Buy today, almost any laptop you buy is going to blow any laptop from back then out of the water in terms of build quality. And credit where it's due, in no small part it came from playing catch up with Apple.
I am not referring to hardware. Hardware quality has largely improved, software quality has largely gotten worse.
I think there was a sweet spot in the late 2000s and early 2010s, more specifically, the Windows 7 and Mac OS X Snow Leopard eras.
On the Windows side of things, this was when Microsoft got serious about security; security problems plagued earlier versions of Windows XP (worms were so rampant around 2005) until later service packs helped fix things. Windows 7 was solid and performant. While my favorite version of Windows is 2000, 7 was another high mark for Windows.
Much has been said about Snow Leopard, but it was the pinnacle of Mac OS X, the refinement of an already great OS, Leopard. I would gladly use Snow Leopard today if it weren’t for needing current web browsers and up-to-date security patches.
Even the Web was better back then. By 2008 many mainstays of modern Web life, such as social media and YouTube, were already in existence. Google was excellent. Internet Explorer’s dominance was successfully challenged, and there was an ecosystem of standards-compliant browsers (later IE versions, Firefox, Safari, Chrome, Opera). Web developers were coding to standards instead of only writing for one browser. Yes, ads existed, and there was also malware, but ads were less intrusive, and malware could be avoided with more careful browsing.
I miss 2009- and 2010-era computing, when Windows and Mac OS X were at their peaks, when the browser ecosystem was diverse, and when many commercial websites like Facebook were still pleasant to use.
Yea I totally agree. This is selective memory.
Perhaps there were peaks and troughs in individual technologies. Late 2000s / early 2010s felt like a good time for operating systems, for instance.
But is everyone forgetting having to navigate through Flash websites and Java Applets using Internet Explorer, for instance?
Also, people are just forgetting. There’s nostalgia in this thread about the iTunes desktop app, for instance. That program has been a pile of trash for as long as I can remember, going back to the 2000s.
iTunes is one of the best pieces of software of all time; you are crazy. It existed before OS X was even a thing (under another name, but still).
It only became "problematic" when they tried to overload it too much in order to "support" the iPod/iPhone on Windows without having to develop dedicated software.
They largely killed it and the replacement is lackluster. The best version was around version 10-11 with the colorized album view.
To this day there is no audio library management software that comes close to what iTunes was. Apple Music, being a fork, is the closest thing, but it's not really the same thing at all.
As someone who recently walked into a Best Buy with a family member and bought a laptop, I respectfully disagree.
All that store sells is hot garbage.
> build quality
Tell that to Dell and their shit trackpads and death-prone battery charging circuits. And the joy of soldered RAM that you cannot upgrade can't be overstated enough.
I think that, regarding the combination of usability and stability, the Win XP/7 era was still unbeatable.
WinXP was just an ugly face on Windows 2000 Workstation without an EOL version of DirectX for gaming.