Comment by swat535
10 months ago
It's not just Apple, guys; it's everywhere.
Software quality has seriously declined across the board, from Spotify to Slack to core operating systems like Windows and macOS. I think a major factor is corporate culture, which largely ignores software quality. Another issue is that many engineers lack a solid understanding of CS fundamentals like performance, security, and reliability (perhaps this is why many are not able to solve basic algorithmic questions like linked lists or binary trees during interviews).
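To be clear about what I mean by "basic": reversing a singly linked list is about the level of these questions. A quick hypothetical sketch in TypeScript, purely for illustration:

```typescript
// Reversing a singly linked list in place: a classic CS-101
// interview exercise (an illustrative sketch, not from any
// particular interview).
class ListNode<T> {
  constructor(public value: T, public next: ListNode<T> | null = null) {}
}

function reverse<T>(head: ListNode<T> | null): ListNode<T> | null {
  let prev: ListNode<T> | null = null;
  while (head !== null) {
    const next = head.next; // remember the rest of the list
    head.next = prev;       // point the current node backwards
    prev = head;            // advance prev
    head = next;            // advance head
  }
  return prev;              // prev is the new head
}

// Usage: 1 -> 2 -> 3 becomes 3 -> 2 -> 1
const list = new ListNode(1, new ListNode(2, new ListNode(3)));
let node = reverse(list);
while (node) { console.log(node.value); node = node.next; }
```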
I've seen code written by so-called "senior" engineers that should never have made it past review; had they simply paid attention in their CS 101 courses, it wouldn't exist.
On top of that, as long as poor software quality doesn’t hurt a company's bottom line, why would executives care if their app takes 20 seconds to load?
Consumers have become desensitized to bloat, and regulators remain asleep at the wheel.
There are plenty of us who would love to just sit and fix things all day, but then you get a poor performance review for not shipping new features and find yourself out of a job :)
I wonder how I can join MSFT or Apple just to fix stuff. I don't care about salary as long as it's on par with my current one.
That's not how Apple works. You'd be given requirements specific to your team and expected to implement them. End of story. You wouldn't be empowered to seek out other teams and fix their stuff (or even necessarily talk to them). It's deliberate: very few people have that kind of cross-functional power.
I thought this once; it's a disappointing experience. You'll hear all the right things, and 3 years in, realize nothing you do matters to anyone, because all the managers who were so excited about your passion for software quality haven't met with you in 2 years. And then it clicks: they got promoted by knowing the game: features, resembling the rushed planning deck, delivered yearly. (This is a whole lot easier to whine about after banking the salary for 7 years, of course.)
If you got hired on as an L4, didn't mind missing out on promotions, and were OK working with your manager, it could definitely work.
Yep, fixing bugs/performance is just not valued by big companies.
No one gets promoted for doing it.
Honestly, I don't think it's a culture thing or a CS fundamentals thing.
I think it's the fact that software is 100x, or maybe even 1,000x, more complex than it was just 25 years ago.
Apps are built on so many libraries and frameworks and packages that an average application will have 100x the amount of code. And they're all necessary. A typical phone app is 200 MB, when Word 4.0 was less than 1 MB.
But it's not just the sheer number of lines of code that can have bugs. It's the conceptual complexity of modern software. An app has to have an interface that works with the mouse and with touch, large screens and small screens, regular resolution and retina, light mode and dark mode; it works offline and online with sync functionality; it works in 20 different languages; it works with accessibility; it works with a physical keyboard and an on-screen keyboard, over mobile data and over WiFi; it works with cloud storage and local storage; it goes on and on.
There are so many combinations of states to reason about. When I was building software for Win32 back in 1995, you worried about screen sizes and color depths. That was about it.
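To put a rough number on it (a back-of-the-envelope sketch of my own, nothing measured): every independent dimension multiplies the state space, so even the short list above already yields hundreds of configurations before you localize anything. The dimension names below are illustrative, not any real API:

```typescript
// A hypothetical sketch of how independent UI support dimensions
// multiply into a combinatorial state space.
const dimensions: Record<string, string[]> = {
  pointer: ["mouse", "touch"],
  screen: ["small", "large"],
  resolution: ["regular", "retina"],
  appearance: ["light", "dark"],
  connectivity: ["offline", "online"],
  keyboard: ["physical", "on-screen"],
  network: ["mobile-data", "wifi"],
  storage: ["local", "cloud"],
};

// Total distinct configurations = product of each dimension's size.
const total = Object.values(dimensions)
  .reduce((product, options) => product * options.length, 1);

console.log(total); // 2^8 = 256 -- and 20 languages makes it 5,120
```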
Software's just gotten incredibly complex, there's more to reason about, and software quality is just going to suffer. Like, I love Dark Mode at night, but at the same time I can't help but wonder what bugs would have gotten fixed if the engineering resources that went, and continue to go, into Dark Mode for each app had been spent elsewhere.
> And they're all necessary. A typical phone app is 200 MB, when Word 4.0 was less than 1 MB.
On native platforms, no, it's not.
I know this for a fact because I maintain moderately complex, functional phone apps with binary sizes that sit below the 30 MB mark. I use multiple desktop and mobile apps from other developers that also match this description.
The cause of the bloat there can be attributed to the following things, mostly:
- Apps including gobs of poorly optimized analytics/marketing garbage
- Third party libraries unnecessarily including their own gobs of poorly optimized analytics/marketing garbage
- Apps being wrappers of a web tech stack project built by devs who have zero dependency discipline, resulting in a spiral fractal tree of dependencies that takes up way more space than it needs to
Engineers who care about good engineering are pretty much a thing of the past. Today the game is buffing your resume with as many complex tools as possible, and jumping to the next thing before your pile of complexity crumbles under its own weight.
Some rose-tinted glasses looking at the past. There was a time when your entire computer would crash if you just looked at it funny.
Raw stability of software is much higher -- there are just more minor annoyances because there is also much more software.
The reason everything is built on millions of layers is not that it is actually required, but that we have invested a whole lot of time in building frameworks so that mediocre programmers can get fast results.
I would call it a culture issue, where we are not able to separate out the places where this is fine (new, interesting apps are great; I want as many as possible) from the places where it's destructive. I would be happy if none of the ways I interact with an OS had changed since Windows 7, but it had gotten faster, more secure, and more resilient.
macOS had more screen sizes to target in 2011 than the iPad does today; in any case, Apple has always tolerated iPad apps that are just blown-up phone apps. Mouse support for iPad apps existed as an accessibility feature before it was deemed a core feature. Even that isn't any kind of technological leap: mouse support has been part of Android for at least 15 years now.
None of this really explains the sloppiness and unfocused nature of Apple software, which had been best-in-class until recently.
Except those iPad apps also have to have a web app now, and if you don't have a custom macOS app, your iPad app has to look good when run on macOS. Then you have to support all iPhone models, and also maybe Windows and probably Android. 25 years ago you could slap "IBM PC Compatible" on software and basically design for like 5 color depths and maybe a few resolutions.
Update cycles used to be on the order of a year; now they're a week (which also means all new features need to be ported to all the platforms above in that timeframe). And that's not even mentioning the backend infrastructure and maintenance needed to run and sync all of this, when 25 years ago everything was on your local hard drive and maybe a floppy disk or CD-R.
I lean toward "culture" as the problem. Although, allowing for your 100x or 1000x complexity, how much of that complexity is from feature pile-on?
I imagine putting AirPlay in the software stack, just as an example, caused code perturbations all over the place. Sidecar feels like another disruptive feature. Never mind Catalyst, juggling Swift and C binaries, SwiftUI...
This stuff Apple brought upon themselves. I'm sure there will be plenty of opinions, though, as to whether some of these were worth the cost of disruption and ongoing maintenance.
I agree. The frameworks and tools we use are so complicated, but we're also so tied to the complexity that it's pretty much an anti-pattern to go outside the framework/toolkit.
I haven't fully thought this idea out, but I've been feeling it recently.
Left pad.
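(For anyone who missed the 2016 incident: unpublishing an 11-line npm package broke builds across the industry. From memory, the entire package was roughly this; a sketch of the idea, not the exact published source:)

```typescript
// A from-memory sketch of what left-pad did; the real package was
// plain JavaScript and about this long.
function leftPad(str: string, len: number, ch: string = " "): string {
  let padded = str;
  while (padded.length < len) {
    padded = ch + padded; // prepend the pad character until long enough
  }
  return padded;
}

console.log(leftPad("7", 3, "0")); // "007"
```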
> And they're all necessary
No. No they aren't.
Agile. Sprints. Firing QA departments.
I see these trends as negatively impacting app quality.
"User pain" as a euphemism for "lowest common denominator" apathetic / fear driven development.
Like, playing the Vulcan game of Kal Toh where you remove rods unintelligentlly and still believe your constructed structure (the app) is fully coherent, and instead it dissolves into uselessness.
This.
I'd like to hear from the HN commenters: does anyone here work for a modern and popular software company (something I might have used recently) and think that the software they make is really and truly bulletproof? Like, no backlog of hundreds of unfixed bugs and polish items that won't stop growing?
I don’t think I have met anyone who works at one of those places yet. I’d like to.
The SQLite people post here sometimes.
Yep, like the new Sonos app a few months ago.
All of that can be summarized with Electron, web developers, and the high availability of a workforce with somewhat low salaries…
Except none of Apple's stuff is Electron-based, and as far as I'm aware Apple salaries are competitive with top companies - so none of your arguments really holds up.
My reply was mostly for the parent comment.
Apple software is still top tier when you start comparing it to Slack, Teams, and all the non-native friends. Apple Music does not take close to 1 GB of RAM. There are very few native applications these days because of the cost, and the lower-cost option is built on the web stack and its lower skill barrier to entry.
macOS may have bugs, but in general it is well engineered, starting from the Secure Enclave, which none of the competitors have, to raw performance and battery life that is not just hardware. I haven't seen a single bug on my Watch in over a year. I guess it depends on what you use.
Most of the bugs we see these days originate from the choices behind the tech stacks. Python and pure JavaScript are still too popular. Every post here with Rust in the title gets attention because of its promise of some level of stability and a reduced resource footprint.