Comment by caseyy
3 days ago
There is an argument to be made that the market buys bug-filled, inefficient software about as well as it buys pristine software. And one of them is the cheapest software you could make.
It's similar to the "Market for Lemons" story. In short, the market sells as if all goods were high-quality but underhandedly reduces the quality to reduce marginal costs. The buyer cannot differentiate between high and low-quality goods before buying, so the demand for high and low-quality goods is artificially even. The cause is asymmetric information.
This is already true and will become increasingly more true for AI. The user cannot differentiate between sophisticated machine learning applications and a washing machine spin cycle calling itself AI. The AI label itself commands a price premium. The user overpays significantly for a washing machine[0].
It's fundamentally the same thing when a buyer overpays for crap software, thinking it's designed and written by technologists and experts. But IC1-3s write 99% of software, and the 1 QA guy in 99% of tech companies is the sole measure to improve quality beyond "meets acceptance criteria". Occasionally, a flock of interns will perform an "LGTM" incantation in hopes of improving the software, but even that is rarely done.
[0] https://www.lg.com/uk/lg-experience/inspiration/lg-ai-wash-e...
The dumbest and most obvious of realizations finally dawned on me after trying to build a software startup that was based on quality differentiation. We were sure that a better product would win people over and lead to viral success. It didn’t. Things grew, but so slowly that we ran out of money after a few years before reaching break even.
What I realized is that lower costs, and therefore lower quality, are a competitive advantage in a competitive market. Duh. I’m sure I knew and said that in college and for years before my own startup attempt, but this time I really felt it in my bones. It suddenly made me realize exactly why everything in the market is mediocre, and why high quality things always get worse when they get more popular. Pressure to reduce costs grows with the scale of a product. Duh. People want cheap, so if you sell something people want, someone will make it for less by cutting “costs” (quality). Duh. What companies do is pay the minimum they need in order to stay alive & profitable. I don’t mean it never happens, sometimes people get excited and spend for short bursts, young companies often try to make high quality stuff, but eventually there will be an inevitable slide toward minimal spending.
There’s probably another name for this, it’s not quite the Market for Lemons idea. I don’t think this leads to market collapse, I think it just leads to stable mediocrity everywhere, and that’s what we have.
This is also the exact reason why all the bright-eyed pieces claiming that some technology will increase workers' productivity and therefore allow more leisure time for the worker (the 20-hour workweek etc.) are either hopelessly naive or pure propaganda.
Increased productivity means that the company has a new option to either reduce costs or increase output at no additional cost, one of which it has to do to stay ahead in the rat-race of competitors. Investing the added productivity into employee leisure time would be in the best case foolish and in the worst case suicidal.
Which is why government regulations that set the boundaries for what companies can and can't get away with (such as but not limited to labor laws) are so important. In absence of guardrails, companies will do anything to get ahead of the competition. And once one company breaks a norm or does something underhanded, all their competitors must do the same thing or they risk ceding a competitive advantage. It becomes a race to the bottom.
Of course we learned all this a century ago; it's why we have things like the FDA in the first place. But this new generation of techno-libertarians and DOGE folks, who grew up in a "move fast and break things" era, in the cleanest and safest times the world has ever seen, have no understanding of or care for the dangers here, and are willing to throw it all away because of imagined inefficiencies. Regulations are written in blood, and those who remove them will have new blood on their hands.
Yes, this is an observation I've made about the illusion of choice in so-called free markets.
In actuality, everyone is doing the same thing and their decisions are already made for them. Companies don't just act evil because they are evil. They act evil because all they can ever be is evil. If they don't, then they lose. So what's left?
Facebook becoming an ad-ridden disaster was, in a way, predestined. Unavoidable.
> 20 hour workweek etc
We have that already. It's called part-time jobs. Usually they don't pay as much as full-time jobs, provide no health insurance or other benefits, etc.
Indeed, and I don't know why people keep saying that we ever thought the 20 hour workweek was feasible, because there is always more work to be done. Work expands to fill the constraints available, similar to Parkinson's Law.
This misses a huge part of the story: an increase in productivity means a larger economy, means more efficient use of resources, means compensation goes up over time. If you want to live the life of somebody who worked 40h a week 40 years ago and only work 20h, you can already have most of that, and still have many options somebody back then didn't have that are virtually free.
The actual realization is that most people would simply rather work 40h a week (or more) and spend their money on whatever they want to spend it on.
Especially many of us here can do so easily. I personally work 80% and could reduce it further if my goal were maximum leisure.
By far the biggest reason it doesn't feel that way is that housing policies in most of the Western world have been utterly and completely braindead. That, and the ever-increasing cost of health care as people get older and older.
You're on the right track, but missing an important aspect.
In most cases the company making the inferior product didn't spend less. But they did spend differently. As in, they spent a lot on marketing.
You were focused on quality, and hoped for viral word of mouth marketing. Your competitors spent the same as you, but half their budget went to marketing. Since people buy what they know, they won.
Back in the day, MS made Windows 95. IBM made OS/2. MS spent a billion dollars on marketing Windows 95. That's a billion back when a billion was a lot. Just for the launch.
Techies think that quality leads to sales. It does not. Marketing leads to sales. There literally is no secret to business success other than internalizing that fact.
Quality can lead to sales - this was the premise behind the original Google (they never spent a dime on advertising their own product until the Parisian Love commercial [1] came out in 2009, a decade after founding), and a few other tech-heavy startups like Netscape or Stripe. Microsoft certainly didn't spend a billion $ marketing Altair Basic.
The key point to understand is the only effort that matters is that which makes the sale. Business is a series of transactions, and each individual transaction is binary: it either happens or it doesn't. Sometimes, you can make the sale by having a product which is so much better than alternatives that it's a complete no-brainer to use it, and then makes people so excited that they tell all their friends. Sometimes you make the sale by reaching out seven times to a prospect that's initially cold but warms up in the face of your persistence. Sometimes, you make the sale by associating your product with other experiences that your customers want to have, like showing a pretty woman drinking your beer on a beach. Sometimes, you make the sale by offering your product 80% off to people who will switch from competitors and then jacking up the price once they've become dependent on it.
You should know which category your product fits into, and how and why customers will buy it, because that's the only way you can make smart decisions about how to allocate your resources. Investing in engineering quality is pointless if there is no headroom to deliver experiences that will make a customer say "Wow, I need to have that." But if you are sitting on one of those gold mines, capitalizing on it effectively is orders of magnitude more efficient than trying to market a product that doesn't really work.
[1] https://www.youtube.com/watch?v=nnsSUqgkDwU
It's not just software -- my wife owns a restaurant. Operating a restaurant, you quickly learn the sad fact that quality is just not that important to your success.
We're still trying to figure out the marketing. I'm convinced the high failure rate of restaurants is due largely to founders who know how to make good food and think their culinary skills plus word-of-mouth will get them sales.
Pure marketing doesn’t always win. There are counter examples.
Famously Toyota beat many companies that were basing their strategy on marketing rather than quality.
They were able to use quality as part of their marketing.
My father in law worked in a car showroom and talks about when they first installed carpet there.
No one did that previously. The subtle point to customers being that Toyotas didn’t leak oil.
IIRC, Microsoft was also charging Dell for a copy of Windows even if they didn't install it on the PC! And yeah OS/2 was ahead by miles.
This is a massive oversimplification of the Windows and OS/2 story. Anybody that has studied this understands that it wasn't just marketing. I can't actually believe that anybody who has read deeply about this believes it was just marketing.
And it's also a cherry-picked example. There are so many counter-examples: how Sun out-competed HP, IBM, Apollo, and DEC. Or how AMD in the last 10 years out-competed Intel. Sure, it's all marketing. I could go on with hundreds of examples just from computer history.
Marketing is clearly an important aspect of business; nobody denies that. But there are many other things that are important as well. You can have the best marketing in the world: if you fuck up your production and your supply chain, your company is toast. You can have the best marketing in the world: if your product sucks, people will reject it (see the BlackBerry Storm as a nice example). You can have the best marketing in the world: if your finance people fuck up, the company might go to shit anyway.
Anybody that reaches for simple explanations like 'marketing always wins' is just talking nonsense.
> What I realized is that lower costs, and therefore lower quality,
This implication is the big question mark. It's often true but it's not at all clear that it's necessarily true. Choosing better languages, frameworks, tools and so on can all help with lowering costs without necessarily lowering quality. I don't think we're anywhere near the bottom of the cost barrel either.
I think the problem is focusing on improving the quality of the end products directly when the quality of the end product for a given cost is downstream of the quality of our tools. We need much better tools.
For instance, why are our languages still obsessed with manipulating pointers and references as a primary mode of operation, just so we can program yet another linked list? Why can't you declare something as a "Set with O(1) insert" and let the language or its runtime choose an implementation? Why isn't direct relational programming more common? I'm not talking about programming in verbose SQL, but something more modern with type inference and proper composition, more like LINQ, e.g. why can't I do:
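A hedged sketch of the kind of query being asked for, using Java Streams as a stand-in for the imagined LINQ-like syntax (the `Order` type, field names, and data are all hypothetical, purely for illustration):

```java
import java.util.List;

public class DeclarativeQuery {
    // Hypothetical record for illustration.
    record Order(String customer, double total) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
            new Order("alice", 120.0),
            new Order("bob", 30.0),
            new Order("alice", 75.0));

        // Roughly: select distinct customer from orders
        //          where total > 50 order by customer
        List<String> bigSpenders = orders.stream()
            .filter(o -> o.total() > 50)  // declarative predicate
            .map(Order::customer)          // projection
            .distinct()                    // dedupe
            .sorted()                      // ordering
            .toList();

        System.out.println(bigSpenders); // prints [alice]
    }
}
```

The pipeline states what to compute (filter, project, dedupe, sort) and leaves the how (iteration, hashing) to the library, which is exactly the kind of abstraction the comment is wishing for at the language level.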
These abstract over implementation details that we're constantly fiddling with in our end programs, often for little real benefit. Studies have repeatedly shown that humans can write less than 20 lines of correct code per day, so each of those lines should be as expressive and powerful as possible to drive down costs without sacrificing quality.
You can do this in Scala[0], and you'll get type inference and compile time type checking, informational messages (like the compiler prints an INFO message showing the SQL query that it generates), and optional schema checking against a database for the queries your app will run. e.g.
This integrates with a high-performance functional programming framework/library that has a bunch of other stuff like concurrent data structures, streams, an async runtime, and a webserver[1][2]. The tools already exist. People just need to use them.
[0] https://github.com/zio/zio-protoquill?tab=readme-ov-file#sha...
[1] https://github.com/zio
[2] https://github.com/zio/zio-http
Hm, you could do that quite easily, but there isn't much juice to be squeezed from runtime-selected data structures. Set with O(1) insert:
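In Java that needs no special language support at all; a minimal sketch (variable names made up for illustration):

```java
import java.util.HashSet;
import java.util.Set;

public class FastSet {
    public static void main(String[] args) {
        // Amortized O(1) insert comes free with the standard HashSet.
        Set<String> seen = new HashSet<>();
        seen.add("a");
        seen.add("b");
        seen.add("a"); // duplicate: ignored, still O(1)
        System.out.println(seen.size()); // prints 2
    }
}
```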
Done. Don't need any fancy support for that. Or if you want to load from a database, using the repository pattern and Kotlin this time instead of Java:
That would turn into an efficient SQL query that does a WHERE ... AND ... clause. But you can also compose queries in a type safe way client side using something like jOOQ or Criteria API.
Your argument makes sense. I guess now it's your time to shine and to be the change you want to see in the world.
Isn't this a list comprehension in Python https://www.w3schools.com/python/python_lists_comprehension.... ?
Clojure, friend. Clojure.
Other functional languages, too, but Clojure. You get exactly this, minus all the <'s =>'s ;'s and other irregularities, and minus all the verbosity...
I consider functional thinking and ability to use list comprehensions/LINQ/lodash/etc. to be fundamental skills in today's software world. The what, not the how!
> I don’t think this leads to market collapse
You must have read that the Market for Lemons is a type of market failure or collapse. Market failure (in macroeconomics) does not yet mean collapse. It describes a failure to allocate resources in the market such that the overall welfare of the market participants decreases. With this decrease may come a reduction in trade volume. When the trade volume decreases significantly, we call it a market collapse. Usually, some segment of the market that existed ceases to exist (example in a moment).
There is a demand for inferior goods and services, and a demand for superior goods. The demand for superior goods generally increases as the buyer becomes wealthier, and the demand for inferior goods generally increases as the buyer becomes less wealthy.
In this case, wealthier buyers cannot buy the superior relevant software previously available, even if they create demand for it. Therefore, we would say a market fault has developed as the market could not organize resources to meet this demand. Then, the volume of high-quality software sales drops dramatically. That market segment collapses, so you are describing a market collapse.
> There’s probably another name for this
You might be thinking about "regression to normal profits" or a "race to the bottom." The Market for Lemons is an adjacent scenario to both, where a collapse develops due to asymmetric information in the seller's favor. One note about macroecon — there's never just one market force or phenomenon affecting any real situation. It's always a mix of some established and obscure theories.
The Wikipedia page for Market for Lemons more or less summarizes it as a condition of defective products caused by information asymmetry, which can lead to adverse selection, which can lead to market collapse.
https://en.m.wikipedia.org/wiki/The_Market_for_Lemons
The Market for Lemons idea seems like it has merit in general but is too strong and too binary to apply broadly, that’s where I was headed with the suggestion for another name. It’s not that people want low quality. Nobody actually wants defective products. People are just price sensitive, and often don’t know what high quality is or how to find it (or how to price it), so obviously market forces will find a balance somewhere. And that balance is extremely likely to be lower on the quality scale than what people who care about high quality prefer. This is why I think you’re right about the software market tolerating low quality; it’s because market forces push everything toward low quality.
My wife has a perfume business. She makes really high quality extrait de parfums [1] with expensive materials and great formulations. But the market is flooded with eau de parfums -- which are far more diluted than an extrait -- using cheaper ingredients, selling for about the same price. We've had so many conversations about whether she should dilute everything like the other companies do, but you lose so much of the beauty of the fragrance when you do that. She really doesn't want to go the route of mediocrity, but that does seem to be what the market demands.
[1] https://studiotanais.com/
> [1] https://studiotanais.com/
First, honest impression: At least on my phone (Android/Chromium) the typography and style of the website don't quite match that "high quality & expensive ingredients" vibe the parfums are supposed to convey. The banners (3 at once on the very first screen, one of them animated!), italic text, varying font sizes, and janky video header would be rather off-putting to me. Maybe it's also because I'm not a huge fan of flat designs, partially because I find they make it difficult to visually distinguish important and less important information, but also because I find them a bit… unrefined and inelegant. And, again, this is on mobile, so maybe on desktop it comes across differently.
Disclaimer: I'm not a designer (so please don't listen only to me and take everything with a grain of salt) but I did work as a frontend engineer for a luxury retailer for some time.
To be blunt, this website looks like a scam-website redirector, the kind where you have to click through 49 ads and wait 3 days before you get to your link. The video playing immediately makes me think it's a Google ad unrelated to what the website is about. The different font styles remind me of the middle-school HTML projects we had to do, with each line in a different size and font face to prove we knew how to use <font face> and <font size>. All it's missing is a Jokerman font.
She should double the price so customers wonder why hers costs so much more. Then have a sales pitch explaining the difference.
Some customers WANT to pay a premium just so they know they’re getting the best product.
Offer an eau de parfum line for price anchoring and market segmentation. Win-win.
Is that what the market demands, or is the market unable to differentiate?
From the site there's a huge assumption that potential customers are aware of what extrait de parfum is vs eau de parfum (or even eau de toilette!).
Might be worth a call out that these fragrances are in fact a standard above the norm.
"The highest quality fragrance money can buy" kind of thing.
> But the market is flooded with eau de parfums -- which are far more diluted than a extrait -- using cheaper ingredients, selling for about the same price.
Has she tried raising prices? To signal that her product is high quality and thus more expensive than her competition?
I had the same realization but with car mechanics. If you drive a beater you want to spend the least possible on maintenance. On the other hand, if the car mechanic cares about cars and their craftsmanship they want to get everything to tip-top shape at high cost. Some other mechanics are trying to scam you and get the most amount of money for the least amount of work. And most people looking for car mechanics want to pay the least amount possible, and don't quite understand if a repair should be expensive or not. This creates a downward pressure on price at the expense of quality and penalizes the mechanics that care about quality.
Luckily for mechanics, the supply of actual blue-collar, hands-on labor is so small that good mechanics can actually charge more.
The issue is that you have to be able to distinguish a good mechanic from a bad mechanic, because they all get to charge a lot thanks to the shortage. Same thing for plumbing, electrical, HVAC, etc.
But I understand your point.
Exactly. People on HN get angry and confused about low software quality, compute wastefulness, etc., but what's happening is not a moral crisis: the market has simply chosen the trade-off it wants, and industry has adapted to it.
If you want to be rewarded for working on quality, you have to find a niche where quality has high economic value. If you want to put effort into quality regardless, that's a very noble thing and many of us take pleasure in doing so, but we shouldn't act surprised when we aren't economically rewarded for it
I actually disagree. I think that people will pay more for higher quality software, but only if they know the software is higher quality.
It's great to say your software is higher quality, but the question I have is whether or not it is higher quality with the same or similar features, and second, whether the better quality is known to the customers.
It's the same way that I will pay hundreds of dollars for Jetbrains tools each year even though ostensibly VS Code has most of the same features, but the quality of the implementation greatly differs.
If a new company made their IDE better than jetbrains though, it'd be hard to get me to fork over money. Free trials and so on can help spread awareness.
The Lemon Market exists specifically when customers cannot tell, prior to receipt and usage, whether they are buying high quality or low quality.
> but only if they know the software is higher quality.
I assume all software is shit in some fashion because every single software license includes a "no fitness for any particular purpose" clause. Meaning, if your word processor doesn't process words, you can't sue them.
When we get consumer protection laws that require that software does what it says on the tin, quality will start mattering.
It can depend on the application/niche.
I used to write signal processing software for land mobile radios. Those radios were used by emergency services. For the most part, our software was high quality in that it gave good quality audio and rarely had problems. If it did have a problem, it would recover quickly enough that the customer would not notice.
Our radios got a name for reliability: such as feedback from customers about skyscrapers in New York being on fire and the radios not skipping a beat during the emergency response. Word of mouth traveled in a relatively close knit community and the "quality" did win customers.
Oddly we didn't have explicit procedures to maintain that quality. The key in my mind was that we had enough time in the day to address the root cause of bugs, it was a small enough team that we knew what was going into the repository and its effect on the system, and we developed incrementally. A few years later, we got spread thinner onto more products and it didn't work so well.
Which skyscrapers (plural)? Fires in NYC high-rise buildings are incredibly rare in the last 20 years.
I kind of see this in action when I'm comparing products on Amazon. When comparing two products on Amazon that are substantially the same, the cheaper one will have way more reviews. I guess this implies that it has captured the majority of the market.
I think this honestly has more to do with mostly Chinese sellers engaging in review fraud, which is a rampant problem. I'm not saying non-Chinese sellers don't engage in review fraud, but I have noticed a trend that around 98% of fake or fraudulently advertised products are of Chinese origin.
If it was just because it was cheap, we'd also see similar fraud from Mexican or Vietnamese sellers, but I don't really see that.
Luxury items however seem to buck this trend, but this is all about conspicuous consumption.
There's an analogy with evolution. In that case, what survives might be the fittest, but it's not the fittest possible. It's the least fit that can possibly win. Anything else represents an energy expenditure that something else can avoid, and thus outcompete.
I had the exact same experience trying to build a startup. The thing that always puzzled me was Apple: they've grown into one of the most profitable companies in the world on the basis of high-quality stuff. How did they pull it off?
They focused heavily on the quality of things you can see, i.e. slick visuals, high build quality, even fancy cardboard boxes.
Their software quality itself is about average for the tech industry. It's not bad, but not amazing either. It's sufficient for the task and better than their primary competitor (Windows). But, their UI quality is much higher, and that's what people can check quickly with their own eyes and fingers in a shop.
"Market comes first, marketing second, aesthetic third, and functionality a distant fourth" ― Rob Walling in "Start Small, Stay Small"
Apple's aesthetic is more important than the quality (which has been deteriorating lately)
Not on Macintosh. On iPod, iPhone and iPad.
All of those were marketed as just-barely-affordable consumer luxury goods. The physical design and the marketing were more important than the specs.
By being a luxury consumer company. There is no luxury (quality) enterprise software. There is lock-in-extortion enterprise software.
Apple's supposed high quality is mostly marketing.
They have constant, frequent, hardware design issues that they just don't even acknowledge and somehow people still treat their hardware as "high quality"
They once shipped a phone that lost signal if you held it with your hand. Their solution, after insisting that people hold their phone differently, was cheap plastic cases.
They shipped a new keyboard that would fail after singular grains of dust got into it, in order to save a millimeter of thickness on a product that was already quite thin. In order to repair or replace the keyboard, you have to replace half of the whole machine, for half the price of a brand new laptop.
Apple does not spend real effort on hardware quality.
This is a really succinct analysis, thanks.
I'm thinking out loud, but it seems like there are some other factors at play. There's a lower threshold of quality that needs to be met (the thing needs to work), so there are at least two big factors: functionality and cost. In the extreme, all other things being equal, if two products were presented at the exact same cost but one was of superior quality, the expectation is that the better-quality item would win.
There's always the "good, fast, cheap" triangle but with Moore's law (or Wright's law), cheap things get cheaper, things iterate faster and good things get better. Maybe there's an argument that when something provides an order of magnitude quality difference at nominal price difference, that's when disruption happens?
So, if the environment remains stable, then mediocrity wins as the price of superior quality can't justify the added expense. If the environment is growing (exponentially) then, at any given snapshot, mediocrity might win but will eventually be usurped by quality when the price to produce it drops below a critical threshold.
You're laying it out like it's universal, but in my experience there are products where people will seek the cheapest good-enough option, and there are other products where people know they want quality and are willing to pay more.
Take cars, for instance: if everyone wanted the cheapest one, then Mercedes or even Volkswagen would be out of business.
Same for professional tools and products: you save more by buying quality products.
And then, even in computers and technology: Apple iPhones aren't cheap at all, MacBooks come with soldered RAM and storage at high prices, yet a big part of people are willing to buy them instead of the usual bloated, spyware-ridden Windows laptop that runs well enough and is cheap.
> the cheapest one then Mercedes or even Volkswagen would be out of business
I would argue this is a bad example - most luxury cars aren't really meaningfully "better", they just have status-symbol value. A mid-range Honda Civic or Toyota Corolla is not "worse" than a Mercedes by most objective measurements.
Not everyone wants the cheapest, but lemons fail and collapse the expensive part of the market with superior goods.
To borrow your example, it's as if Mercedes started giving every 4th customer a Lada instead (after the papers are signed). The expensive Mercedes market would quickly no longer meet the luxury demand of wealthy buyers and collapse. Not the least because Mercedes would start showing super-normal profits, and all other luxury brands would get in on the same business model. It's a race to the bottom. When one seller decreases the quality, so must others. Otherwise, they'll soon be bought out, and that's the best-case scenario compared to being outcompeted.
There is some evidence that the expensive software market has collapsed. In the 00s and 90s, we used to have expensive and cheap video games, expensive and cheap video editing software, and expensive and cheap office suites. Now, we have homogeneous software in every niche — similar features and similar (relatively cheap) prices. AAA game companies attempting to raise their prices back to 90s levels (which would make a AAA game $170+ in today's money) simply cannot operate in the expensive software market. First, there was consumer distrust due to broken software, then there were no more consumers in that expensive-end market segment.
Hardware you mention (iPhones, Androids, Macs, PCs) still have superior and inferior hardware options. Both ends of the market exist. The same applies to most consumer goods - groceries, clothes, shoes, jewelry, cars, fuel, etc. However, for software, the top end of the market is now non-existent. It's gone the way of expensive secondary market (resale) cars, thanks to how those with hidden defects undercut their price and destroyed consumer trust.
If you’re trying to sell a product to the masses, you either need to make it cheap or a fad.
You cannot make a cheap product with high margins and get away with it. Motorola tried with the RAZR. They had about five or six good quarters from it and then within three years of initial launch were hemorrhaging over a billion dollars a year.
You have to make premium products if you want high margins. And premium means you’re going for 10% market share, not dominant market share. And if you guess wrong and a recession happens, you might be fucked.
Yes, I was in this place too when I had a consulting company. We bid on projects with quotes for high quality work and guaranteed delivery within the agreed timeframe. More often than not we got rejected in favor of some students who submitted a quote for 4x less. I sometimes asked those clients how the project went, and they'd say, well, those guys missed the deadline and asked for more money several times
> We were sure that a better product would win people over and lead to viral success. It didn’t. Things grew, but so slowly that we ran out of money after a few years before reaching break even.
Relevant apocrypha: https://www.youtube.com/watch?v=UFcb-XF1RPQ
These economic forces exist in math too. Almost every mathematician publishes informal proofs. These contain just enough discussion in English (or another human language) to convince a few other mathematicians in the same field that their idea is valid. But it is possible to make errors. There are more reliable techniques: formal step-by-step proof presentations (e.g. by Leslie Lamport) or computer-checked proofs. But almost no mathematician uses these.
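For a concrete (toy) sense of what a computer-checked proof is, here is a Lean 4 example, not drawn from Lamport's work; the kernel verifies every inference step, so an invalid argument simply fails to compile:

```lean
-- Toy example of a machine-checked proof: commutativity of addition
-- on the naturals, justified by a standard library lemma. The Lean
-- kernel checks the proof term; nothing is taken on trust.
theorem my_add_comm (m n : Nat) : m + n = n + m :=
  Nat.add_comm m n
```

The trade-off, and plausibly why adoption stays low, is that formalizing a research-level argument at this granularity usually costs far more effort than writing the informal version.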
I feel your realization, and I still hope my startup will have a competitive edge through quality.
In this case quality also means code quality, which in my view should lead to faster feature development.
The problem with your thesis is that software isn't a physical good, so quality isn't tangible. If software does the advertised thing, it's good software. That's it.
With physical items, quality prevents deterioration over time. Or at least slows it. Improves function. That sort of thing.
Software just works or doesn't work. So you want to make something that works and iterate as quickly as possible. And yes, cost to produce it matters so you can actually bring it to market.
I'm a layman, but in my opinion building quality software can't really be a differentiator because anyone can build quality software given enough time and resources. You could take two car mechanics and with enough training, time, assistance from professional dev consultants, testing, rework, so and so forth, make a quality piece of software. But you'd have spent $6 million to make a quality alarm clock app.
A differentiator would be having the ability to have a higher than average quality per cost. Then maybe you're onto something.
It depends on who is paying versus using the product. If the buyer is the user, they tend to value quality more than otherwise.
Do you drive the cheapest car, eat the cheapest food, wear the cheapest clothes, etc.?
I see another dynamic: "customer value" features get prioritized until the product reaches a point of crushing tech debt, at which point delivery velocity on those same "customer value" features grinds to a halt. Subject to other forces, obviously, but it's not infrequent for someone to come in and disrupt the incumbents at exactly this point.
> People want cheap
There is an exception: luxury goods. Some are expensive, but people don't mind them being overpriced because e.g. they are social status symbols. Is there such a thing like "luxury software"? I think Apple sort of has this reputation.
>lower costs, and therefore lower quality,
Many high-quality open-source designs suggest this is a false premise, and as a developer who writes high-quality and reliable software for much much lower rates than most, cost should not be seen as a reliable indicator of quality.
"Quality is free"[1], luxury isn't.
Also, one should not confuse the quality of the final product and the quality of the process.
[1] https://archive.org/details/qualityisfree00cros
There’s probably another name for this
Capitalism? Marx's core belief was that capitalists would always lean towards paying the absolute lowest price they could for labor and raw materials that would allow them to stay in production. If there's more profit in manufacturing mediocrity at scale than quality at a smaller scale, mediocrity it is.
Not all commerce is capitalistic. If a commercial venture is dedicated to quality, or maximizing value for its customers, or the wellbeing of its employees, then it's not solely driven by the goal of maximizing capital. This is easier for a private than a public company, in part because of a misplaced belief that maximizing shareholder return is the only legally valid business objective. I think it's the corporate equivalent of diabetes.
In the 50s and 60s, capitalism used to mean stakeholder capitalism: dedicated to maximizing value for stakeholders such as customers, employees, society, etc.
But that shifted later, with Milton Friedman, who pushed the idea of shareholder capitalism in the 70s. Where companies switched to thinking the only goal is to maximize shareholder value.
In his theory, government would provide regulation and policies to address stakeholders' needs, so companies only needed to focus on shareholders.
In practice, lobbying, propaganda and corruption made it so governments dropped the ball and also sided to maximize shareholder value, along with companies.
But do you think you could have started with a bug laden mess? Or is it just the natural progression down the quality and price curve that comes with scale
> People want cheap, so if you sell something people want, someone will make it for less by cutting “costs” (quality).
Sure, but what about the people who consider quality as part of their product evaluation? All else being equal everyone wants it cheaper, but all else isn't equal. When I was looking at smart lighting, I spent 3x as much on Philips Hue as I could have on Ikea bulbs: bought one Ikea bulb, tried it on next to a Hue one, and instantly returned the Ikea one. It was just that much worse. I'd happily pay similar premiums for most consumer products.
But companies keep enshittifying their products. I'm not going to pay significantly more for a product which is going to break after 16 months instead of 12 months. I'm not going to pay extra for some crappy AI cloud blockchain "feature". I'm not going to pay extra to have a gaudy "luxury" brand logo stapled all over it.
Companies are only interested in short-term shareholder value these days, which means selling absolute crap at premium prices. I want to pay extra to get a decent product, but more and more it turns out that I can't.
>There’s probably another name for this, it’s not quite the Market for Lemons idea. I don’t think this leads to market collapse, I think it just leads to stable mediocrity everywhere, and that’s what we have.
It's the same concept as the age old "only an engineer can build a bridge that just barely doesn't fall down" circle jerk but for a more diverse set of goods than just bridges.
Maybe you could compete by developing new and better products? Ford isn't selling the same car with lower and lower costs every year.
It's really hard to reconcile your comment with Silicon Valley, which was built by often expensive innovation, not by cutting costs. Were Apple, Meta, Alphabet, Microsoft successful because they cut costs? The AI companies?
Microsoft yes, the PC market made it very hard for Apple to compete on price.
Meta and Alphabet had zero cost products (to consumers) that they leveraged to become near monopolies.
Aren’t all the AI companies believed to be providing their products below cost for now to grab market share?
I'm proud of you. It often takes people multiple failures before they accept that their worldview (that regulations aren't necessary and that the tragedy of the commons is a myth) is wrong.
I’d argue this exists for public companies, but there are many smaller, private businesses where there’s no doctrine of maximising shareholder value
These companies often place a greater emphasis on reputation and legacy. They're few and far between, but Robert McNeel & Associates (American, makers of Rhino3D) is one that comes to mind, as is the Dutch company Victron (power hardware).
The former especially is not known for maximising their margins, they don’t even offer a subscription-model to their customers
Victron is an interesting case, where they deliberately offer few products, and instead of releasing more, they heavily optimise and update their existing models over many years in everything from documentation to firmware and even new features. They’re a hardware company mostly so very little revenue is from subscriptions
> There’s probably another name for this
Race to the bottom
> the market sells as if all goods were high-quality
The phrase "high-quality" is doing work here. The implication I'm reading is that poor performance = low quality. However, the applications people are mentioning in this comment section as low performance (Teams, Slack, Jira, etc) all have competitors with much better performance. But if I ask a person to pick between Slack and, say, a a fast IRC client like Weechat... what do you think the average person is going to consider low-quality? It's the one with a terminal-style UI, no video chat, no webhook integrations, and no custom avatars or emojis.
Performance is a feature like everything else. Sometimes, it's a really important feature; the dominance of Internet Explorer was destroyed by Chrome largely because it was so much faster than IE when it was released, and Python devs are quickly migrating to uv/ruff due to the performance improvement. But when you start getting into the territory of "it takes Slack 5 seconds to start up instead of 10ms", you're getting into the realm where very few people care.
You are comparing applications with wildly different features and UI. That's neither an argument for nor against performance as an important quality metric.
How fast you can compile, start and execute some particular code matters. The experience of using a program that performs well if you use it daily matters.
Performance is not just a quantitative issue. It leaks into everything, from architecture to delivery to user experience. Bad performance has expensive secondary effects, because we introduce complexity to patch over it like horizontal scaling, caching or eventual consistency. It limits our ability to make things immediately responsive and reliable at the same time.
> You are comparing applications with wildly different features and UI. That's neither an argument for nor against performance as an important quality metric.
Disagree, the main reason so many apps are using "slow" languages/frameworks is precisely that it allows them to develop way more features way quicker than more efficient and harder languages/frameworks.
> You are comparing applications with wildly different features and UI. That's neither an argument for nor against performance as an important quality metric.
I never said performance wasn't an important quality metric, just that it's not the only quality metric. If a slow program has the features I need and a fast program doesn't, the slow program is going to be "higher quality" in my mind.
> How fast you can compile, start and execute some particular code matters. The experience of using a program that performs well if you use it daily matters.
Like any other feature, whether or not performance is important depends on the user and context. Chrome being faster than IE8 at general browsing (rendering pages, opening tabs) was very noticeable. uv/ruff being faster than pip/poetry is important because of how the tools integrate into performance-sensitive development workflows. Does Slack taking 5-10 seconds to load on startup matter? -- to me not really, because I have it come up on boot and forget about it until my next system update forced reboot. Do I use LibreOffice or Word and Excel, even though LibreOffice is faster? -- I use Word/Excel because I've run into annoying compatibility issues enough times with LO to not bother. LibreOffice could reduce their startup and file load times to 10 picoseconds and I would still use MS Office, because I just want my damn documents to keep the same formatting my colleagues using MS Office set on their Windows computers.
Now of course I would love the best of all worlds; programs to be fast and have all the functionality I want! In reality, though, companies can't afford to build every feature, performance included, and need to pick and choose what's important.
If you're being honest, compare Slack and Teams not with weechat, but with Telegram. Its desktop client (along with other clients) is written by an actually competent team that cares about performance, and it shows. They have enough money to produce a native client written in C++ that has fantastic performance and is high quality overall, but these software behemoths with budgets higher than most countries' GDP somehow never do.
This; "quality" is such an unclear term here.
In an efficient market people buy things based on a value which in the case of software, is derived from overall fitness for use. "Quality" as a raw performance metric or a bug count metric aren't relevant; the criteria is "how much money does using this product make or save me versus its competition or not using it."
In some cases there's a Market for Lemons / contract / scam / lack-of-market-transparency issue (i.e. companies selling defective software with arbitrary lock-ins and long contracts), but overall the slower or more "defective" software is often more fit for purpose than what the competition provides. If you _must_ have a feature that only a slow piece of software provides, it's still a better deal to acquire that software than not to. Likewise, if software is "janky" and contains minor bugs that don't affect the end results it produces, it will outcompete an alternative that can't produce the same results.
That's true. I meant it in a broader sense. Quality = {speed, function, lack of bugs, ergonomics, ... }.
I don't think it's necessarily a market for lemons. That involves information asymmetry.
Sometimes that happens with buggy software, but I think in general, people just want to pay less and don't mind a few bugs in the process. Compare and contrast what you'd have to charge to do a very thorough process with multiple engineers checking every line of code and many hours of rigorous QA.
I once did some software for a small book shop where I lived in Padova, and created it pretty quickly and didn't charge the guy - a friend - much. It wasn't perfect, but I fixed any problems (and there weren't many) as they came up and he was happy with the arrangement. He was patient because he knew he was getting a good deal.
I do think there is an information problem in many cases.
It is easy to get information on features. It is hard to get information on reliability or security.
The result is worsened because vendors compete on features, so they all make the same trade-off of more features for lower quality.
Some vendors even make it impossible to get information. See Oracle and Microsoft forbidding publishing benchmarks for their SQL databases.
There's likely some, although it depends on the environment. The more users of the system there are, the more there are going to be reviews and people will know that it's kind of buggy. Most people seem more interested in cost or features though, as long as they're not losing hours of work due to bugs.
I have worked for large corporations that have foisted on employees HR, expense-reporting, time-tracking, and insurance "portals" so awful I had to wonder if anyone writing the checks had ever seen the product. I brought up the point several times that if my team tried to tell a customer that we had their project all done but it was full of as many bugs and UI nightmares as these back-office platforms, I would be chastised, demoted, and/or fired.
I used to work at a large company that had a lousy internal system for doing performance evals and self-reviews. The UI was shitty, it was unreliable, it was hard to use, it had security problems, it would go down on the eve of reviews being due, etc. This all stressed me out until someone in management observed, rather pointedly, that the reason for existence of this system is that we are contractually required to have such a system because the rules for government contracts mandate it, and that there was a possibility (and he emphasized the word possibility knowingly) that the managers actually are considering their personal knowledge of your performance rather than this performative documentation when they consider your promotions and comp adjustments. It was like being hit with a zen lightning bolt: this software meets its requirements exactly, and I can stop worrying about it. From that day on I only did the most cursory self-evals with minimal accomplishments, and my career progressed just fine.
You might not think about this as “quality” but it does have the quality of meeting the perverse functional requirements of the situation.
> I had to wonder if anyone writing the checks had ever seen the product
Probably not, and that's like 90% of the issue with enterprise software. Sadly enterprise software products are often sold based mainly on how many boxes they check in the list of features sent to management, not based on the actual quality and usability of the product itself.
What you're describing is Enterprise(tm) software. Some consultancy made tens of millions of dollars building, integrating, and deploying those things. This of course was after they made tens of millions of dollars producing reports exploring how they would build, integrate, and deploy these things and all the various "phases" involved. Then they farmed all the work out to cheap coders overseas and everyone went for golf.
Meanwhile I'm a founder of startup that has gotten from zero to where it is on probably what that consultancy spends every year on catering for meetings.
If they treat it as unimportant, talk about it as if it is. It could be more polished, but do we want to impress them or just satisfy their needs?
The job it’s paid to do is satisfy regulation requirements.
Across three jobs, I have now seen three different HR systems from the same supplier which were all differently terrible.
> the market buys bug-filled, inefficient software about as well as it buys pristine software
In fact, the realization is that the market buys support.
And that includes Google and other companies that lack much in the way of human support.
This is the key.
Support is manifested in many ways:
* There is information about it (docs, videos, blogs, ...)
* There is people that help me ('look ma, this is how you use google')
* There is support for the thing I use ('OS, Browser, Formats, ...')
* And for my way of working ('Excel let me do any app there...')
* And finally, actual people (that is the #1 thing that keeps alive even the worst ERP on earth). This also includes marketing, sales people, etc. These are signals of having support, even if it's not exactly the best. If I go to an enterprise and they only have engineers, that's a bad signal, because developers tend to be terrible at the other stuff, and the other stuff is the support that matters.
If you have a good product but no support, it's dead.
And if you want to fight a worse product, it's smart to reduce the need for support (bugs, performance issues, platforms, ...) for YOUR TEAM, because you want to reduce YOUR COSTS, but you NEED to add support in other dimensions!
The easiest option for a small team is to just add humans (the MOST scarce source of support). After that, you need to get creative.
(Also, this means you need to communicate your advantages well, because people value some kinds of support more than others. "Having the code vs. proprietary" is a good example: many prefer proprietary with support over having the code, I mean.)
So you're telling me that if companies want to optimize profitability, they’d release inefficient, bug-ridden software with bad UI—forcing customers to pay for support, extended help, and bug fixes?
Suddenly, everything in this crazy world is starting to make sense.
Afaik, SAS does exactly that (I have no personal experience with them, just retelling gossip). Also Matlab. Not that they are BAD; it's just that 95% of Matlab code could be Python or even Fortran with less effort. But Matlab has really good support (aka telling the people in charge how they are tailored to solve this exact problem).
Suddenly, Microsoft makes perfect sense!
This really focuses on the single metric that can be used throughout the lifetime of a product … a really good point that keeps unfolding.
Starting an OSS product? Write good docs. Got a few enterprise people interested? A "customer success person" is the most important marketing you can do …
Even if end-users had the data to reasonably tie-break on software quality and performance, as I scroll my list of open applications not a single one of them can be swapped out with another just because it were more performant.
For example: Docker, iterm2, WhatsApp, Notes.app, Postico, Cursor, Calibre.
I'm using all of these for specific reasons, not for reasons so trivial that I can just use the best-performing solution in each niche.
So it seems obviously true that it's more important that software exists to fill my needs in the first place than it pass some performance bar.
I’m surprised in your list because it contains 3 apps that I’ve replaced specifically due to performance issues (docker, iterm and notes). I don’t consider myself particularly performance sensitive (at home) either. So it might be true that the world is even _less_ likely to pay for resource efficiency than we think.
What did you replace Docker with?
Podman might have some limited API compatibility, but it's a completely different tool. Just off the bat it's not compatible with Skaffold, apparently.
That an alternate tool might perform better is compatible with the claim that performance alone is never the only difference between software.
Podman might be faster than Docker, but since it's a different tool, migrating to it would involve figuring out any number of breakage in my toolchain that doesn't feel worth it to me since performance isn't the only thing that matters.
Except you’ve already swapped terminal for iterm, and orbstack already exists in part because docker left so much room for improvement, especially on the perf front.
I swapped Terminal for iTerm2 because I wanted specific features, not because of performance. iTerm2 is probably slower for all I care.
Another example is that I use oh-my-zsh, which adds a weirdly long startup time to a shell session, but it lets me use plugins that add things like git status and kubectl context to my prompt instead of fiddling with that myself.
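For what it's worth, you can put a number on that startup cost. A rough sketch (bash shown here because it's everywhere; swap in `zsh -i` to measure an oh-my-zsh setup):

```shell
# Time a full interactive startup (rc files loaded) vs. a bare one.
# The difference is roughly what your prompt plugins cost per new shell.
time bash -i -c exit
time bash --norc --noprofile -c exit
```

On a plugin-heavy config the first number is often an order of magnitude larger than the second, which is exactly the trade the comment describes: startup time exchanged for convenience.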
> But IC1-3s write 99% of software, and the 1 QA guy in 99% of tech companies
I'd take this one step further: 99% of software isn't written with performance in mind. Even here on HN, you'll find people who advocate for poor performance, because even considering performance has become a faux pas.
That means your L4/5-and-beyond engineers are fairly unlikely to have any sort of sense when it comes to performance. Businesses do not prioritize efficient software until their current hardware is incapable of running their current software (and even then, they'll prefer to buy more hardware if possible).
User tolerance has changed as well, because of the web 2.0 "perpetual beta" and SaaS replacing other distribution models.
Also, Microsoft has now educated several generations to accept that software fails and crashes.
Because "all software is the same," customers may not appreciate good software when they're used to living with bad software.
Is this really tolerance and not just monopolistic companies abusing their market position? I mean workers can't even choose what software they're allowed to use, those choices are made by the executive/management class.
> The buyer cannot differentiate between high and low-quality goods before buying, so the demand for high and low-quality goods is artificially even. The cause is asymmetric information.
That's where FOSS or even proprietary "shared source" wins. You know if the software you depend on is generally badly or generally well programmed. You may not be able to find the bugs, but you can see how long the functions are, the comments, and how things are named. YMMV, but conscientiousness is a pretty great signal of quality; you're at least confident that their code is clean enough that they can find the bugs.
Basically the opposite of the feeling I get when I look at the db schemas of proprietary stuff that we've paid an enormous amount for.
IME, the problem is that FOSS consumer facing software is just about the worst in UX and design.
Technically correct, since you know it's bad because it's FOSS.
At least when talking about software that has any real world use case, and not development for developments sake.
The used car market is market for lemons because it is difficult to distinguish between a car that has been well maintained and a car close to breaking down. However, the new car market is decidedly not a market for lemons because every car sold is tested by the state, and reviewed by magazines and such. You know exactly what you are buying.
Software is always sold new. Software can increase in quality the same way cars have generally increased in quality over the decades. Creating standards that software must meet before it can be sold. Recalling software that has serious bugs in it. Punishing companies that knowingly sell shoddy software. This is not some deep insight. This is how every other industry operates.
A hallmark of well-designed and well-written software is that it is easy to replace, where bug-ridden spaghetti-bowl monoliths stick around forever because nobody wants to touch them.
Just through pure Darwinism, bad software dominates the population :)
That's sorta the premise of the tweet, though.
Right now, the market buys bug-filled, inefficient software because you can always count on being able to buy hardware that is good enough to run it. The software expands to fill the processing specs of the machine it is running on - "What Andy giveth, Bill taketh away" [1]. So there is no economic incentive to produce leaner, higher-quality software that does only the core functionality and does it well.
But imagine a world where you suddenly cannot get top-of-the-line chips anymore. Maybe China invaded Taiwan and blockaded the whole island, or WW3 broke out and all the modern fabs were bombed, or the POTUS instituted 500% tariffs on all electronics. Regardless of cause, you're now reduced to salvaging microchips from key fobs and toaster ovens and pregnancy tests [2] to fulfill your computing needs. In this world, there is quite a lot of economic value to being able to write tight, resource-constrained software, because the bloated stuff simply won't run anymore.
Carmack is saying that in this scenario, we would be fine (after an initial period of adjustment), because there is enough headroom in optimizing our existing software that we can make things work on orders-of-magnitude less powerful chips.
[1] https://en.wikipedia.org/wiki/Andy_and_Bill%27s_law
[2] https://www.popularmechanics.com/science/a33957256/this-prog...
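A toy illustration (mine, not Carmack's) of the kind of headroom that hides in everyday code: the same deduplication task written naively and then with an appropriate data structure, differing by orders of magnitude at scale.

```python
import time

def dedupe_slow(items):
    # Quadratic: every element scans the growing result list.
    out = []
    for x in items:
        if x not in out:
            out.append(x)
    return out

def dedupe_fast(items):
    # Linear: a set makes each membership check O(1).
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

data = list(range(5_000)) * 2

t0 = time.perf_counter(); slow = dedupe_slow(data); t_slow = time.perf_counter() - t0
t0 = time.perf_counter(); fast = dedupe_fast(data); t_fast = time.perf_counter() - t0

assert slow == fast
print(f"slow: {t_slow:.3f}s  fast: {t_fast:.4f}s  ~{t_slow / t_fast:.0f}x speedup")
```

The gap widens with input size, and real applications stack many such choices on top of each other, which is why "orders of magnitude" is a plausible claim rather than hyperbole.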
I have that washing machine btw. I saw the AI branding and had a chuckle. I bought it anyway because it was reasonably priced (the washer was $750 at Costco).
In my case I bought it because LG makes appliances that fit under the counter if you don't have much space.
The AI BS bothered me, but the price was good and the machine works fine.
I worked in a previous job on a product with 'AI' in the name. It was a source of amusement to many of us working there that the product didn't, and still doesn't use any AI.
> The AI label itself commands a price premium.
In the minds of some CEOs and VCs, maybe.
As for consumers, the AI label is increasingly off-putting.[0]
[0] https://www.tandfonline.com/doi/full/10.1080/19368623.2024.2...
> Smart Laundry with LG's AI Washing Machines: Efficient Spin Cycles & Beyond
Finally, the perfect example of AI-washing.
Reminds me of "The Washing Machine Tragedy" by Stanisław Lem, a short story that may be a perfect parable of today's AI bubble.
You must be referring only to security bugs because you would quickly toss Excel or Photoshop if it were filled with performance and other bugs. Security bugs are a different story because users don't feel the consequences of the problem until they get hacked and even then, they don't know how they got hacked. There are no incentives for developers to actually care.
Developers do care about performance up to a point. If the software looks to be running fine on a majority of computers why continue to spend resources to optimize further? Principle of diminishing returns.
I wouldn't be so sure. People will rename genes to work around Excel bugs.
A big part of why I like shopping at Costco is that they generally don't sell garbage. Their filter doesn't always match mine, but they do have a meaningful filter.
> This is already true and will become increasingly more true for AI. The user cannot differentiate between sophisticated machine learning applications and a washing machine spin cycle calling itself AI.
The user cannot, but a good AI might itself allow the average user to bridge the information asymmetry. So as long as we have a way to select a good AI assistant for ourselves...
> The user cannot but a good AI might itself allow the average user to bridge the information asymmetry. So as long as we have a way to select a good AI assistant for ourselves...
In the end it all hinges on the user's ability to assess the quality of the product. Otherwise, the user cannot judge whether an assistant recommends quality products, and the assistant has an incentive to recommend poorly (e.g. selling out to product producers).
> In the end it all hinges on the users ability to assess the quality of the product
The AI can use tools to extract various key metrics from the product that is analysed. Even if we limit such metrics down to those that can be verified in various "dumb" ways we should be able to verify products much further than today.
> The AI label itself commands a price premium.
These days I feel like I'd be willing to pay more for a product that explicitly disavowed AI. I mean, that's vulnerable to the same kind of marketing shenanigans, but still. :-)
Ha! You're totally right.
That's generally what I think as well. Yes, the world could run on older hardware, but we keep making faster chips and adding more CPUs, so why bother making the code more efficient?
The argument that a buyer can't verify quality is simply false, especially if the cost of something is large. And for lots of things, such verification isn't that hard.
The Market for Lemons story is about a complex thing that most people don't understand and that is of too low value to be worth verifying. And even that paper misses many real-world solutions people have found for this.
> The user cannot differentiate between sophisticated machine learning applications and a washing machine spin cycle calling itself AI.
Why then did people pay for ChatGPT when Google claimed it had something better? Because people quickly figured out that Google's solution wasn't better.
It's easy to share results, and it's easy to look up benchmarks.
The claim that anything labeled AI will automatically command some absurd price is simply not true. At best people just slap "AI" on everything, and it's as if everybody stands up in a theater: nobody is better off.
Bad software is not cheaper to make (or maintain) in the long-term.
There are many exceptions.
1. Sometimes speed = money. Being the first to market, meeting VC-set milestones for additional funding, and not running out of runway are all things cheaper than the alternatives. Software maintenance costs later don't come close to opportunity costs if a company/project fails.
2. Most software is disposable. It's made to be sold, and the code repo gets chucked into a .zip on some corporate drive. There is no post-launch support, and the software's performance after launch is irrelevant to the business. They'll never touch the codebase again. There is no "long-term" for maintenance. They may harm their reputation, but that depends on whether their clients can talk to each other. If they have business or govt clients, they don't care.
3. The average tenure in tech companies is under 3 years, so most people involved in software can consider maintenance "someone else's problem." It's like how the housing stock is in bad shape in some countries (like the UK) because the average tenure is less than 10 years: there isn't a person in the property's ownership history for whom an investment in long-term property maintenance would have yielded any return. So now the property is dilapidated. And this is becoming a real nationwide problem.
4. Capable SWEs cost a lot more money. And if you hire an incapable IC who will attempt to future-proof the software, maintenance costs (and even onboarding costs) can balloon much more than some inefficient KISS code.
5. It only takes 1 bad engineering manager in the whole history of a particular piece of commercial software to ruin its quality, wiping out all previous efforts to maintain it well. If someone buys a second-hand car and smashes it into a tree hours later, was keeping the car pristinely maintained for that moment (by all the previous owners) worth it?
And so forth. What you say is true in some cases (esp where a company and its employees act in good faith) but not in many others.
The other factor here is that in the number-go-up world many US tech firms operate in, your company has to keep growing to be considered successful. As long as it is growing, future engineering time will always be cheaper than current engineering time; and should you stop growing, you are done for anyway and won't need those future engineers.
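The growth argument above can be made concrete with a toy calculation. This is an illustrative sketch, not anything from the thread: the team size, growth rate, and task size are all assumed numbers, chosen only to show how a fixed-size maintenance task shrinks as a share of a growing team's capacity.

```python
# Toy sketch (all numbers are assumptions): if headcount grows with the
# company, a fixed-size maintenance task consumes a shrinking share of
# each year's engineering capacity -- the sense in which "future engineer
# time is cheaper than current engineering time".

def maintenance_share(task_weeks: float, team_size: float,
                      growth: float, years: int) -> float:
    """Fraction of one year's team capacity a task costs after `years`
    of headcount growth at rate `growth` (0.4 = 40%/year)."""
    capacity = team_size * (1 + growth) ** years * 52  # engineer-weeks/year
    return task_weeks / capacity

# A 26 engineer-week cleanup at a 10-person startup growing 40%/year:
now = maintenance_share(26, 10, 0.40, 0)    # 5% of this year's capacity
later = maintenance_share(26, 10, 0.40, 3)  # under 2% three years later
```

Deferring the work gets relatively cheaper every year the growth holds, which is exactly why the bet collapses the moment growth stops.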
Thanks for the insightful counter-argument.
What does "make in the long-term" even mean? How do you make a sandwich in the long-term?
Bad things are cheaper and easier to make. If they weren't, people would always make good things. You might say "work smarter," but smarter people cost more money. If smarter people didn't cost more money, everyone would always have the smartest people.
"In the long run, we are all dead." -- Keynes
In my experience, companies can afford to care about good software only when they face extreme demands (e.g. military, finance) or can amortize quality over very long timeframes (e.g. privately owned firms). It's rare for consumer products to fall into either category.
That’s true, but finding good engineers who know how to do it is more expensive, at least in direct expenditure.
Maybe not, but that still leaves the question of who ends up bearing the actual costs of the bad software.
Therefore, brands as guardians of quality.
I actually disagree a bit. Sloppy software is cheap when you're a startup, but it's quite expensive when you're big: you have to account for all the transmission costs and extra instances. If airlines will cut an olive from the salad to save money, why wouldn't we pay programmers to optimize? This stuff compounds, too.
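The olive comparison lends itself to back-of-envelope arithmetic. The sketch below is purely illustrative: the payload sizes, the traffic level, and the $0.05/GB egress price are all assumptions, not figures from the comment.

```python
# Back-of-envelope sketch: small per-request savings compound at scale.
# Payload sizes, traffic volume, and the $0.05/GB egress price are assumptions.

def annual_egress_cost(bytes_per_request: float, requests_per_day: float,
                       usd_per_gb: float = 0.05) -> float:
    """Yearly bandwidth cost of serving one payload at a given traffic level."""
    gb_per_year = bytes_per_request * requests_per_day * 365 / 1e9
    return gb_per_year * usd_per_gb

# Trimming 50 KB from a response served 10 million times a day:
saved = (annual_egress_cost(200_000, 10_000_000)
         - annual_egress_cost(150_000, 10_000_000))  # ≈ $9,125/year
```

One modest optimization pays for itself many times over at scale, and a codebase accumulates dozens of such opportunities.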
We currently operate in a world where new features are pushed that don't interest consumers. While people can't tell the difference between slop and quality at purchase, they sure can between updates. People constantly complain about stuff getting slower, but they also get excited when things get faster.
Imo it's in part because we turned engineers into MBAs. Whenever I ask why we can't solve a problem, some engineer responds "well, it's not that valuable." The bug fix is valuable to the user, but they always clarify that they mean money. Let's be honest: all those valuations are made up. It's not the engineer's job to figure out how much profit a bug fix will generate; it's their job to fix bugs.
Famously, Coke doesn't advertise to make you aware of Coke; it advertises to associate good feelings with the brand. Similarly, car companies advertise to associate their cars with class, which is why they sometimes advertise to people who have no chance of buying the car. What I'm saying is that brand matters. The problem right now is that all the major brands have decided brand doesn't matter, or that brand perceptions are set in stone. Maybe they're right: how often do people switch? But maybe they're wrong; switching seems to offer the same features with a new UI you have to learn from scratch (yes, even Apple devices aren't intuitive).
The thing is, to continue your food analogy: countries have set down legal rules preventing the sale of food that actively harms the consumer (expired, known to be poisonous, laced with addictive substances like opiates, etc).
In software, the pre-GDPR regulations could be boiled down to "lol lmao", and even now I see GDPR violations daily.