
Comment by doesnt_know

6 days ago

The fact that your friend is suffering no consequences and can just carry on is exactly what is wrong with this industry.

In a perfect world the creation of software would have been locked down like other engineering fields, with developers and companies suffering legal consequences for exposing customer information.

The 80s and 90s devs who built our current software infra were, on average, FAR less credentialed than today's juniors and mids, who mostly don't understand what they're building on.

  • Sure, and Da Vinci didn't have an architectural degree when he was designing bridges, but now you need a proper license to do so. Society learns to do better.

  • The difference is we didn’t know any better back then. We do now.

    • Surprisingly, '80s and '90s developers were quite skilled low-level developers who knew very well all the ways things could go wrong. The difference was that the stakes were not high then. The blast radius was maybe a hundred thousand people, and the worst case was that they lost their own files. Now some AI-controlled process or apparatus could ruin everyone's credit and maybe even kill you and all your neighbors.

      1 reply →

In our imperfect world, by the time the government could put together a reasonable certification process, the content you're tested on would be out of date. Maybe when the industry is older it'll change slowly enough for that to work, but I don't think that'll happen so long as there's so much money aimed at disrupting everything and monetising the disruption.

We're going in circles far too fast to have licensure that hinges on being up to date.

  • That's what tort law is for. It leaves the details to the experts, and judges based on general notions of intent, negligence, and harm caused. The threat of financial ruin should incentivize against selling malware.

Let's say it was coded extremely well, but nevertheless a more advanced exploiter wreaked similar havoc. Would they still be liable in your perfect world? To some degree the principle of caveat emptor should apply to a tiny, nascent business; otherwise only large, monopolistic juggernaut incumbents would have the means to have any stake in software.

  • > Let's say it was coded extremely well, but nevertheless a more advanced exploiter wreaked similar havoc.

    A doctor kills a patient because of malpractice. Could that patient have died anyway if their condition had been more critical?

    That is a non sequitur argument.

    > Would they still be liable in your perfect world?

    Yes. The doctor would be liable because they did not meet the minimum quality criteria. In the same way, the developer is liable for not taking any risks into account and providing a deeply flawed product.

    It is impossible in practice to protect software from all possible attacks, as there are attackers with very deep pockets. That does not mean that all security should be scrapped.

    • Yes, the parent is arguing something like: what if medical licensing protects the juggernaut hospitals at the expense of the street-corner quack?

      1 reply →

  • Imagine these two scenarios:

    Your spouse dies in surgery. The highly experienced surgeon made a mistake, because, realistically, everyone makes mistakes sometimes.

    Your spouse dies in surgery. The hospital handed a passing five-year-old a scalpel to see what would happen.

    There's a clear difference; neither is _great_, but someone's probably going to jail for the second one.

    In real, regulated professions, no-one's expecting absolute perfection, but you're not allowed to be negligent. Of course, 'software engineer' is (generally) _not_ a real, regulated profession. And vibe-coding idiot 'founder' certainly isn't.

    • There is a word for this: negligence. We need to start treating these failures to secure user data as criminal negligence.

  • That's always the double-edged sword with regulation, but sooner or later people will demand it, or much more of it.

I don't remember the specifics well, but under GDPR they'd be required to notify customers of the breach, maybe write a report and get audited, and possibly get fined, depending on the situation. Customers could demand compensation (which probably doesn't make sense here).

Right. Because the solution to all of this madness is SOC2 compliance or something along those lines.

What happened is a perfect example of natural selection. The friend is a very small actor with probably a dozen customers, not a multi-billion-dollar company with millions of customers.

Well, his customers got a refund; that's nice ;)

But I guess the lesson is to vibe code to test the market while budgeting for a real developer upfront, and then hiring one as soon as the product gets traction.

Imagine vibe coding spreads to civil engineering and people start building bridges this way. Have AI design it and then probably 3D print it on location.

> legal consequences for exposing customer information.

Still a good idea, even without taking vibe coding into account. Far too many tech companies are way too sloppy with customer data. Often intentionally so.

In that world we’d just be transitioning to 32-bit software and still running MS-DOS since it’s certified. Linux would never ever have broken through. Who can trust code developed by open source cowboys? Have we verified all their credentials?

There are some industries where the massive cost of this type of lockdown — probably innovation at 1/10th the speed and 100x the cost — is needed. Medicine comes to mind. It’s different from software in two ways. One is that the stakes are higher from a human point of view, but the more significant difference is that ordinary users of medicine are usually not competent to judge its efficacy (hence all the quackery). It has an extreme case of the ignorant-customer problem, making it hard for the market to work. The users of software usually can see whether it’s working.

  • You, of course, say that like it's a bad thing.

    I'll say video games would certainly be worse.

    I don't know if we'd be worse off with a lot of other software and/or public internet sites at the level of 20 to 30 years ago. A lot of people are unhappy with the state of modern consumer software, ad surveillance, etc.

    Probably a lot less identity theft and credit card/banking fraud.

    For social media, it depends on whether that "regulate things to ensure safety" attitude extends to things like abuse/threats/unsolicited gore or nudes/etc. And advertising surveillance. Would ad tracking be rejected because the device and platform should not be allowed to share all that fingerprinting data in the first place, or would it just be "you can track if you check all the data protection boxes", which is not really that much better?

    I'm sure someone would've spent the time to produce certified Linux versions by now; "Linux with support" has been a business model for decades, and if the alternative is pay MS, pay someone else, or write your own from scratch, there's room in the market.

    (Somewhere out there there's another counterfactual world where medicine is less regulated and the survivors who haven't been victimized by the resulting problems are talking about how "in that other world we'd still be getting hip replacement surgery instead of regrowing things with gene therapy" or somesuch...)

    • A lot of the things people are upset about are not related to this issue and not something licensing engineers would fix. They're products of things like market incentives.

      What you're really talking about when you talk about "locking down the field" is skipping or suppressing the PC revolution. That would make things like opaqueness and surveillance worse, not better. There would be nothing but SaaS and dumb terminals at the endpoint and no large base of autodidact hacker types to understand what is happening.

      I have wondered if medicine wouldn't be a lot more advanced without regulation, but I tend to think no. I think we have the A/B test there. There are many countries with little medical regulation (or where it is sparsely enforced), and they do not export sci-fi transhumanist medical tech. They are at best no better than what more regulated domains have. Like I said, I think many things about medicine are very different from software. They're very different industries with very different incentives, problem-domain characteristics, and ethical constraints. The biggest difference, other than ethics, is that autodidacticism is easy in software and almost impossible in medicine, for deep, complicated practical as well as ethical reasons.

      For software we do have the A/B test. More conservative software markets and jurisdictions are vastly slower than less conservative ones.

  • I think you're proving their point. There are different kinds of software that require different kinds of regulation.

  • I disagree. Mature open source projects last long enough without significant disruption to still be relevant after they make it onto the certification exam. Products, not so much.

    Investing time building familiarity with proprietary software is already a dubious move for a lot of other reasons, but this would be just one more: why would I build a curriculum around something that I'm just going to have to change next year when the new CEO does something crazy?

    And as bad as it might be for many of us who hang out here, killing off proprietary software would be a great step forward.

    • You're assuming the process would not be instantly subjected to regulatory capture by for-profit companies and by universities with an interest in inserting themselves into the required licensure pipeline.

      Microsoft in the 1990s would have used the regulatory and licensure process to shut down open source. They tried to do it with bullshit lawsuits.

  • A reliable, un-bloated OS? Sign me the eff up.

    • Go check out VxWorks or the like: only $20K a seat, build tools at a similar price, and then, oh joy, runtime licenses required to deploy the software you wrote.

      Which are reasonable prices when lives are at risk.

      Yes, I know an RTOS is not general-purpose and this is NOT apples to apples, but that is what that kind of reliability, testing, safety certification, etc. costs.

      1 reply →

...and this is where compliance comes in, and it's the exact reason real companies won't talk to you unless you have (at minimum) SOC2. There are billions of products out there; how do you know whether it's actually good software developed by a team, or some idiot like the one above vibe-coding slop into what appears to be a functional application? We all make fun of audits and checklist-based security, but they would almost certainly have prevented the above from happening.

how dare you stifle innovation with your communist laws, I thought this was America

People really want to bring down the growth of the USA's software industry to EU level.

  • Letting another country be the wild west and then cherry-picking the good stuff while regulating the nasty stuff doesn't seem like a terrible place to be for the, what, 99% of people who aren't Silicon-Valley-bigtech-execs-and-engineers getting all those profits?

    Even in the US, most software jobs are lower-scale and lower-ROI than those at a company that can serve hundreds of millions of users from one central organization.

    But for the engineers/investors in other countries... I think the EU, etc., would do well to put up more barriers for those companies, to force the creation of local alternatives in those super-high-ROI areas - that would drive a lot of high-profit job and investment growth, which would lead to more of that SV-style risk-taking ecosystem. Just because one company is now able, through technology, to serve everyone in the world doesn't mean that it's economically ideal for most of the world.

  • > People really want to bring down the growth of the USA's software industry to EU level.

    The EU is the only place hiring software engineers right now. Everyone in the U.S. just keeps laying them off.

    • That hiring is by US companies moving at US speeds, whose growth greatly eclipses that of EU companies, which is the point the OP was making.

      2 replies →

    • Some of that is US companies hiring in the EU because the salaries are lower. Source: I know of multiple companies, even on the smaller side, doing this.

      1 reply →

  • It's a textbook case study of market failure in neoclassical economics caused by information asymmetry. If customers knew about the vulnerabilities, they wouldn't have paid money, or they would have demanded a lower price.