
Comment by pgroves

16 hours ago

This is sort of why I think software development might be the only real application of LLMs outside of entertainment. We can build ourselves tight little feedback loops that other domains can't. I somewhat frequently agree on a plan with an LLM, and a few minutes or hours later find out it doesn't work, and then the LLM is like "that's why we shouldn't have done it like that!". Imagine building a house from scratch and finding out the LLM was using some American websites to spec out your electrical system, and not noticing the problem until you're installing your Canadian dishwasher.

I don't understand why the experience you describe would lead you to conclude that LLMs might be useful for software development.

The response "that's why we shouldn't have done it like that!" sounds like a variation on the usual "You're absolutely right! I apologize for any confusion". Why would we want to get stuck in a loop where an AI produces loads of absolute nonsense for us to painstakingly debug and debunk, after which the AI switches track to some different nonsense, which we again have to debug and debunk, and so on? That doesn't sound like a good loop.

> Imagine building a house from scratch

That's why those engineering fields have strict rules, often require formal education, and why someone can even end up in prison if they screw up badly enough.

Software is so much easier and safer; until very recently anonymous engineering was the norm, and people are very annoyed with Apple pushing for signing off the resulting product.

Highly paid software engineers across the board must have been an anomaly that is ending now. Maybe in the future only those who code genuinely novel solutions or high-risk software will be paid very well - just like engineers in other fields.

  • > people are very annoyed with Apple pushing for signing off the resulting product.

    Apple is very much welcome to push for signing off on software that appears on their own store. That is nothing new.

    What people are annoyed about is Apple insisting that you can only use their store, a restriction that has nothing to do with safety or quality and everything to do with the stupendous amounts of money they make from it.

    • It's literally a case of Apple requiring the binary to be signed to run on the platforms they provide; Apple doesn't have a say on other platforms. It is a very similar situation with local governments.

      Also, people complain all the time about rules and regulations for making stuff. Especially in the EU, you can't just create products however you like and let people decide if they're safe to use: you are required to make your products meet certain criteria, avoid using certain chemicals and methods, certify certain things, and you can't be anonymous. If you are making and selling cupcakes, for example, and something goes wrong, you will be held responsible. And not only when things go wrong: often local governments will do inspections before letting you start making the cupcakes, and every now and then they can check on you.

      Software appears to be headed in that direction. Of course, due to the nature of software it probably wouldn't be exactly like that, but IMHO it is very likely that at least having someone responsible for the things a piece of software does will become the norm.

      Maybe in the future, if your software leaks sensitive information for example, you may end up being investigated and fined for not following best practices as determined by some institute, etc.


  • Software developers being paid well is a result of demand, not because it's very hard.

    The skill and strictness required are only vaguely related to pay; if there are enough people for the job, it won't pay amazingly, regardless of how hard it is.

    > Software is so much easier and safer; until very recently anonymous engineering was the norm, and people are very annoyed with Apple pushing for signing off the resulting product.

    That has nothing to do with engineering quality; it's just to make it harder to go around their ecosystem (and skip paying the store fee), with the additional benefit that a signed package is harder to attack. You can still deliver absolute slop, but the slop will be from you, not from a middleman that captured the delivery process.

Thinking about this some more, maybe I wasn't considering simulators (aka digital twins), which are supposed to be able to create fairly reliable feedback loops without building things in reality. E.g., will this plane design be able to take off? Still, I feel fortunate I only have to write unit tests to get a bit of contact with reality.

> This is sort of why I think software development might be the only real application of LLMs outside of entertainment.

Wow. What about also, I don't know, self-teaching*? In general, you have to be very arrogant to say that you've experienced all the "real" applications.

* - For instance, today and yesterday, I've been using LLMs to teach myself about RLC circuits and "inerters".

  • I would absolutely not trust an LLM to teach me anything on its own. I've had it introduce ideas I hadn't heard of, which I then looked up in actual sources to confirm they were valid solutions. Daily usage has shown it will happily lead you down the wrong path, and usually the only way to know it is the wrong path is if you already knew what the solution should be.

    LLMs MAY be a version of office hours or asking the TA, if you only have the book and no actual teacher. I have seen nothing that convinces me they are anything more than the latest version of the hammer in our toolbox. Not every problem is a nail.

  • Self-teaching pretty much doesn't work. For many decades now, the barrier has not been access to information, it's been the "self" part. Turns out most people need a regimen, accountability, and strictness, which AI just doesn't solve because it's a yes-man.

    • > Self-teaching pretty much doesn't work. For many decades now, the barrier has not been access to information, it's been the "self" part.

      That’s completely bogus. And LLMs are yes-men by default; nothing stops you from overriding the initial settings.

  • Why would you think that a machine known to cheerfully and confidently assert complete bullshit is suitable to learn from?

  • It’s somewhat delusional and potentially dangerous to assume that chatting with an LLM about a specific topic is self-teaching beyond the most surface-level understanding. No doubt you can learn some true things, but you’ll also learn some blatant falsehoods and a lot of incorrect theory. And you won’t know which is which.

    One of the most important factors in actually learning something is humility. Unfortunately, LLM chatbots are designed to discourage this in their users. So many people think they’re experts because they asked a chatbot. They aren’t.

    • I think everything you said was true 1-2 years ago. But the current LLMs are very good about citing their sources, and hallucinations are exceedingly rare. Gemini, for example, frequently directs you to a website or video that backs up its answer.

    • > It’s somewhat delusional and potentially dangerous to assume that chatting with an LLM about a specific topic is self-teaching beyond the most surface-level understanding

      It's delusional and very arrogant of you to confidently assert anything without proof: a topic like RLC circuits has a body of rigorous theorems and proofs underlying it*, and nothing stops you from piecing it together using an LLM.

      * - See "Positive-Real Functions", "Schwarz-Pick Theorem", "Schur Class". These are things I've been mulling over.
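
      (For anyone curious, a minimal sketch of the first of those terms, using the standard network-synthesis notation Z(s) for a driving-point impedance - a symbol that is my own shorthand here, not the commenter's. A function Z is called positive-real when

          \[
            \begin{cases}
              Z(s)\ \text{is analytic for}\ \operatorname{Re} s > 0,\\
              Z(s) \in \mathbb{R}\ \text{for}\ s \in (0,\infty),\\
              \operatorname{Re} Z(s) \ge 0\ \text{for}\ \operatorname{Re} s > 0,
            \end{cases}
          \]

      and Brune's classical theorem says the rational functions with this property are exactly the driving-point impedances of passive networks built from resistors, inductors, capacitors, and ideal transformers - which is the sense in which the theory is rigorous.)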

It's more like you're installing the dishwasher and the dishwasher itself yells at you "I told you so" ;)

  • I think of it as: you say "install dishwasher" and its plan looks like it has all the steps, but as it builds it out, somehow you end up hiring a maid and buying a drying rack.