Comment by simonw

17 hours ago

This is one of the reasons I'm so interested in sandboxing. A great way to reduce the need for review is to have ways of running code that limit the blast radius if the code is bad. Running code in a sandbox can mean the worst that can happen is a bad output, as opposed to a memory leak, a security hole, or worse.
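As an aside, the "limit the blast radius" idea can be sketched in a few lines. This is a minimal, hypothetical illustration (the helper name `run_sandboxed` is invented, and a real sandbox would also need filesystem, network, and memory isolation via containers, seccomp, WebAssembly, or similar); it only shows the shape of the idea: untrusted code runs in a separate process with a timeout, so the caller can only ever receive a string back.

```python
import subprocess
import sys

def run_sandboxed(code: str, timeout: float = 5.0) -> str:
    """Run untrusted Python code in a child process with a wall-clock
    timeout and captured stdio, so a buggy or malicious script can hang
    or crash the child, but the caller only ever sees an output string."""
    try:
        result = subprocess.run(
            # -I puts the child interpreter in isolated mode (ignores
            # environment variables and the user site directory).
            [sys.executable, "-I", "-c", code],
            capture_output=True,
            text=True,
            timeout=timeout,
        )
        return result.stdout
    except subprocess.TimeoutExpired:
        return "<timed out>"

print(run_sandboxed("print(2 + 2)"))          # prints "4"
print(run_sandboxed("while True: pass", 1.0)) # prints "<timed out>"
```

In this framing, the worst case really is "a bad output": an infinite loop or a crash becomes a sentinel string rather than a hung or corrupted host process.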

Isn’t “bad output” already the worst case? Pre-LLMs, correct output was table stakes.

You expect your calculator to always give correct answers, your bank to always transfer your money correctly, and so on.

  • > Isn’t “bad output” already worst case?

    Worst case in a modern agentic scenario is more like "drained your bank account to buy bitcoin and then deleted your hard drive along with the private key"

    > Pre-LLMs correct output was table stakes

    We're only just getting to the point where we have languages and tooling that can reliably prevent segfaults. Correctness isn't even on the table, outside of a few (mostly academic) contexts.

    • > Worst case in a modern agentic scenario is more like "drained your bank account to buy bitcoin and then deleted your hard drive along with the private key"

      Hence my interest in sandboxes!

    • > drained your bank account to buy bitcoin and then deleted your hard drive

      This is what I meant by correct output: the software does what you expect it to.

      > We're only just getting to the point where we have languages and tooling that can reliably prevent segfaults

      This is not really an output issue IMO. This is a failing edge case.

      LLMs are moving the industry away from trying to write software that handles all possible edge cases gracefully and towards software developed very quickly that behaves correctly on the happy paths more often than not.

  • I've seen plenty of decision makers act on bad output from human employees in the past. The company usually survives.

And what if the bad output leads a decision maker to a bad decision that takes down your company or kills your relative?