Comment by halfcat

19 hours ago

> If we do not specify formally what we expect a tool to do, how do we know whether the tool has done what we expected, including in edge cases?

You don’t. That’s the scary part. Until now, this was somewhat solved by injecting artificial friction: a bank that takes 5 days to clear a payment, and so on.

But it’s worse than that, because most problems software solves can’t even be understood until you’ve partially solved them. It’s the trying and failing that reveals the gap, and it’s usually recognized only by someone who was once embarrassed by it and hears something that rhymes with their pain. AI doesn’t interface with physical reality, as far as we know, or have any mechanism like embarrassment or pain to course-correct with.

In the future, we will have flown off the cliff before we even know there was a problem. We will be on a spaceship going so fast that we can’t see the asteroid until it’s too late.