Comment by sassymuffinz

6 days ago

So they did the math and worked out it's cheaper and easier to lobby the government instead of working to make their product safe.

And these are the people that a lot of programmers want to give the keys to the kingdom. Idiocracy really is in full effect.

> instead of working to make their product safe

Make a nondeterministic product safe how?

  • I'm creating a new startup called QuantumFlop Electricity - there's a 10% chance it will cause a black hole to open up in the Atlantic Ocean that may eventually consume us all, but a 50% chance we'll have unlimited clean energy. We'll never know for sure whether at some point that black hole may open, as it's borrowing energy from the 81st dimension, but the upside seems pretty good.

    Should I be able to get on with it?

    • Funny, I was just rereading the Hyperion series. It says there clearly that it was the AIs that created the black hole that led to the destruction of Old Earth. Intentionally.

      2 replies →

  • Is this the first time you have heard of AI safety?

    There are lots of articles you could read on the subject to answer your own question.

    (Unless your angle is: akshually, you can never make anything 100% safe)

    • > akshually, you can never make anything 100% safe

      Yes, Sherlock. Especially a natural-language product that can't even output the same thing twice for unchanged input.

      Besides, when you say "safe", I think of the idiots at Anthropic deleting "the hell" when I pasted a string into Claude and asked "what the hell are those unprintable characters at the beginning and end"...

      How many correct answers did they suppress in their quest to make their chatbot "family friendly"?

      1 reply →

  • What exactly are you implying? It sounds to me like you're saying that if it's impossible to make a product safe, then there shouldn't be any safety requirements. A more sensible position is that if it's impossible to make a product safe, then it should be illegal to build it.