Comment by AnthonyMouse

13 hours ago

> If they were actually well trained on what was really bad, it would probably be a lot harder to unlearn.

That's not really how training works.

Here's the general problem. Stipulate that Ukraine is good and Russia is bad. Now suppose you want the model to help you do something; it doesn't even matter what. If you're Ukrainian it should help you, and if you're Russian it shouldn't. But the answer that helps you do it doesn't depend on which one you are, and the model has no way of knowing which one you are.

This is why alignment is nonsense. Technical questions only have accurate answers, not moral ones, and we don't even have a consistent set of morals to imbue it with to begin with.

Doesn't it make sense that there are some technical questions that are dangerous to supply an answer to? Treating some topics as taboo is possible.

Responsible information dissemination is important for maintaining public safety. You could argue about what is safe and what is not, but it doesn't make sense to throw out the whole concept of safety just because those decisions are too hard to agree on.

  • If you want safety you can opt in, like Google does with SafeSearch.

    Generally, hiding information and deciding who may access it in the name of public safety has never worked in the history of humankind, and it has always eventually morphed into control of those without access.

  • We know that the people who are making those decisions, the ones at the very top, are incompetent at best, and malicious at worst.

    Given that, I would argue that unregulated dissemination is, on the whole, the more responsible choice out of those that we actually have. It's not that it doesn't have downsides, but other options have far more.

    If and when humanity manages to come up with a system where the people in charge can actually be trusted to act in the common good, we can revisit this matter.

  • > Doesn't it make sense that there are some technical questions that are dangerous to supply an answer to?

    This has a simple answer: No.

    Here's Wikipedia:

    https://en.wikipedia.org/wiki/Nuclear_weapon_design

    Everything you need to do it is in the public domain. The things preventing it have nothing to do with the information not being available. The main ones are that most people don't want to be mass murderers and actually doing it would be the fast ticket to Epic Retaliation.

    Meanwhile, public understanding of how things work is important to the public debate over what to do about them. How are you supposed to vote on public policy if the technical details are censored? How can anyone tell you that a ban on electric car batteries isn't advancing the non-proliferation of nuclear weapons if nobody is allowed to know how either technology actually works?

    Suppose you're an anti-racist preparing for a debate with a racist. You want the AI to give you all the strongest arguments the racist could use so you can prepare your counterarguments in advance of the debate. Should it refuse? Of course not; you're doing nothing wrong.

    Why do we need to build totalitarian censorship into our technology? We don't.

    • > The main ones are that most people don't want to be mass murderers and actually doing it would be the fast ticket to Epic Retaliation.

      The main thing preventing random nutcases from making nuclear weapons is that they don't have access to the required materials. Restricting the instructions is unnecessary.

      It would be a very different story if someone discovered a new type of WMD that anyone could make in a few days from commonly available materials, if only they knew the secret recipe.


  • > “Responsible information dissemination is important for maintaining public safety.”

    That word “responsible” is doing a lot of hand-wavy work there.

    Let's start with, responsible according to whom, and responsible to whom?

    Learning thinking skills and learning self-regulation in response to information, disinformation, or too much information might be better societal aims than suppression.