Comment by throwaway123982

2 years ago

> LLMs do a great job of providing recipes for dinner but maybe shouldn't teach me how to build a bomb.

Why not? If someone wants to make a bomb, they can already find out from other source materials.

We already have regulations around acquiring dangerous materials. Knowing how to make a bomb is not the same as making one (which is not the same as using one to harm people).

It's about access and command & control. I could have the same sentiment as you, since in high school my friends & I were in the habit of using our knowledge from chemistry class (and a bit more reading; way pre-Internet) to make some rather impressive fireworks and rockets. But we never did anything destructive with them.

There are many bits of technology that can destroy large numbers of people with a single action. Usually, those are either tightly controlled and/or require jumping a high bar of technical knowledge, industrial capability, and/or capital to produce. The intersection of people with that requisite knowledge+capability+capital and people sufficiently psychopathic to build & use such destructive things approaches zero.

The same was true of hacking way back when. The result was interesting, sometimes fun, and generally non-destructive hacks. But now, hacking tools have been developed to the level of copy+paste click+shoot. Script kiddies became a thing. And we now must deal with ransomware gangs run by everyone from nation-state actors down to rando teenage miscreants, all of them causing massive damage.

Extending copy+paste click+shoot level knowledge to bombs and biological agents is just massively stupid. The last thing we need is a low intelligence bar for people setting off bombs & bioweapons on their stupid whims. So yes, we absolutely should restrict these kinds of recipe-from-scratch responses.

In any case, if you really want to know, I'm sure that if you already have significant knowledge and smarts, you can craft prompts to get the LLM to reveal the parts you don't know. But that gets back to raising the bar, which is just fine.