Comment by SpicyLemonZest
1 day ago
It's a hard concept in all kinds of scenarios. If a pharmacist sells you large amounts of pseudoephedrine, which you're secretly using to manufacture meth, which of you is responsible? It's not an either/or, and we've decided as a society that the pharmacist needs to shoulder a lot of the responsibility by putting restrictions on when and how they'll sell it.
sure but we’re talking about literal text, not physical drugs or bomb making materials. censorship is silly for LLMs and “jailbreaking” as a concept for LLMs is silly. this entire line of discussion is silly
Except it’s not, because people are using LLMs for things, thinking they can put guardrails on them that will hold.
As an example, I’m thinking of the car dealership chatbot that gave away $1 cars: https://futurism.com/the-byte/car-dealership-ai
If these things are being sold as things that can be locked down, it’s fair game to find holes in those lockdowns.
…and? people do stupid things and face consequences? so what?
I’d also advocate you don’t expose your unsecured database to the public internet
5 replies →