Comment by slowmovintarget

2 years ago

Machine learning is a general-use tool. It's like Socrates decrying writing as harmful (which we only know about because Plato wrote it down).

You cannot use any of those weapons you mention as anything other than weapons. LLMs, diffusion nets, and classification systems have general use: in medicine, in business, in software engineering, in science, in marketing. These machine learning systems are hyper-advanced printing presses. I'm sure many of the world's governments consider that exceedingly dangerous.

Firearms, biological weapons, nuclear weapons, and chemical weapons all have a single use: to kill people or destroy things. Can you put ML components into weapons systems? Yes. But that is the same as controlling weapons systems with software, and we don't outlaw all software because some of it could be used to control weapons systems.

ML components are software. Advanced software, but not even close to "AI" — or, since we've lost that term to the marketers, AGI. This regulation is like asking the team making a compiler for $language to ensure that the compiler cannot be used to make malicious software. It's silly on the face of it.