Comment by 2devnull

2 years ago

If militaries are not subject to the regulation then it is meaningless. Who else would be building weapons systems?

The worry here is not about controlling militaries. There are different processes for that.

The scenario people purport to worry about is one where a future AI system can be asked by "anyone" to design infectious materials. Imagine a dissatisfied and emotionally unstable researcher who can just ask their computer for the DNA sequence of an airborne super-Ebola. The researcher then orders the DNA synthesized, does some lab work to multiply it, and releases it in the general population.

I have no idea how realistic this danger is. But this is what people seem to be thinking about.

  • That is the question. "AI" is ill-defined marketing BS, so what is the actual definition in the law? Artificial intelligence as used in science and industry is a pretty broad term, and even the narrower "machine learning" is notoriously hard to define. Another question: these techniques have been used for more than a decade for plenty of legitimate things that can also be easily misused to create biological weapons (AlphaFold) — how does the law regulate that? The article doesn't answer these questions; what matters is where exactly the proposed law draws the line in the sand. The devil is always in the details.