Comment by orbital-decay
2 years ago
> They include requirements that the most advanced A.I. products be tested to assure that they cannot be used to produce biological or nuclear weapons
How is "AI" defined? Does this mean US nuclear weapons simulations will have to completely rely on hard methods, with absolutely no ML involved for some optimizations? What does it mean for things like AlphaFold?
What makes you think the US military will be subject to these regulations?
If militaries are not subject to the regulation, then it is meaningless. Who else would be building weapons systems?
The worry here is not about controlling militaries. There are different processes for that.
The scenario people purport to worry about is one where a future AI system can be asked by "anyone" to design infectious materials. Imagine a dissatisfied and emotionally unstable researcher who can just ask their computer for the DNA sequence of an airborne super Ebola. Said researcher then orders the DNA synthesized, does some lab work to amplify it, and releases it into the general population.
I have no idea how realistic this danger is. But this is what people seem to be thinking about.
1 reply →
Now that you mention it... Does it outlaw Intel's and AMD's amd64 branch predictors?
> Does it outlaw Intel's and AMD's amd64 branch predictors?
Does better branch prediction enable better / faster weapons development? Perhaps we need laws restricting general purpose computing? Imagine what "terrorists" could do if they got access to general purpose computing!
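For anyone wondering why branch predictors even come up in an "AI" definition debate: some modern CPU cores use perceptron-based branch prediction, i.e. a tiny learned model in hardware. A minimal sketch of the idea in Python (the table size, history length, and threshold here are illustrative, not any vendor's actual design):

```python
# Sketch of a perceptron branch predictor in the style of Jimenez & Lin (2001).
# Each table entry holds a small weight vector; the dot product of weights
# and recent branch outcomes decides the prediction.

HISTORY_LEN = 8   # bits of global branch history (illustrative)
THRESHOLD = 10    # training threshold (illustrative)

class PerceptronPredictor:
    def __init__(self, table_size=64):
        # one weight vector (bias + one weight per history bit) per entry
        self.weights = [[0] * (HISTORY_LEN + 1) for _ in range(table_size)]
        self.history = [1] * HISTORY_LEN  # +1 = taken, -1 = not taken
        self.table_size = table_size

    def predict(self, pc):
        w = self.weights[pc % self.table_size]
        y = w[0] + sum(wi * hi for wi, hi in zip(w[1:], self.history))
        return y >= 0, y  # (predicted taken?, raw output)

    def update(self, pc, taken):
        pred, y = self.predict(pc)
        t = 1 if taken else -1
        w = self.weights[pc % self.table_size]
        # train only on a misprediction or when the output is weak
        if pred != taken or abs(y) <= THRESHOLD:
            w[0] += t
            for i, hi in enumerate(self.history):
                w[i + 1] += t * hi
        # shift this outcome into the global history
        self.history = self.history[1:] + [t]
```

After a few dozen updates on a branch that is always taken, the weights saturate and the predictor reliably predicts "taken". The point of the sketch is that this is literally a trained linear model running on every branch, which is why a loose statutory definition of "AI" can sweep in commodity hardware.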