Comment by jmward01

17 hours ago

The thing that worries me isn't the drone/anti-drone escalation. It is the fact that these weapons aren't actually limited to anti-drone use. Recently we have seen clear examples of countries, including Israel, using automatic ID technology to mass-tag a population. If you then have tools that can automatically track and mass-kill, which this type of weapon represents, then we have reached a type of warfare that is new in the world and deeply scary. It isn't hard to imagine a scenario where person X is killed because they are marked as a 'bad guy', and, as part of being marked, every person they were next to for the last few days is also marked as likely enough to be a bad guy to kill as well. All that has to be done is push a button. It is a scary, and unfortunately all too possible, future -- if not the present.

It's been possible for a long time.

For antipersonnel use, guns are perfectly adequate, and guns on tracking turrets have been widely deployed (for example, CIWS). The underlying technology is a ballistic calculator and a fast-panning turret. Modern ballistic calculators, portable weather stations (small devices about the size of a cellphone), and good-quality ammunition allow for incredible precision with small arms -- hitting something 25cm in diameter at 1000m is something people can do with these tools.
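To put that precision claim in perspective: a 25 cm target at 1000 m subtends only about 0.25 milliradians (roughly 0.86 MOA), which is why a good ballistic solution and consistent ammunition matter so much. A minimal sketch of the angular math (function names are mine, for illustration only):

```python
import math

def angular_size_mrad(target_diameter_m: float, range_m: float) -> float:
    """Angle in milliradians subtended by a target of the given diameter at the given range."""
    return math.atan(target_diameter_m / range_m) * 1000

def mrad_to_moa(mrad: float) -> float:
    """Convert milliradians to minutes of angle (1 MOA = 1/60 degree ~= 0.2909 mrad)."""
    return mrad / 0.2908882

mrad = angular_size_mrad(0.25, 1000)  # ~0.25 mrad
moa = mrad_to_moa(mrad)               # ~0.86 MOA
```

At small angles the arctangent is effectively linear, so "25 cm at 1000 m" scales directly: the same shooter holding the same angular precision hits a 2.5 cm circle at 100 m.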

A weapon like this can't really "mass kill" -- it is for point targets -- but we have long had tools that can automatically track and kill. Why don't we employ them to shoot at people? We have the tagging technology, &c, as you mention.

One reason is that positive identification really does matter a lot when designing and developing weapon systems that automatically attack something.

The anti-missile use case is one of the most widespread uses for automatically targeted weapons in part because a missile is easily distinguished from other things that should not be killed: it is small, extremely hot, moves extremely fast, generally up in the air and moves towards the defense system. It is not a bird, a person, or even a friendly aircraft. The worst mistake the targeting system can make is shooting down a friendly missile. If a friendly missile is coming at you, maybe you need to shoot it down anyways...

Drones have a different signature from a missile, and recognizing them in a way that doesn't confuse them with a bird, a balloon, &c, is different from recognizing missiles -- but here again, the worst thing that happens is you shoot down a friendly drone.

  • And note an advantage to lasers -- when you fire ordinary stuff, it falls back down. C-RAM rounds are specifically designed to detonate while still in the air if they miss, but no munition has a 100% fusing rate; you get duds. Nothing falls back from a laser.

  • CIWS is pretty massive -- not that this isn't still big -- but I think this is taking a miniaturization turn, significantly upping the accuracy and the number of engagements it can handle, and potentially upping the range, especially in urban environments. CIWS in an urban environment would cause chaos and a lot of collateral damage to buildings, but now you can be very sure that only your intended target is being hit, so people could die without all the optics of buildings crashing down. It is much easier to have a war when the cameras don't see the destruction. Positive ID is huge, if you really care about it, but even with perfect positive ID, if a government is OK with genocide then everyone is a valid target. Are you a male older than 13? You are a combatant and will be killed once you are in sight. Did someone help you in any way (like your mother or family giving you food)? They are also combatants. It is unfortunately not a stretch with modern tools to see this happening in real time. This weapon is, unfortunately, on an inevitable path.

It seems incredibly hard to imagine what else you would do with a ground-based laser other than shoot at incoming projectiles. What exactly are you expecting the Israelis to do? Change the laws of physics?

The truth is it's already happening; this is how "Lavender" and "Where's Daddy" were used to collectively punish the entire families of whoever a poorly trained AI model thought might or might not be a Hamas fighter.

  • Evidence?

    And the Lavender system was only deployed once it was doing as well as the humans. It isn't 100% -- war never is.