Comment by spaceman_2020
1 year ago
Some of this stuff is getting to the point where we will seriously need to have a global talk on whether we should put a pin in this tech or not
The child comments under yours mention nuclear weapons as a parallel, but there's one big difference between drone tech and nuclear weapons: plutonium is really hard to make.
We might be able to put a pin in this tech from a policy perspective, but the cat is way out of the bag as far as the tech goes. A cell phone already has all of the sensors you need baked right into it (honestly, we can thank mobile devices for getting the cost down). An ESC for a motor is a cheap microcontroller and a couple of MOSFETs. The frames can be made of cheap plastic. Even if things like ArduPilot didn't exist, a smart EE student could build one from scratch, including the flight control software, using parts from Digikey and relatively basic PID control code.
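To illustrate how low that bar is: the "relatively basic PID control code" mentioned above really can fit in a few dozen lines. Here's a minimal sketch of a single-axis PID loop driving a toy first-order plant; the gains and the plant model are made up for illustration, not tuned for any real airframe.

```python
# Minimal PID controller sketch for one control axis (e.g. roll rate).
# Gains (kp, ki, kd) and the plant model below are illustrative only.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # accumulated error (I term)
        self.prev_error = 0.0    # last error, for the D term

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Toy plant: the state simply integrates the control output each timestep
# (a crude stand-in for a motor/attitude response).
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
state = 0.0
for _ in range(2000):                      # simulate 20 seconds
    state += pid.update(10.0, state) * 0.01
print(round(state, 2))                     # converges near the setpoint, 10.0
```

A real flight controller runs several of these loops (one per axis, often nested rate/angle loops) plus sensor fusion, but the core control law is exactly this simple, which is the point being made above.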
The cat is definitely out of the bag.
A lunatic will be able to wipe out school children playing outside and have little chance of getting caught, for example.
Nice.
Yes, and so far it's much easier to drive a van into a crowd of people. Nobody has tried to mandate tech in cars that detects and prevents such malicious behavior.
6 replies →
> A lunatic will be able to wipe out school children playing outside and have little chance of getting caught, for example.
America insists on making sure that guns are universally available so that school shootings can still happen. Doesn't register. The death toll seems to be politically acceptable.
Generally, if you are smart enough to fashion this without being caught, you are too smart to do something like that.
Plus, you'd have a cool and potentially lucrative hobby: designing exterminator machines. Why bother with children at that point?
There are much, much better targets to be had.
Your point on the dwindling barrier to implementation stands.
6 replies →
I mean... yeah, that’s a definite possibility. If a lunatic has access to explosives, there’s an infinite number of ways they could do that.
The hard part is that there is no effective way to regulate anything in the supply chain involved except for the explosives themselves. Everything else is super commoditized at this point and, other than the props, very multi-purpose. The first significant hexcopter I built used a BeagleBone Blue for processing, generic ESCs and BLDCs for the motors, and an aluminum frame that I cut out of aluminum tubes from Home Depot. Max takeoff weight was 55lb, because that’s the heaviest it could legally take off with. This was 7 years ago.
1 reply →
That’s why only the strong communities with strong families will survive, because even lunatics are cared for in strong community structures.
I'm sure that everyone would agree on that, and that $bad_actor wouldn't take advantage of the fact that everyone else had agreed to lay down their arms. Game theory sucks, but it's hard to get around.
There wouldn't be any pin in it. Drones - automated weapons in the broad sense - will be the new MAD/equalizer weapon, accessible to smaller countries that have no chance of getting into the nuclear club. Without such a weapon in the coming new world order - marked specifically by the USA's withdrawal from enforcing international law - they will be easy prey for the bigger countries. Ukraine is just a preview of that equalizing power.
I guess it falls on me to break it to you, then, but serious "global talks" happen at the exploding end of ordnance.
There is no Jedi Council to appeal to, no wise group of non-aggressive nations gathering to pacify the troublemakers.
why? if nuclear weapons got the green light, do you expect a different outcome?
A weapon that threatens everyone is better than one that threatens only some.
Because nuclear weapons got the green light.
As if the billionaires won't simply go "F that noise, more money for me!!!" Ethical concerns are way down the priority list for most AI focused companies.