Comment by stego-tech

18 days ago

I really, really need folks to understand that deflecting blame away from the tool and trying to hold the human accountable feeds right into the marketing playbook of these companies in the first place.

The cops cannot be held accountable because the laws basically give them immunity. The politicians cannot be held accountable beyond being tossed out at the next election, because the laws otherwise give them immunity. The people operating the system cannot be held accountable, because the systems are marketed as authoritative despite being black boxes and lacking in transparency; they trusted the system just as they were told to, and thus cannot be held accountable.

And so when every human in the chain cannot be held accountable for these things, and the law prevents victims from receiving apologies, let alone recourse, then the tool and its maker are the only things we can hold accountable. By deflecting blame away from the tools ("it wasn't AI, it was facial recognition"; "the human had to sign off on it"; "humans made the arrest, not machines"), you're protecting quite literally the only possible entity that could still potentially be held accountable: the dipshits making these stupid things and marketing them as superior and authoritative compared to humans.

You want accountability? Start holding capital to account, and this shit falls away real fucking fast. Don't get lost in technical nuance over very real human issues.

I disagree. If you focus on holding the software creators to account in lieu of the humans in the loop, then we only reinforce the behavior of offloading thinking to the system.

If I am a cop in another jurisdiction and I see that, in this case of error, the facial recognition company was held to account but not the police or the municipality, I will be more likely to blindly trust the software, assuming that the vendor either patched it or will take responsibility.

We should demand accountability for both.

You can blame both: the prosecutors and police who didn't do their proper due diligence, falsely imprisoned this woman, and held her for months without due process; and also the AI company, for what amounts to a false police report and defamation of character. There's no reason for either of them to escape blame.

>Start holding capital to account

You forgot one: capital cannot be held accountable for making a tool used in a crime. That's a simple generalization of the Protection of Lawful Commerce in Arms Act (PLCAA), passed in 2005, which largely bars civil lawsuits against gun makers and sellers when their products are later used in crimes.

Strongly agree here. This is an extremely predictable outcome of selling AI facial recognition software to American police forces.

Is there anything to suggest this sort of injustice isn't happening in low-tech all the time, constantly, all over the country, and the only reason it's getting attention here is because AI is involved?

  • The scale is not the same. Low-tech tools require more human input and more pre-filtering of suspects. They can't just default to starting with "everybody" and matching against millions at the push of a button.