Comment by strgcmc

6 years ago

I think the NYT article has a little more detail: https://www.nytimes.com/2020/06/24/technology/facial-recogni...

Essentially, an employee of the facial recognition provider forwarded an "investigative lead" for the match they generated (which does have a score associated with it on the provider's side, but it's not clear if the score is clearly communicated to detectives as well), and the detectives then put the photo of this man into a "6 pack" photo line-up, from which a store employee then identified that man as being the suspect.

Everyone involved will probably point fingers at each other. The provider, for example, put a large heading on their communication saying, "this is not probable cause for an arrest, this is only an investigative lead, etc." The detectives will say, "well, we got a hit from a line-up," and blame the witness. And the witness would probably say, "well, the detectives showed me a line-up and he seemed like the right guy" (or, as is often the case with line-ups, the detectives can exert a huge amount of bias/influence over witnesses).

EDIT: Just to be clear, none of this is to say that the process worked well or that I condone this. I think the data, the technology, the processes, and the level of understanding on the side of the police are all insufficient, and I do not support how this played out, but I think it is easy enough to provide at least some pseudo-justification at each step along the way.

That's interesting. In many ways, it's similar to the "traditional" process I went through when reporting a robbery to the NYPD 5+ years ago: they had software where they could search for mugshots of all previously convicted felons living in an x-mile radius of the crime scene, filtered by the physical characteristics I described. Whether or not the actual suspect's face was among the results, it was ultimately too slow and clunky to paginate through hundreds of them.

Presumably, the facial recognition software would provide an additional filter/sort. But at least in my situation, I could actually see how big the total pool of potential matches was, and thus have a sense of uncertainty about false positives, even if I were completely ignorant about the impact of false negatives (i.e. what if my suspect didn't live within x miles of the scene, or wasn't a known/convicted felon?)

So the caution re: face recognition software is how it may non-transparently add confidence to this already very imperfect filtering process.

(in my case, the suspect was eventually found because he had committed a number of robberies, including being clearly caught on camera, and in an area/pattern that was easy to narrow down where he operated)

> and the detectives then put the photo of this man into a "6 pack" photo line-up, from which a store employee then identified that man as being the suspect.

This is absurdly dangerous. The AI will find people who look like the suspect; that's how the technology works. A lineup as evidence will almost guarantee a bad outcome, because of course the man looks like the suspect!

  • The worst part is that the employee wasn't a witness to anything. He was making the "ID" from the same video the police had.

> Essentially, an employee of the facial recognition provider forwarded an "investigative lead" for the match they generated (which does have a score associated with it on the provider's side, but it's not clear if the score is clearly communicated to detectives as well)

This is the lead provided:

https://wfdd-live.s3.amazonaws.com/styles/story-full/s3/imag...

Note that it says in red and bold emphasis:

THIS DOCUMENT IS NOT A POSITIVE IDENTIFICATION. IT IS AN INVESTIGATIVE LEAD ONLY AND IS NOT PROBABLE CAUSE TO ARREST. FURTHER INVESTIGATION IS NEEDED TO DEVELOP PROBABLE CAUSE TO ARREST.

  • Dear god the input image they used to generate that is TERRIBLE! It could be damn near any black male.

    The real negligence here is whoever tuned the software to spit out a result for that quality of image rather than a "not enough data, too many matches, please submit a better image" error.
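The kind of gating described above could look something like this minimal sketch (the function name and thresholds are hypothetical, not any real vendor's logic): refuse to return a lead when the best match is weak, when several candidates clear the bar, or when the top two scores are too close to call.

```python
def gate_matches(scores, min_score=0.90, min_margin=0.10):
    """Return the single best match only when it is unambiguous.

    scores: list of (person_id, similarity) pairs, sorted descending.
    The thresholds here are made up for illustration.
    """
    if not scores or scores[0][1] < min_score:
        return None  # not enough data: even the best match is too weak
    strong = [s for s in scores if s[1] >= min_score]
    if len(strong) > 1:
        return None  # too many matches: several faces clear the bar
    if len(scores) > 1 and scores[0][1] - scores[1][1] < min_margin:
        return None  # ambiguous: the runner-up is too close
    return scores[0]
```

With gating like this, a low-quality input image would tend to produce either uniformly weak scores or a cluster of near-ties, and the system would return an error instead of an "investigative lead."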

    • I'm not even sure that's definitely a black man, rather than just any person with some kind of visor or mask. There does seem to be a face in the noise, but human brains are primed to see face shapes.

      The deeper reform that needs to happen here is that every person falsely arrested and/or prosecuted needs to be automatically compensated for their time wasted and other harm suffered. Only then will police departments have some incentive for restraint. Currently we have a perverse reverse lottery where, if you're unlucky, you just lose a day/month/year of your life. Given the state of what we're actually protesting, I'm not holding my breath (e.g. the privileged criminals who committed the first-degree murder of Breonna Taylor still have yet to be charged), but it's still worth calling out the smaller injustices that the criminal "justice" system inflicts.

      2 replies →

    • You're also looking at a scan of a small printout with poor contrast and brightness. There's probably a lot more detail in the full-resolution image the computer is seeing, once it's brightened to show the face and the contrast is enhanced.

This is why you should be scared of this tech. A computer-assisted patsy finder. No need to find the right guy when the AI will happily cough up 20 people nearby who kinda sorta look like the perp, enough to stuff them into a lineup in front of a confused and highly fallible witness.

I'm becoming increasingly frustrated with the difficulty of accessing primary source material. Why don't any of these outlets post the surveillance video and let us decide for ourselves how much of a resemblance there is?

  • Even if the guy was an exact facial match, that doesn't justify the complete lack of basic police work to establish it was him.

    • Absolutely agree - and the consequences to a private citizen of that lack of basic police work can be negative and long-lasting.

  • Because they're not in the business of providing information, transparency or journalism.

    They are in the business of exposing you to as many paid ads as possible. And they believe providing outgoing links reduces their ability to do that.

    • >They are in the business of exposing you to as many paid ads as possible.

      NPR is a non-profit that is mostly funded by donations. They only have minimal paid ads on their website to pay for running costs - they could easily optimize the news pages to increase ad revenue but they don't because it would get in the way of their goals.

I can see why you'd only get 6 guys together for a physical "6 pack" line-up.

But for a photo lineup, I can't imagine why you wouldn't have at least 25 photos to pick from.

  • Excellent point. In fact, the entire process of showing the witness the photos should be recorded, and double-blind, i.e. the officer administering the lineup should not know which photo is the suspect's.

> the detectives then put the photo of this man into a "6 pack" photo line-up, from which a store employee then identified that man

This is not correct. The "6-pack" was shown to a security firm's employee, who had viewed the store camera's tape.

"In this case, however, according to the Detroit police report, investigators simply included Mr. Williams’s picture in a “6-pack photo lineup” they created and showed to Ms. Johnston, Shinola’s loss-prevention contractor, and she identified him." [1]

[1] ibid.

It wasn't just that the employee picked the man out of 6 pack; the employee they interviewed wasn't even a witness to the crime in the first place.

>into a "6 pack" photo line-up

How did the people in the 6 pack photo line-up match up against the facial recognition? Were they likely matches?

  • No clue about the likelihood of police using similar facial recognition matches for the rest, but normally the alternates need to be around the same height, build, and complexion as the subject. I would think including multiple potential matches would be a huge no-no simply because your alternates need to be people who you know are not a match. If you just grab the 6 most similar faces and ask the victim to choose, what do you do when they pick the third closest match?

    • Well you may know some people are not a match because you know where they were, for example pictures could be of people who were incarcerated at the time of the crime.

  • Even worse, the employee who was asked to pick him out of a line up hadn't even witnessed the crime in the first place.