Comment by firefoxd

18 hours ago

Without even looking at the AI part, I have a single question: Did anybody investigate? That's it.

Whether it's AI that flagged her, or a witness who saw her, or her IP address appearing in the logs. Did anybody bother to ask her, "Where were you the morning of July 10th between 3 and 4pm?" But that's not what happened: they saw the data and said "we got her".

But this is the worst part of the story:

> And after her ordeal, she never plans to return to the state: “I’m just glad it’s over,” she told WDAY. “I’ll never go back to North Dakota.”

That's the lesson? Never go back to North Dakota. No, challenge the entire system. A few years back it was a kid accused of shoplifting [0]. Then a man dragged away while his family was crying [1]. Unless we fight back, we are all guilty until cleared.

[0]: https://www.theregister.com/2021/05/29/apple_sis_lawsuit/

[1]: https://news.ycombinator.com/item?id=23628394

The thing about the legal system is there's no incentive to investigate to find the truth.

The incentive is to prosecute and prove the charges.

Speaking from the experience of being falsely accused after calling 911 to stop a drunk woman from driving.

The narrative they "investigated" was so obviously false that bodycam evidence directly contradicted multiple key facts. Officials are interested only in proving the case. Thankfully the jury came to the right verdict.

  • There need to be consequences for shitty, procedure-ignoring police work. Period.

    Minimum 1 year of jail time for grossly wrongful arrests that could have been avoided had standard procedure or investigative tactics been applied.

    • I agree with this sentiment but when you start punishing this sort of thing you create more incentive to cover it up. It's a tricky problem and I'm not sure there's a perfect solution.

      What we really need is a change in police culture.

      3 replies →

    • These dialogs always prompt me to chime in with my solution: make the police be self-insured, backed by their pension fund.

      The police today have zero incentive to serve the public, they have zero skin in the game and can literally get away with murder.

      Any time you hear the call for "law and order", that is the audience that supports the current system, because they like it like this.

      2 replies →

  • > The narrative they "investigated" was so obviously false that bodycam evidence directly contradicted multiple key facts. Officials are interested only in proving the case. Thankfully the jury came to the right verdict.

    I don't get it. If they only care about prosecuting and proving the case, wouldn't they go by the bodycam evidence? They didn't prove the case. Maybe if their incentive were to prosecute and prove the charges, they'd go by the obvious evidence. Or am I missing something here?

  • > The thing about the legal system is there's no incentive to investigate to find the truth.

    The truth is much more complicated and involves politics. For example Seattle (and possibly other cities?) enacted a law that involves paying damages for being wrong in the event of bringing certain types of charges. But that has resulted in some widely publicized examples where the prosecutor erred by being overly cautious.

    • And then you have Florida, which will bill you about $100 a day for finding yourself in a Florida jail, regardless of whether the charges were dismissed, you were found not guilty, or any such thing.

      And to nobody’s surprise, failure to pay this bill is in itself a Class B felony…

      5 replies →

  • There’s a judge down in Texas, the Dallas area I believe, who is on social media a lot because he will excoriate prosecutors who bring BS into his courtroom. He’s not soft on crime but hard on rights and process. If a defendant did the wrong thing, he will have the appropriate amount of sympathy, down to zero. At times he will tell them: we all know you got lucky here, do better. But he won’t let prosecutors skate by on garbage charges or statements or investigations by police. Which leads to my primary point, at least for this discussion in particular:

    To me the scariest part of this as a process is how many times (I’d casually estimate at least 75%) it is blindingly obvious that the prosecutor has not read the statement of charges or officer statements until everyone is in front of the judge. I get on one hand this judge seems to often be handling probable cause hearings but so many of these should never have resulted in any paperwork being turned in to the prosecution, let alone anyone having to show up in court.

  • There is an incentive. It’s called fraud by negligence. I’m hoping she sues everyone here.

    That seems to be in the realm of possibility here, if I am understanding things correctly (imo)

Yes, of course someone should have investigated, but the larger point here is that people don’t because they are being sold a false narrative that AI is infallible and can do anything.

We could sit here all day arguing “you should always validate the results”, but even on HN there are people loudly advocating that you don’t need to.

  • I don't think people on HN think "AI is infallible"; I think people on HN believe AI is good enough for "most tasks". In the context of HN, "most tasks" refers to programming tasks, not arresting-and-jailing-people tasks.

    You should always validate the results, but there is an inherent difference between an AI-generated tool for personal use and a tool which could be used to destroy someone's life.

    • What about cops and legislators? They think AI is infallible, and that's very convenient for them, since they can then avoid mandating that cops double-check what the AI suggests.

    • The problem is that the people who will put this in place rate capability on a linear scale: in their view the ability to write software is sufficiently magic, so such an ability is obviously good enough to recognize criminals. From their perspective, there are hurdles to be crossed (like probable cause) and an AI flagging a suspect feels like a magical intelligence crossing those hurdles and allowing them to continue in the process.

      They don't validate the results of their fellow officers, or the validity of warrants, or anything else that predicates an arrest. Why would they start with this?

  • Where are you seeing people being told that AI is infallible? AI is being hyped to the moon, but "infallible" is not one of the claims.

    To the extent people trust AI to be infallible, it's just laziness and rapport (AI is rarely if ever rude without prompting, nor does it criticize extensive question-asking as many humans would; it's the quintessential enabler[1]) that causes people to assume that because it's useful and helpful for so many things, it'll be right about everything.

    The models all have disclaimers that state the inverse. People just gradually lose sight of that.

    [1] This might be the nature of LLMs, or it might be by design, similar to social media slop driving engagement. It's in AI companies' interest to have people buying subscriptions to talk with AIs more. If AI goes meta and critiques the user (except in more serious cases like harm to self or others, or specific kinds of cultural wrongthink), that's bad for business.

    • > Where are you seeing people being told that AI is infallible? AI is being hyped to the moon, but "infallible" is not one of the claims.

      I see all kinds of people being told that AI-based AI detection software used for detecting AI in writing is infallible!

      You want to make sure people aren't using fallible AI? Use our AI to detect AI? What could possibly go wrong.

      1 reply →

    • > To the extent people trust AI to be infallible, it's just laziness and rapport (…) that causes people to assume that because it's useful and helpful for so many things, it'll be right about everything.

      Why it happens is secondary to the fact that it does.

      > The models all have disclaimers that state the inverse. People just gradually lose sight of that.

      Those disclaimers are barely effective (if at all), and everyone knows that. Including the ones putting them there.

      https://www.youtube.com/watch?v=Xj4aRhHJOWU

Society went through the necessary lessons with DNA and fingerprints. Putting people in jail because the computer produced a match is a terrible idea, especially when it's done by a proprietary black box where no one really understands why it claims there is a match. It can be used as an investigative tool to give investigators a hint toward finding real, more substantial clues, but using it like in fiction, where the computer can act as the single source of truth, is terrible for society and justice.

A month or so ago, people on HN discussed facial recognition for identifying victims and perpetrators in child exploitation material, and people were complaining that Meta did not allow this fast enough. Neither the article nor the people in that discussion drew any connection to how the issues in this article could happen. People seemingly want to think that the lesson is "never go back to North Dakota", as that is a much easier lesson than considering false positives in detection algorithms and their impact on a legal system that is constrained in budget, time, training and incentives.
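To make the false-positive point concrete, here is a minimal base-rate sketch. All the numbers are illustrative assumptions, not figures from the article: even a system with an impressive-sounding error rate produces mostly false flags when it searches a large pool for a single perpetrator.

```python
def match_is_correct(true_positive_rate, false_positive_rate, pool_size):
    """P(flagged person is the perpetrator), assuming exactly one real
    perpetrator in a pool of `pool_size` candidate faces (Bayes' rule)."""
    # Expected true matches vs. expected false matches across the pool.
    true_matches = true_positive_rate * 1
    false_matches = false_positive_rate * (pool_size - 1)
    return true_matches / (true_matches + false_matches)

# A "99% accurate" system searched against 100,000 faces:
p = match_is_correct(0.99, 0.01, 100_000)
print(f"{p:.4f}")  # ~0.001: roughly a thousand false flags per true hit
```

In other words, a match from such a system is where an investigation should start, not where it ends.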

I think you missed many important points.

“The trauma, loss of liberty, and reputational damage cannot be easily fixed,” Lipps' lawyers told CNN in an email.

That sounds a LOT like a statement you make before suing for damages, not to mention they literally say "Her lawyers are exploring civil rights claims but have yet to file a lawsuit, they said."

This lady probably just wants to go back to normal life and get some money for the hell they put her through. She had never even been on an airplane before; I doubt she is going to take on the entire system like you suggest. It's easier said than done to "challenge the entire system". What does that even mean, exactly?

  • It was worse than that. From the reporting in an earlier story [0]:

      ...Unable to pay her bills from jail, she lost her home, her car and even her dog.
    

    There is not a jury in the country that will side against the woman. I am not even sure who will make the best pop culture mashup: John Wick or a country songwriter?

    (Also, what happened to journalism - no Oxford comma?)

    [0] https://news.ycombinator.com/item?id=47356968

  • The real problem here is she'll get money, who knows how much, but that ultimately does nothing to actually address the problems in the system.

    Effectively it just raises taxes to cover the cost of these failed prosecutions.

    Every time one of these cases happens, a cop and a prosecutor should be out of a job permanently, possibly even jailed. The false arrest should cost the cop their job and get them blacklisted; the wrongful prosecution should cost the prosecutor their right to practice law.

    And if the police union doesn't like that and decides to strike, every one of those cops should simply be fired, much like we did to the striking air traffic controllers. We'd be better off hiring untrained civilians as cops than to keep propping up this system of warrior cops abusing the citizens.

    • > The false arrest should lose the cop their job and get them blacklisted

      There is actually a federal register for LEOs that have been terminated for cause or resigned to avoid termination.

      The police unions that operate in the jurisdictions that employ 70% of US police have negotiated into their CBAs that the register “cannot be used for hiring or promotional decisions”. Read into that what you will.

      3 replies →

>No, challenge the entire system.

Agree in principle. But people like her do not have the resources, financial or emotional, to go through the legal system again. Unless there are charitable lawyers who are willing to do it on her behalf for free.

IANAL but AFAIK custodial interrogation triggers Miranda, lawyers, and those awful awful civil liberties we’re trying to get rid of.

Better just to apply Musk or Altman software to the problem and avoid it entirely.

> Whether it's AI that flagged her

It absolutely was. There's no question of this. Now we need to ask how was the system marketed, what did the police pay for it, how were they trained to use it?

> anybody bother to ask her, "Where were you the morning of July 10th between 3 and 4pm?"

Legally, that amounts to "hearsay" and cannot have any value. Those statements probably won't even be admissible in court without other supporting facts entered in first.

> we are all guilty until cleared.

This is not a phenomenon that started with AI. If you scratch the surface, even slightly, you'll find that this is a common strategy used against defendants who are perceived as not being financially or logistically capable of defending themselves.

We have a private prison industry. The line between these two outcomes is very short.

  • > Legally, that amounts to "hearsay" and cannot have any value.

    How is that hearsay if she's directly testifying to her own whereabouts?

    Hearsay would be if someone else was testifying "she was in X location on july 10th between 3 and 4pm", without the accused being available for cross

    • No!

      "I was at the library" is firsthand testimony.

      "I saw her at the library" is firsthand testimony.

      "I saw her library card in her pocket" is firsthand testimony.

      "She was at the library - Bob told me so" is hearsay. Just look at the word - "hear say". Hearsay is testifying about events where your knowledge does not come from your own firsthand observations of the event itself.

  • > Legally, that amounts to "hearsay" and cannot have any value. Those statements probably won't even be admissible in court without other supporting facts entered in first.

    I just want to understand your argument: you believe that any alibi provided is hearsay, and has no legal value, and that they can't even take the statement in order to validate it? That's your position?

    • The condition here being that she was already arrested. You don't arrest someone first and then try to establish their alibi second. That would be an investigation, which comes prior to getting the warrant that allows you to arrest someone. You will never talk yourself out of an arrest, but you might talk yourself out of an investigation.

      You can offer your story to the police but the fact that you did or what you said to them will not come into evidence in court. You cannot call the officer to the stand and then ask them to repeat in court what you said. That would be "hearsay." So, for a lot of reasons, if you're already arrested, you probably don't even want to tell them any of that. It can only be used against you and never for you. Get your lawyer and have them ready the case to prove that alibi for you.