Comment by eterevsky
I think whether any text is written with the help of AI is not the main issue. The real issue is that, for texts like police reports, a human still has to take full responsibility for the contents. If we preserve this understanding, then the question of which texts are generated by AI becomes moot.
Sadly, the justice system is a place where responsibility does not happen. It is not a system where you make one mistake and go to prison. Instead, everyone but the victims of the system is protected and colluded with. The more you punish the victims, the better you make out.
By the end of your paragraph, you decided that criminals are victims.
Everyone the policing system interacts with is not only a convicted criminal, but convicted justly, yeah? That's what you actually believe?
Yes. Allowing officers to blame AI creates a major accountability gap. Following the logic of, e.g., the EU AI Act, if a human "edits" a draft, they must be held responsible and do not need to disclose the use of AI.
To ensure safety, those offerings must use premarket red teaming to eliminate biases in summarization. However, ethical safety also requires post-market monitoring, which is impossible if logs aren't preserved. Rather than focusing on individual cases, I think we must demand systemic oversight in general and access for independent research (not focusing only on a specific technology).
It should be treated kind of the same as writing a report after a glass of wine. Probably no one really cares but "sorry that doesn't count because I was intoxicated when I wrote that bit" isn't going to fly.
I don’t understand the urgency to replace human work with AI. Why is every organization so eager to skip the "AI as an assistant" step? Here there are already massive productivity gains in using the AI to create the draft of the report; it makes little economic sense to have it produce the final version given the risk. Maybe it’s just plain laziness? Same with developers: why is every organization trying to leapfrog from "humans write all the code" straight to "they don’t even read the generated code"?
Not everyone is in a hurry to replace people with bots; that's a hyperbolic construct.
But to try to answer some of what I think you're asking: The bot can be useful. It can be better at writing a coherent collection of paragraphs or subroutines than Alice or Bill might be, and it costs a lot less to employ than either of them does.
Meanwhile: The bot never complains to HR because someone looked at it sideways. The bot [almost!] never calls in sick; the bot can work nearly 24/7. The bot never slips and falls in the parking lot. The bot never promises to be on duty while vacationing out-of-state behind a VPN, or uses a mouse-jiggler to screw up the metrics while sleeping off last night's bender.
The bot mostly just follows instructions.
There are lots of things the bot doesn't get right. Like, the stuff it produces may be full of hallucinations and false conclusions that need to be reviewed, corrected, or outright excised.
But there's lots of Bills and Alices in the world who are even worse, and the bot is a lot easier and cheaper to deal with than they are.
That said: When it comes to legal matters that put a real person's life and freedom in jeopardy, then there should be no bot involved.
If a person in a position of power (such as a police officer) can't write a meaningful and coherent report on their own, then I might suggest that this person shouldn't have a job where producing written reports is part of the work. There's probably something else they're good at that they can do instead (the world needs ditchdiggers, too).
Neither the presence nor absence of a bot can save the rest of us from the impact of their illiteracy.
and the bot doesn't bear any responsibility
Because the biggest cost at a lot of orgs is staff. At your typical software shop it's comical: the salary costs tower over all the others like LeBron James gazing down at ants. The moment you go from productivity gains to staff reduction, you start making real money. Any amount of money is worth it for a machine that can fully replace a human process.
I agree. A programmer has to take responsibility for the generated code they push, and so do police officers for the reports they file. Using a keyboard does not absolve you of typos; it's your responsibility to proofread and correct. This is no different, just a lot more advanced.
Of course the problem is also that the police often operate without any real oversight and cover up more misconduct than the workers in an under-rug sweeping factory. But that's another issue.