Comment by maxbond
11 days ago
No one said it was defensible. They drew a distinction between incompetence and malice. Let's not misquote each other here in the comments.
Even if it didn't fabricate quotes wholesale, taking an LLM's output and claiming it as your own writing is textbook plagiarism, which is malicious intent. And if you know that LLMs are next-token prediction engines with no concept of "truth", programmed solely to generate probabilistically likely text with no mechanism anchoring them to reality or facts, and you publish that output in a journal that (ostensibly) exists to present factual information to readers, you are engaging in a second layer of malice. It would take an astounding level of incompetence for a tech journalist not to know that LLMs do not reliably generate factual output, and that beggars belief given that one of the authors has worked at Ars for 14 years. If they are that incompetent, they should probably be fired on that basis anyway. But even if they are, that still only covers half of their malicious intent.
The article in question appears to me to be written by a human (excluding what's in quotation marks), but of course neither of us has a crystal ball. Are there particular parts of it that you would flag as generated?
Honestly, I'm just not astounded by that level of incompetence. I'm not saying I'm impressed or that it's okay. But I've heard much worse stories of journalistic malpractice. It's a topical, disposable article. Again, that doesn't justify anything, but it doesn't surprise me that a short summary of a series of forum exchanges and blog posts was low effort.
I don't believe there is any greater journalistic malpractice than fabrication. Sure, there are worse cases of it in the world given the low importance of this topic, but journalists should report the truth on anything they deem important enough to write about. Cutting corners on the truth, of all things, is the greatest dereliction of their duty. It undermines trust in journalism altogether, which in turn undermines our collective society: we no longer work from a shared understanding of reality because we cannot trust the people who report on it. I've observed that journalists tend to have unbelievably inflated egos and tout themselves as the fourth estate upholding all of free society, yet their behaviour does not comport with that and is actively detrimental in the modern era.
I also do not believe this was a genuine result of incompetence. I entertained the possibility, but that is the most charitable view available, and I don't think the benefit of the doubt is earned here. They routinely cover LLM stories, and the retracted article was about that very subject, so I have very little reason to believe they are ignorant of LLM hallucinations. If it were a political journalist, I would be more inclined to credit the ignorance defense; as it is, we have every reason to believe they know what LLMs are and still acted with intention, completely disregarding the duty they owe their readers to report facts.
This is silly. LLMs are not people; you can’t “plagiarize” an LLM. Either the result is good or it isn’t, but it’s the actual author’s responsibility either way.