Comment by lpapez

1 day ago

Counterpoint: show me a human who can reliably produce 100% accurate notes.

Seriously, I wish to hire this person.

Seriously, do people around you not normally double-check, proofread, and review what they turn in as finished work?

Maybe I am just very fortunate, but people who are not capable of producing documents that are factually correct do not get to keep producing documents in the organizations I have worked with.

I am not talking about typos, misspelled words, or bad formatting. I am talking about factual content. LLMs can produce text that is 100% clean in those respects, yet they routinely mangle factual content in a way I have never had the misfortune of finding in the work of my colleagues and the teams around us.

  • A friend of mine asked an AI for a summary of a pending Supreme Court case. It came back with the decision, majority arguments, dissent, the whole deal. Only problem was that the case hadn't happened yet. It had made up the whole thing, and admitted that when called on it.

    A human law clerk could make a mistake, like "Oh, I thought you said 'US v. Wilson,' not 'US v. Watson.'" But a human wouldn't just make up a case out of whole cloth, complete with pages of details.

    So it seems to me that AI mistakes will be unlike the human mistakes that we're accustomed to and good at spotting from eons of practice. That may make them harder to catch.

    • I think it is more like the clerk saying "There never was a US vs Wilson" (well, there probably was, given how common that name is, but work with me). The AI doesn't have a concept of "maybe I misunderstood the question." It would likely give you a good summary if the case had happened, but if it didn't, it makes one up.


  • What are the odds that the comment you're responding to was AI-generated?

    • Good question. So far the comments here mostly seem to be human-generated, but I would be surprised if there were no AI-generated ones. It is also possible that I have been fooled. For now, I'm going with the default assumption that it was not AI.

You are mixing up notes and a full-blown transcript of the meeting. The latter is impossible for untrained humans to produce. The former is relatively easy for a person paying attention, because it is usually 5 to 10 short lines for an hour-long meeting, with action items or links. Also, in a typical work meeting, the person taking notes can simply say "wait a minute, I will write this down," and this does happen in practice. Short notes made like that are usually accurate in meaning, with maybe some minor typos that do not affect accuracy.