Comment by sillyfluke

10 hours ago

>To be fair, I don't think it is an AI problem, more of a quirk of formal communication, the same happen with human secretaries.

Obviously you're not a golfer. Human secretaries don't produce non-deterministic hallucinations and random critical omissions in their summaries, which I've witnessed firsthand with LLMs. More importantly, if they do, you have more deterministic mitigations with them than you do with LLMs; with LLMs there are no mitigations except praying that some unspecified future model will magically be better at summaries.

The only way to stay sane when using these tools is to pretend that these things won't ever happen and just go about your business like the rest of the zombie workforce, because no one wants to stop the train and address the issue.

There is a reason the full title of Dr. Strangelove ends with "How I Learned to Stop Worrying and Love the Bomb".

To make things clear, even though I am not a golfer, I have a lot of respect for good secretaries; that's why I said they can do much more than write formal letters. Not only do they not hallucinate like LLMs, they can actually catch mistakes before they happen.

I just wanted to point out the seemingly ridiculous idea of formal communication that none of the interested parties actually reads. It is as if two English speakers insisted that the discussion be held in Spanish, and each brought an interpreter for it.

  • I think I understand that, which is why I specifically singled out your suggestion that "AI is not the problem" instead of your comment as a whole. Automated processes of Person A talking to automated processes of Person B is not a novel concept; it has been going on in various forms since computers were invented. You call an Uber and your phone talks to Uber's servers and the driver's phone so you don't have to talk to the driver; same with Doordash and the restaurant, same with Amazon and the retail store personnel. In these cases the back and forth between two humans is, in your words, a type of "formal communication that none of the interested parties actually reads," or "two English speakers both bringing Spanish interpreters to talk to each other."

    It's this way because we nerds want to remove the need to talk to other humans in order to obtain our hearts' desire at any given point. So we have been trying to remove non-deterministic roadblocks (humans) and replace them with deterministic automation from the very beginning. This is not new. If LLMs were the same deterministic processes but 1000x more versatile and capable, it wouldn't be a problem.

    But even though LLMs have lowered the barrier to creating these automated processes and increased the speed at which they are created, to achieve this they brought with them non-deterministic side effects that currently evade a holistic and deterministic fix. This is why it's an AI problem.