Comment by namaria

1 day ago

I have fed LLMs PDF files, asked about the content and gotten nonsense. I would be very hesitant to trust them to give me an accurate summary of my emails.

One of our managers uses AI to summarize everything. Too bad it missed important caveats in an offer. Well, we pulled an all-nighter to correct the offer, but at least he read only one page instead of twenty...

  • I don't know if this is the case but be careful about shielding management from the consequences of their bad choices at your expense. It all but guarantees it will get worse.

    • Letting a thing implode that you could prevent is a missed opportunity for advancement and a risk to your career because you will be on a failing team.

      The smarter move is to figure out how to fix it for the company while getting visibility for it.


  • Did he pull all-nighters to fix it? If not, it wasn't "too bad" for him. I doubt he'll change his behavior.

  • Where's the IBM slide about "a machine cannot be held accountable, therefore a machine should never make a management decision"?

    Of course, often it's quite hard to hold management accountable either.

    • Isn't a solution to assign vicarious liability to whoever approves the use of the decision-making machine?