
Comment by pyman

8 months ago

Didn't our parents go through the same thing when email came out?

My dad used to say: "Stop sending me emails. It's not the same." I'd tell him, "It's better." He'd reply, "No, it's not. People used to sit down and take the time to write a letter, in their own handwriting. Every letter had its own personality, even its own smell. And you had to walk to the post office to send it. Now sending a letter means nothing."

Change is inevitable. Most people just won't like it.

A lot of people don't realise that Transformers were originally designed to translate text between languages. Which, in a way, is just another way of improving how we communicate ideas. Right now, I see two things people are not happy about when it comes to LLMs:

1. The message you sent doesn't feel personal. It reads like something written by a machine, and I struggle to connect with someone who sends me messages like that.

2. People who don't speak English very well are now sending me perfectly written messages with solid arguments. And honestly, my ego doesn't like it, because I used to think I was more intelligent than them. Turns out I wasn't. It was just my perception, based on the fact that I speak the language natively.

Neither of these things will matter in the next two or three years.

I really don’t think they’re the same thing. Email or letter, the words are yours while an LLM output isn’t.

  • Initially, it had the same effect on people until they got used to it. In the near future, whether the text is yours or not won't matter. What will matter is the message or idea you're communicating. Just like today, it doesn't matter if the code is yours, only the product you're shipping and the problem it's solving.

    • Code is either fit for a given purpose or not. Communicating through an LLM instead of directly with the desired recipient may be considered fit for purpose by the receiving party, but it's not for the LLM user to say what the writer's goals are, nor what they ought to be. LLMs for communication are inherently unfit for purpose for anything beyond basic yes/no answers and basic autocomplete. Otherwise I'm not even interacting with a human in the loop, except in the moment before they hit send, which doesn't inspire confidence.

    • I think just looking at information transfer misses the picture. What's going to happen is that my Siri is going to talk to your Cortana, and that our digital secretaries will either think we're fit to meet or we're not. Like real secretaries do.

      You largely won't know such conversations are happening.

    • Similar-looking effects are not the "same" effect.

      "Change always triggers backlash" does not imply "all backlash is unwarranted."

      > What will matter is the message or idea you're communicating. Just like today, it doesn't matter if the code is yours, only the product you're shipping and problem it's solving.

      But like the article explains about why it's rude: the less thought you put into it, the less chance the message is well communicated. The less thought you put into the code you ship, the less chance it will solve the problem reliably and consistently.

      You aren't replying to "don't use LLM tools"; you're replying to "don't just trust and forward their slop blindly."

    • Doesn't matter today? What are you even talking about? It completely matters if the code you write is yours. The only people saying otherwise have fallen prey to the cult of slop.


  • That is indeed the crux of it. If you write me an inane email, it’s still you, and it tells me something about you. If you send me the output of some AI, have I learned anything? Has anything been communicated? I simply can’t know. It reminds me a bit of the classic philosophical thought experiment "If a tree falls in a forest and no one is around to hear it, does it make a sound?" Hence the waste of time the author alludes to. The only comparison to email that makes any sense in this case are the senseless chain mails people used to forward endlessly. They have that same quality.

  • Some people do put their own words into the LLM and just clean them up.

    The result stays much closer to how they actually write.

  • Which words, exactly, are "yours"? Working with an LLM is like having a copywriter 24/7, who will steer you toward whatever voice and style you want. Candidly, I'm getting the sense the issue here is some junior varsity level LLM skill.

I can see the similarity, yes! Although I do feel the distance between a handwritten letter and an email is shorter than between an email and an LLM-generated email. There's some line it crossed. Maybe it's that email provided some benefit to the reader too. Yes, there's less character, but you receive it faster, and you can easily save it, copy it, or attach a link or a picture. You may even get lucky and receive an .exe file as a bonus! An LLM provides no benefit to the reader, though; it just wastes their resources on yapping that no human cared to write.

Just be a robot. Sell your voice to the AI overlords. Sell your ears and eyes. Reality was the scam; choose the Matrix. I choose the Matrix!

Same thing with photography and painting. These opinionated pieces present a false dichotomy that propagates into argument, when what we actually have is a tunable dial rather than a switch: we can appropriately increase or decrease our consideration, time, and focus along a spectrum instead of treating it as on or off.

I value letters far more than emails, pouring out my heart and complex thought to justify the post office trip and even the postage stamp. Heck, why do we write birthday cards instead of emails? I hold a similar attitude toward LLM output and writing; perhaps the more analogous comparison is between painting and photography. I'll glance at LLM output, but reading intentional thought (especially in a letter) is how I infer who the sender is through their content. So if you want to send me a snapshot or a fact, I'm fine with LLM output; but if you're painting me a message, your actual brushstrokes are more telling than the photo itself.

Letters had a time and potential money cost to send. And most letters don't need to be personalized to the point where we need handwriting to justify them.

>Change is inevitable. Most people just won't like it.

People love saying this without ever taking the time to consider whether the change is good or bad. Change for change's sake is called chaos, and I don't think chaos is inevitable.

>And honestly, my ego doesn't like it because I used to think I was more intelligent than them. Turns out I wasn't. It was just my perception, based on the fact that I speak the language natively.

I don't think I'd ever heard that argument until now. And frankly, it says more about the arguer than about the subject or LLMs.

Have you considered 3) LLMs lack context and can output wrong information? If you're spending more time correcting the machine than communicating, we're just adding more bureaucracy to the mix.

One thing: it's less about change and more about quality vs. quantity, and both have their place.

I mean, that's fine, but the right response isn't all this moral negotiation; it's simply to point out that it's not hard to have Siri respond to things.

So have your Siri talk to my Cortana and we'll work things out.

Is this a colder world, or just old people not understanding the future?

  • It's a demonstration by absurdity that this is not the future. You're describing the collapse of all value.