Comment by yhoiseth

17 hours ago

Sarkar argues that “AI shaming arises from a class anxiety induced in middle class knowledge workers, and is a form of boundary work to maintain class solidarity and limit mobility into knowledge work.”

I think there is at least some truth to this.

Another possible cause of AI shaming is that reading AI writing feels like a waste of time. If the author didn’t bother to write it, why should I bother to read it and provide feedback?

This latter piece is something I am struggling with.

I have spent 10+ years working on teams primarily composed of people whose first language is not English, in workplaces where English is the mandated business language. Ever since the earliest LLMs started appearing, the written communication of non-native speakers has become much clearer grammatically, but also much more florid and pretentious than they intended. This is genuinely annoying to read, because you have to mentally decode the LLM-ness of their comments/messages/etc. back into normal English, which costs more cognitive overhead than their formerly blunt and/or broken English did. But I also understand that, from their perspective, it saves cognitive effort to type a vague notion into an LLM and ask for a block of well-formed English.

So, in some way, this fantastic tool for translation is resulting in worse communication than we had before. It's strange and I'm not sure how to solve it. I suppose we could use an LLM on the receiving end to decode the rambling mess the LLM on the sending end produced? This future sucks.

  • It's normal: you add a layer between the two brains that are communicating, and that layer adds only statistical averaging to the message.

    I write letters to my gf in English, even though English is not our first language. I would never put an LLM between us: it would fall flat, erase who we are, make a mess of our cultural references, and just not be interesting to read, even if it might sound more native, in the style of Barack Obama or Prince Charles...

    LLMs are going to make people as dumb as GPS made them. Except that, while reading a map was never a very useful skill, writing what you feel... should be.

  • I thought about this too. I think the solution is to send both the prompt and the output, since the output was itself selected by the human from potentially multiple variants.

    Prompt: I want to tell you X

    AI: Dear sir, as per our previous discussion let's delve into the item at hand...

If knowledge work doesn't require knowledge then is it knowledge work?

The main issue with current AI is that without knowledge (at least at some level) you can't validate its output.

> why should I bother to read it and provide feedback?

I like to discuss a topic with an LLM and generate an article at the end. It is more structured and better worded but still reflects my own ideas. I only post these articles on a private blog; I don't pass them off as my own writing. But I find the exercise useful because I use LLMs as a brainstorming and idea-debugging space.

> If the author didn’t bother to write it, why should I bother to read it

There is an argument that luxury goods are valuable because they are typically handmade: what you are buying is not the item itself, but the untold hours "wasted" creating that item for your own exclusive use. In a sense you are "renting a slave" - you have control over another human's time, and that is a power trip.

You have expressed it perfectly: "I don't care about the writing itself, I care about how much effort a human put into it"

  • If effort wasn’t put into it, then the writing cannot be good except by accident or by theft, in which case it is not your writing.

    If you want to court me, don’t ask Cyrano de Bergerac to write poetry and pass it off as your own.

    • > If effort wasn’t put into it, then the writing cannot be good

      This is what people used to say about photography versus painting.

      > pass it off as your own.

      This is misleading/fraud, and a separate subject from the quality of the writing.