Comment by britch

2 years ago

The problem with LLMs, in my view, is that they're capped at what already exists.

The trouble with using them for "creative" things is that they can only parrot things back in the statistically average way, or at best attempt to echo an existing style.

Copilot cannot use something because it prefers it, or because it thinks it's better than what's common. It can only repeat what is currently popular (and that will likely be self-reinforced over time).

When you write prose or code you develop preferences and opinions. "Everyone does it this way, but I think X is important."

You can take your learning and create a new language or framework based on your experiences and opinions working in another.

You develop your own writing style.

LLMs cut out this chance to develop.

---

Images, prose, (maybe) code are not the result of computation.

When two different people compute the same thing, they get the same answer. When I ask different people to write the same thing, I get wildly different answers.

Sure ChatGPT may give different answers, but they will always be in the ChatGPT style (or parroting the style of an existing someone).

"ChatGPT will get started and I'll edit my voice into what it generated" is not how writing works.

It's difficult for me to see how a world where people communicate back and forth in the most statistically likely manner is a good one.

All artists of every stripe have studied other art, have practiced what came before, and have influences. What do you think they do in art school? They copy what came before. The old masters had understudies who learned their style. Isn't it an old saying in art that 'there is nothing original'? Everything was based on something.

Humans are also regurgitating what they 'inputted' into their brains. For programming, isn't it an old joke that everyone just copy/pastes from Stack Overflow?

Why, if an AI does it (copy/paste), is it somehow a lesser accomplishment than when a human does it?

  • > Why, if an AI does it (copy/paste), is it somehow a lesser accomplishment than when a human does it?

    Because the kind of 'art' the AI will create will end up in a Canva template; it will be clip art for the modern Powerpoint or Facebook ad. Because corporations like Canva are the only ones that will pay the fees to use these tools at scale. And all they produce is marketing detritus, which is the opposite of art.

    Instead of the "Corporate Memphis" art style that's been run into the ground by every big tech company, AI will produce similarly bland, corporate-approved graphics that we'll continue to roll our eyes at.

  • It's a fair point.

    My concern is with the limits this places on the creation of new styles.

    I guess my view is that you send 100 people to art school and you get 100 different styles out of it (ok maybe 80).

    With AI you've got a handful of dominant models instead of a unique model for each person based on life experience.

    Apprentices learn and develop into masters. If that work is all moved to an LLM, where do the new masters come from?

    ---

    I take your point about the technology. I have a hard time saying it's not impressive or similar to how humans learn.

    My concern is more with what widespread adoption will mean.

The style can be influenced, however. It isn't unreasonable to imagine an AI that fine-tunes the style of the LLM's output to meet whatever metric you're after.

As far as creativity goes, human creativity is also a product of life experiences; artistic styles are always influenced by others, etc.