Comment by bbarnett

7 days ago

This sort of discourse really grinds my gears. The framing of it, the conceptualization.

It's not creative at all, any more than taking the sum of text on a topic and throwing a dart at it. It's a mild, short step beyond a weighted random draw, and certainly not capable of any real creativity.
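To make the "weighted random" framing concrete, here is a toy sketch of temperature-scaled sampling over next-token candidates. The vocabulary, probabilities, and `sample_next` helper are invented for illustration; a real model computes these weights with a neural network, but the final draw is indeed a weighted random choice:

```python
import random

def sample_next(candidates, temperature=1.0):
    # Lower temperature sharpens the distribution toward the top choice;
    # higher temperature flattens it toward a uniform "dart throw".
    weights = [p ** (1.0 / temperature) for _, p in candidates]
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices([tok for tok, _ in candidates], weights=weights)[0]

# Hypothetical model output: candidate tokens with their probabilities.
candidates = [("the", 0.5), ("a", 0.3), ("one", 0.15), ("zero", 0.05)]
print(sample_next(candidates, temperature=0.7))
```

Whether that mechanism can or cannot amount to creativity is exactly what the rest of this thread disputes.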

Myriad HN enthusiasts chime in here with "Are humans any more creative?" and other blather. Well, that's a whataboutism, and doesn't detract from the fact that creative does not exist in the AI sphere.

I agree that you have to judge its output.

Also, sorry for hanging my comment here. Might seem over the top, but anytime I see 'creative' and 'AI', I have all sorts of dark thoughts. Dark, brooding thoughts with a sense of deep foreboding.

Point taken but if slushing up half of human knowledge and picking something to fit into the current context isn't creative then humans are rarely creative either.

> Well, that's a whataboutism, and doesn't detract from the fact that creative does not exist in the AI sphere.

Pointing out that your working definition excludes reality isn't whataboutism, it's pointing out an isolated demand for rigor.

If you cannot clearly articulate how human creativity (the only other type of creativity that exists) is not impugned by the definition you're using as evidence that creativity "does not exist in the AI sphere", you're not arguing from a place of knowledge. Your assertion is just as much sophistry as the people who assert it is creativity. Unlike them, however, you're having to argue against instances where it does appear creative.

For my own two cents, I don't claim to fully understand how human creativity emerges, but I am confident that all human creative works rest heavily on a foundation of the synthesis of the author's previous experiences, both personal and of others' creative works - and often more heavily the latter. If your justification for a lack of creativity is that LLMs are merely synthesizing from previous works, then your argument falls flat.

  • I'll play with your tack in this argument, although I certainly do not agree it is accurate.

    You're asserting that creativity is a meld of past experience, both personal and the creative output of others. Yet this really doesn't jibe, as an LLM does not "experience" anything. I would argue that raw knowledge is not "experience" at all.

    We might compare this to the university graduate, head full of books and data jammed therein, and yet that exceptionally well versed graduate needs "experience" in a job for quite some time before being of much use.

    The same may be true of learning how to do anything, from driving, to riding a bike, or just being in conversations with others. Being told things on paper (or as part of your baked-in, derived "knowledge store") means absolutely nothing in terms of actually experiencing them.

    Heck, just try to explain sex to someone before they've experienced it. No matter the literature, play, movie or act performed in front of them, experience is entirely different.

    And an AI does not experience the universe, nor is it driven by the myriad of human totality, from the mind o'lizard to the flora/fauna in one's gut. There is no motive driving it; for example, it does not strive to mate... something that drives all aspects of mammalian behaviour.

    So intertwined with the mating urge is human experience, that it is often said that all creativity derives from it. The sparrow dances, the worm wiggles, and the human scores 4 touchdowns in one game, thank you Al.

    Comparatively, an LLM does not reason, nor consider, nor ponder. It is "born" with full access to all of its memory store, has data spewed at it, searches, responds, and then dies. It is not capable of learning in any stream of consciousness. It does not have memory from one birth to the next, unless you feed its own output back at it. It can gain no knowledge, except from "context" assigned at birth.

    An LLM, essentially, understands nothing. It is not "considering" a reply. It's all math, top to bottom, all probability, taking all the raw info it has and just spewing what fits next best.

    That's not creative.

    Any more than Big Ben's gears and cogs are.

    • Experiences are not materially different from knowledge once they are both encoded as memories. They're both just encoded in neurons as weights in their network of connections. But let's assume there is some ineffable difference between firsthand and secondhand experience, which fundamentally distinguishes the two in the brain in the present.

      The core question here, then, is why you are so certain that "creativity" requires "experience" beyond knowledge, and why knowledge is insufficient. What insight do you have into the human mind that top neuroscientists lack that grants you this gnosis on how creativity definitely does and does not work?

      Because, if you'll permit me to be crude, some of the best smut I've read has been by people I'm certain have never experienced the act. Their writing has been based solely on the writings of others. And yet, knowledge alone is more than enough for them to produce evocative creative works.

      And, to really hammer in a point (please forgive the insulting tone):

      > It's all math, top to bottom, all probability, taking all the raw info it has and just spewing what fits next best.

      You are just biology, top to bottom, just electrical signals, taking all the raw info your nerves get, matching patterns and just spewing what fits next best.

      Calling LLMs "just math" -- that's not creative, it's part of your input that you're predicting fits the likely next argument.

      You didn't "reason, consider, or ponder" whether I would find that argument convincing or be able to immediately dismiss it because it holds no weight.

      You're simply being a stochastic parrot, repeating the phrases you've heard.

      ...Etcetera. Again, apologies for the insult. But the point I am continually trying to make is that all of the arguments everyone tries to make about it not reasoning, not thinking, not having creativity -- they all are things that can and do apply to almost every human person, even intelligent and articulate ones like you or me.

      When it comes down to it, your fundamental argument is that you do not believe that a machine can possibly have the exceptional qualities of the human mind, for some ineffable reason. It's all reasoning backwards from there. Human creativity must require human-like experience, the ability to grow, and a growing context cannot possibly suffice, because you've already decided on your conclusion.

      (Because, perhaps, it would be too unsettling to admit that the alien facsimile of intelligence that we've created might have actual intelligence -- so you refuse that possibility)

  • Agreed.

    "Whataboutism" is generally used to describe a more specific way of pointing out an isolated demand for rigor—specifically, answering an accusation of immoral misconduct with an accusation that the accuser is guilty of similar immoral misconduct. More broadly, "whataboutism" is a term for demands that morality be judged justly, by objective standards that apply equally to everyone, rather than by especially rigorous standards for a certain person or group. As with epistemic rigor, the great difficulty with inconsistent standards is that we can easily fall into the trap of applying unachievable standards to someone or some idea that we don't like.

    So it makes some sense to use the term "whataboutism" for pointing out an isolated demand for rigor in the epistemic space. It's a correct identification of the same self-serving cognitive bias that "whataboutism" targets in the space of ethical reasoning, just in a different sphere.

    There's the rhetorical problem that "whataboutism" is a derogatory term for demanding that everyone be judged by the same standards. Ultimately that makes it unpersuasive and even counterproductive, much like attacking someone with a racial slur—even if factually accurate, as long as the audience isn't racist, the racial slur serves only to tar the speaker with the taint of racism, rather than prejudicing the audience against its nominal target.

    In this specific case, if you concede that humans are no more creative than AIs, then it logically follows that either AIs are creative to some degree, or humans are not creative at all. To maintain the second, you must adopt a definition of "creativity" demanding enough to exclude all human activity, which is not in keeping with any established use of the term; you're using a private definition, greatly limiting the usefulness of your reasoning to others.

    And that is true even if the consequences of AIs being creative would be appalling.

I understand. I share the foreboding, but I try to subscribe to the converse of Hume's guillotine.