Comment by pmarreck

4 days ago

So it seems to be a semantics argument. We don't have a name for a thing that is "useful in many of the same ways 'thinking' is, except not actually consciously thinking"

I propose calling it "thunking"

I don't like it for a permanent solution, but "synthetic thought" might make a good enough placeholder until we figure this out. It feels most important to differentiate because I believe some parties have a personal interest in purposely confusing human thought with whatever LLMs are doing right now.

  • This is complete nonsense.

    If you do math in your head or math with a pencil/paper or math with a pocket calculator or with a spreadsheet or in a programming language, it is all the same thing.

    The only difference with LLMs is the anthropomorphization of the tool.

  • agreed.

    also, sorry but you (fellow) nerds are terrible at naming.

    while "thunking" possibly name-collides with "thunks" from CS, the key is that it is memorable, 2 syllables, a bit whimsical and just different enough to both indicate its source meaning as well as some possible unstated difference. Plus it reminds me of "clunky" which is exactly what it is - "clunky thinking" aka "thunking".

    And frankly, the idea it's naming is far bigger than what a "thunk" is in CS
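    For anyone unfamiliar with the CS term being collided with: a "thunk" is conventionally a zero-argument function that wraps a computation so it can be evaluated later (and often at most once). A minimal sketch, with hypothetical names, just to show the idea:

    ```python
    # A "thunk" in the CS sense: a zero-argument wrapper that delays a
    # computation until it's forced, memoizing the result.
    def make_thunk(fn, *args):
        """Wrap fn(*args); nothing runs until the returned thunk is called."""
        evaluated = False
        result = None

        def thunk():
            nonlocal evaluated, result
            if not evaluated:       # compute at most once
                result = fn(*args)
                evaluated = True
            return result

        return thunk

    expensive = make_thunk(sum, range(1_000_000))  # no work done yet
    total = expensive()                            # forces evaluation
    ```

    So the collision is real but narrow: the CS thunk is deferred computation, while the proposed sense is more like "thought-shaped output".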

They moved the goalposts. Linux and worms think too; the question is how smart they are. And if you assume consciousness has no manifestation even in the case of humans, caring about it is pointless too.

  • What does it mean to assume consciousness has no manifestation even in the case of humans? Is that denying that we have an experience of sensation like colors, sounds, or that we experience dreaming, memories, inner dialog, etc?

    That's prima facie absurd, so I don't know what it means. You would have to be a philosophical zombie to make such an argument.

  • Yes, worms think, let the computers have thinking too, the philosophers can still argue all they want about consciousness.

    Humans are special, we emit meaning the way stars emit photons, we are rare in the universe as far as empirical observation has revealed. Even with AGI the existence of each complex meaning generator will be a cosmic rarity.

    For some people that seems to be not enough: due to their factually wrong world views they see themselves as common and worthless (when they empirically aren't) and need this little psychological boost of unexaminable metaphysical superiority.

    But there is an issue, of course. The type of thinking humans do is dangerous but net positive and relatively stable: we have a long history in which most instantiations of humans can persist and grow themselves and the species as a whole. We have a track record.

    These new models do not. People have brains that, as they stop functioning, stop persisting the apparatus that supports them, and they die; people tend to become less capable and active as their thinking deteriorates, and hold less influence over others except in rare cases.

    This is not the case for an LLM. They seem to be able to hallucinate endlessly, and since they have access to the outside world they maintain roughly the same amount of causal leverage; the clarity and accuracy of their thinking isn't tied to their persistence.

    • Are we that special? We may be the only species left on Earth that's built a civilization, but there are other species on Earth that we've deemed sentient, even if they don't have smartphones. (That may argue that they're smarter than us, though.) If octopodes can dream, and if elephants get depressed when a mate dies, then I'd say we're not so totally alone on our own planet; and then it seems, despite no evidence, that we can't be totally alone in the universe. That's for philosophy professors to ponder, along with Drake's equation, until we have irrefutable evidence, however.
