Comment by naasking

4 days ago

You're just assuming that mimicry of a thing is not equivalent to the thing itself. That's true of physical systems (simulated water doesn't get you wet!) but not of information systems (simulated intelligence is intelligence!).

> You're just assuming that mimicry of a thing is not equivalent to the thing itself.

I'm not assuming that, that's literally the definition of mimicry: to imitate closely.

You might say I'm assuming that it is mimicking and not actually thinking, but there's no evidence it's actually thinking, and we know exactly what it IS doing, because we created the code we used to build the model. It's not thinking; it's doing math: mathematical transformations of data.

  • > It's not thinking; it's doing math: mathematical transformations of data

    Whatever thinking fundamentally is, it also has an equivalence as a mathematical transformation of data. You're assuming the conclusion by saying that the two mathematical transformations of data are not isomorphic.

    A simulation of information processing is still information processing, just like running Windows in a QEMU VM is still running Windows.
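    The VM analogy can be made concrete with a toy sketch (plain Python, nothing LLM-specific): a computation run one interpretive layer down produces the same result as the direct one.

```python
# A computation, and a "simulation" of it run one layer down inside
# eval() of a source string. Both are information processing, and both
# produce the same result.
direct = sum(range(10))              # runs directly
simulated = eval("sum(range(10))")   # runs inside an interpreted layer

assert direct == simulated == 45
```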

    • > Whatever thinking fundamentally is, it also has an equivalence as a mathematical transformation of data.

      Do not confuse the mathematical description of physical processes with the claim that the world is actually made of math.

      > You're assuming the conclusion by saying that the two mathematical transformations of data are not isomorphic.

      Correct. They're not isomorphic. One is simple math running on electrified sand; the other is an unknown process that developed independently over a billion years. Nothing we're doing with AI today is even close to real thought. There are a billion trivial demonstrations that make the rounds as memes, like miscounting the Rs in "strawberry", being unable to count, etc.

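As an aside, the strawberry example referenced above is trivially checkable; the sketch below just counts the letters deterministically.

```python
# Counting the letter "r" in "strawberry" with a deterministic one-liner,
# the kind of computation the meme contrasts with chat-model answers.
word = "strawberry"
print(word.count("r"))  # prints 3
```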

But a simulated mind is not a mind. This was already debated years ago with the aid of the Chinese Room thought experiment.

  • The Chinese Room experiment applies equally well to our own brains: in which neuron does the "thinking" reside, exactly? Searle's argument has been successfully argued against in many different ways. At the end of the day, you're either a closet dualist like Searle, or, if you take the more scientific, physicalist view (brains are made of atoms, and brains are sufficient for consciousness and minds), you're in the same situation as the Chinese Room: everything breaks down into tissues, neurons, molecules, atoms. Which atom knows Chinese?

    • The whole point of the experiment was to show that when we don't know whether something is a mind, we shouldn't assume it is, and that our intuition in this regard is weak.

      I know I am a mind inside a body, but I'm not sure about anyone else. The simplest explanation is that most other people are minds as well, given that we're the same species and I'm not special. You'll have to take my word on that; my only proof is that I refuse to be seen as anything else.

      In any case, LLMs most likely are not minds, for the simple reason that most of their internal state is static. What looks like a thoughtful reply is just the statistically most likely combination of words, produced by a function with a huge number of fixed parameters. There's no way for this construct to grow, or to wither, which is something we know minds definitely do. All it knows is the sequence of symbols it has received and how that maps to an output. It cannot develop itself in any way and is trained by a wholly separate process.

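The "static internal state" point can be sketched with a toy frozen next-word model (hypothetical bigram scores, not any real model's API): the output is a pure function of the input and the fixed parameters, so nothing grows or withers between calls.

```python
import math

# Toy "frozen" language model: the parameters (here, raw bigram scores)
# are fixed at build time and are only read, never written, at inference.
SCORES = {
    ("the", "cat"): 2.0,
    ("the", "dog"): 1.0,
    ("cat", "sat"): 3.0,
}
VOCAB = ("cat", "dog", "sat")

def next_token_probs(prev):
    """Softmax over the static scores: a pure function of its input."""
    logits = [SCORES.get((prev, w), 0.0) for w in VOCAB]
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return {w: e / total for w, e in zip(VOCAB, exps)}

# The same prompt always yields the identical distribution; no internal
# state changed between the two calls.
a = next_token_probs("the")
b = next_token_probs("the")
assert a == b
```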

  • > But a simulated mind is not a mind. This was already debated years ago with the aid of the Chinese Room thought experiment.

    Yes, debated and refuted. There are many well-known and accepted rebuttals of the Chinese Room, the best known being the systems reply: the Room as a whole does understand Chinese.

  • > But a simulated mind is not a mind.

    How would the mind know which one it is?

    Maybe your mind is being simulated right now.

    • > How would the mind know which one it is?

      I'm not assuming anything is a mind without hard proof; that's my only argument.

      > Maybe your mind is being simulated right now.

      I'm experiencing consciousness right now, so that would have to be a damn good simulation.