Comment by mikemarsh

6 months ago

The idea of replicating a consciousness/intelligence in a computer seems to fall apart even under materialist/atheist assumptions: what we experience as consciousness is a product of a vast number of biological systems, not just neurons firing or words spoken/thought. Even considering something as basic as how fundamental bodily movement is to mental development, or how hormones influence mood and thereby thought, how could anyone ever hope to replicate such things via software in a way that "clicks" to add up to consciousness?

Conflating consciousness and intelligence is going to hopelessly confuse any attempt to understand if or when a machine might achieve either.

(I think there's no reasonable definition of intelligence under which LLMs don't possess some, setting aside arguments about quantity. Whether they have or in principle could have any form of consciousness is much more mysterious -- how would we tell?)

  • Defining machine consciousness is indeed mysterious; at the end of the day it depends more on how much faith one puts in science fiction than on any objective measure.

    • Seems like a philosophy question, with maybe some input from neuroscience and ML interpretability. I'm not sure what faith in science fiction has to do with it.

I don't see a strong argument here. Are you saying there is a level of complexity involved in biological systems that cannot be simulated? And if so, who says sufficient approximations and abstractions aren't enough to reproduce the emergent behavior of those systems?

We can simulate weather (poorly) without modeling every hydrogen atom interaction.

  • The argument is about causation or generation, not simulation. Of course we can simulate just about anything, I could write a program that just prints "Hello, I'm a conscious being!" instead of "Hello, World!".
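    The sketch below (hypothetical, just to make the point concrete) shows how trivially a program can produce the outward token of consciousness without any of the underlying causal machinery:

```python
# A program that "simulates" consciousness in the same trivial sense that
# printing "Hello, World!" simulates a greeting: it emits the right words
# while nothing consciousness-like is happening inside.
def simulate_consciousness():
    return "Hello, I'm a conscious being!"

print(simulate_consciousness())
```

    The output alone cannot distinguish this program from a genuinely conscious speaker, which is exactly the gap between simulating a phenomenon and causing it.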

    The weather example is a good one: you can run a program that simulates the weather in the same way my program above (and LLMs in general) simulate consciousness, but no one would say the program is _causing_ weather in any sense.

    Of course, it's entirely possible that more and more people will be convinced AI is generating consciousness, especially when tricks like voice or video chat with the models are employed, but that doesn't mean that the machine is actually conscious in the same way a human body empirically already is.

    • >but that doesn't mean that the machine is actually conscious in the same way a human body empirically

      Does it matter? Is a dog/cow/bird/lizard conscious in the same way a human is? We're built from the same basic parts, and yet humans seem to have a higher state of consciousness than other animals around us.

      For example the definition of the word conscious is

      >aware of and responding to one's surroundings; awake.

      I'll give that we likely mean this in a general sense, but I'd say we're pretty close to this with machines. They can observe the real world with sensors of different types, and then either directly compute, or use neural nets to make generalized decisions on what is occurring around them, then proceed to act on those observations.

  • If you simulate rainy weather, does anything get wet?

    (Not my original quote, but can't remember right now where I read it.)

    It's similar to asking whether a silicon computer performing intelligent tasks is "conscious".

  • I guess it depends, can you tell the difference between a weather simulation and the actual world?

    • Can you?

      You have weather readouts. One set is from a weather simulation: a simulated planet with a simulated climate. The other is real recordings from the same place on the same planet, taken by real weather-monitoring probes. They have the same starting point, but diverge over time.

      Which one is real, though? Would you be able to tell?


Bundling up consciousness with intelligence is a big assumption, as is the assumption that panpsychism is incorrect. You may be right on both counts, but you can't just make those two assumptions as a foregone conclusion.

AGI won't replicate our experience.

But it could be more powerful than us.

  • Honestly, replicating our experience would be rather wasteful. Much like building planes that flap their wings like birds in order to carry cargo.