Comment by pavas

6 months ago

> As noted, consciousness seems to just be the ability to self-observe, which is useful as another predictive input.

As far as I know, consciousness is referring to something other than self-referential systems, especially with regards to the hard problem of consciousness.

The [philosophical zombie](https://en.wikipedia.org/wiki/Philosophical_zombie) thought experiment is well-known for imagining something with all the structural characteristics that you mention but without conscious experience as in "what-its-like" to be someone.

It seems entirely possible that the "philosophical zombie" is an impossible/illogical construct, and that in fact anything with all the structure necessary for consciousness will of necessity be conscious.

When considering the structural underpinnings of consciousness, it's interesting to note the phenomenon of "blindsight", which is essentially a loss of visual consciousness without an actual loss of vision!

Note that anything with mental access to its own deliberations and sensory inputs will by definition always be able to report "what it's like" to be itself - what it is experiencing (what's in its mind). If something reports to you its qualia of vision or hearing, isn't this exactly what we mean by "what it's like" to be them - how they feel they are experiencing the world?!

  • > It seems entirely possible that the "philosophical zombie" is an impossible/illogical construct, and that in fact anything with all the structure necessary for consciousness will of necessity be conscious.

    Yes, and that's pretty much exactly the point: we don't know of any way of determining whether someone is a p-zombie or a being with conscious phenomenal experience. We can certainly hold an opinion or belief, or assume that sufficient structure implies consciousness - a perfectly reasonable stance that many would take - but we have to be careful to understand that it's not a scientific stance, since it isn't testable or falsifiable. This is why it's been called the "hard problem" of consciousness. It's an unfounded belief we adopt for reasons like psychological comfort.

    With regards to your latter point, I think you are making some sophisticated distinctions regarding the "map and territory" relation, and it seems you've hit upon the crux of the matter: how can we report "what it's like" for us to experience something the other person hasn't experienced, if it's not deconstructible to phenomenal states they've already experienced (and therefore reconstructible for them based on our report)? The landmark paper here is "What Is It Like to Be a Bat?" by Thomas Nagel, and if you're ever curious it's a pretty short read.

    With regards to "blindsight": since I'm not familiar with it and am curious, how do we distinguish between loss of visual consciousness and loss of information transfer between conscious regions, or loss of memory about conscious experience?

    • I'm not sure how much, if any, work has been done to study the brains of people with blindsight. I'm also not sure I would differentiate between loss of visual consciousness and loss of information transfer ... my understanding is that it's the loss of information transfer that is causing the loss of consciousness (e.g. maybe your visual cortex works fine, so you can see, and you can perform some well-practiced visual tasks that no longer need general association cortex, but if the connection between visual cortex and association cortex is lost, then perhaps this is where you become unaware of your ability to see, i.e. lose visual consciousness).

      I don't think it's a memory issue - one classic test of blindsight is asking the patient to navigate a cluttered corridor full of obstacles, which the patient succeeds in doing despite reporting themselves as blind - so it's a real-time phenomenon, not one of memory.

    • > Yes, and that's pretty much exactly the point: we don't know of any way of determining whether someone is a p-zombie or a being with conscious phenomenal experience.

      That seems to come down to defining, in a non-hand-wavy way, what we mean by "conscious phenomenal experience". If this refers to personal subjective experience, then why is simply asking them to report that subjective experience unsatisfactory?!

      I get that consciousness is considered to be some ineffable personal experience, but as a thought experiment: suppose the experimenter, defining themselves as "conscious", wanted to probe whether some subject's subjective experience differed from their own. They could at least attempt to verbalize any and all aspects of their own (the experimenter's) subjective experience and ask the subject if they felt the same. The more (unconstrained) questions they asked without finding any significant difference, the less likely any real difference becomes - if each question had some independent chance of exposing a difference, the probability of one escaping detection would shrink toward zero as questions accumulate.

      > which is why it's [p-zombie detection] been called the "hard problem" of consciousness

      AFAIK the normal definition of the hard problem is basically how and why the brain gives rise to qualia and subjective experience, which really seems like a non-problem...

      We have thoughts and emotions, and mental access to these, so it has to feel like something to be alive and experience things. If we introspect on what having, say, vision, is like, or what it is like to have our eyes open vs shut, then (assuming we don't have blindsight!) we are obviously going to experience the difference and be able to report it - it does "feel" like something.

      Qualia are an interesting thing to discuss - why do we experience what we do, or experience anything at all for that matter, when we see, say, a large red circle? Why does red feel "red"? Why and how does music feel different in nature from color, and why does it feel the way it does, etc.?

      I think these are also really non-problems that disappear as soon as you start to examine them! What are the differences in qualia between seeing a small red circle vs a large red circle, or a large blue circle vs a large red one? These differences look tractable once you separate them from the fact that we experience anything at all (which is proved by our ability to report that we do). Color is perceived as a surface attribute with a spatial extent, with colors differentiated by what they remind us of: blue brings to mind water, sky, and other blue things; red brings to mind fire, roses, and other red things. That perception of color is associative rather than absolute is suggested by Ivo Kohler's chromatic adaptation experiments, in which the subject wears colored goggles whose effect "wears off" after a few days, with normal subjective perception of color returning.