Comment by HDThoreaun
6 months ago
Hard problem of consciousness seems way harder to solve than the easy one, which is a purely engineering problem. People have been thinking about why the brain thinks for a very long time and so far we have absolutely no idea.
> People have been thinking about why the brain thinks for a very long time and so far we have absolutely no idea
I'm not sure what you mean by this.
I think there is a pretty large consensus that our neocortex is a prediction machine (predicting future observations/outcomes from past experience), and the reason WHY it would have evolved this way is that there is an obvious, massive survival benefit in successfully predicting how predators and prey will react ahead of time, what the outcome of your own actions will be, etc. Prediction frees you from being stuck in the present, reacting to things as they happen, and lets you plan ahead.
Thinking (reasoning/planning) is just multi-step prediction - see the sketch below.
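To make that concrete, here's a minimal toy sketch (everything in it is hypothetical - the function names, the trivial integer dynamics, the goal - none of it is any real system): a one-step predictor rolled forward over candidate action sequences, which is all "planning" amounts to under this view.

    # Toy sketch: planning as multi-step prediction.
    # All names and dynamics here are made up for illustration.
    import itertools

    def predict_next(state, action):
        # Stand-in for a learned one-step world model; here, trivial
        # deterministic toy dynamics (state is an int, action is +/-1).
        return state + action

    def plan(state, actions=(-1, 1), horizon=3, goal=5):
        # Multi-step prediction: roll the one-step model forward over
        # every action sequence, pick the one with the best end state.
        best_seq, best_err = None, float("inf")
        for seq in itertools.product(actions, repeat=horizon):
            s = state
            for a in seq:
                s = predict_next(s, a)  # one prediction step at a time
            err = abs(goal - s)
            if err < best_err:
                best_seq, best_err = seq, err
        return best_seq

    print(plan(state=0))  # (1, 1, 1): three predicted steps toward the goal

A real agent would use a learned, uncertain world model instead of this toy arithmetic, but the loop structure - predict, step, compare - is the same.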
I don't think consciousness is the big deal most people think it is - it seems to be just the ability to self-observe (which helps with self-prediction). But if we somehow built AGI that wasn't conscious, then who cares?
Not why it was created - why the systems in the brain lead to consciousness. Your option B requires understanding not just mechanically how consciousness appears, but the fundamental reason why it appears at all. If we just understand the mechanics, all we can confidently do is work toward a more and more accurate representation of the brain. AGI without consciousness gets speculated about, but it's hard for me to believe in.
As noted, consciousness seems to just be the ability to self-observe, which is useful as another predictive input.
I would expect that all intelligent animals are conscious, and that any AI we build with a roughly brain-like architecture in terms of connections, looping, and being prediction-based would also report itself to be conscious and describe a similar subjective experience. LLMs seem much too simple - just a layer-wise, pass-through data flow, with no loops - to be conscious; the sketch below illustrates the contrast.
Some of the neural connections supporting consciousness may have evolved, or been enhanced, because of the evolutionary value of better self-prediction (i.e. that would be the "why"), but as noted I expect it basically "comes for free" with any sufficiently complete cognitive/sensory architecture.
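Purely as a toy illustration of the structural difference (these few lines stand in for whole architectures, and every name here is made up): a one-shot feed-forward pass versus a loop that feeds the system's own prior state back in as an extra "self-observation" input.

    # Toy contrast, all hypothetical: feed-forward vs. looped self-observation.

    def feed_forward(x, layers):
        # One-shot, layer-wise pass-through: data flows one direction only,
        # and nothing in the stack ever sees its own earlier output.
        for layer in layers:
            x = layer(x)
        return x

    def looped(x, step, state=0.0, iters=5):
        # Recurrent self-observation: each step sees the input AND the
        # system's own prior state, so it can "observe itself" over time.
        for _ in range(iters):
            state = step(x, state)
        return state

    layers = [lambda v: v * 2, lambda v: v + 1]
    print(feed_forward(3.0, layers))                # 7.0: no memory of itself
    print(looped(3.0, lambda x, s: 0.5 * (x + s)))  # state updates by re-reading itself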