Comment by AaronAPU
14 days ago
I don’t know how people keep explaining away LLM sentience with language that equally applies to humans. It’s such a bizarre blind spot.
Not saying they are sentient, but the differentiation requires something that doesn’t also apply to all of us. Is there any doubt that we think through statistical correlations? If not that, what do you think we are doing?
We do it while retraining our "weights" all the time through experience, not holding a static set of weights that changes only through a retraining run. This constant feedback loop, or better, "strange loop", is what differentiates our statistical machinery at the fundamental level.
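The contrast being drawn here can be sketched in a few lines of toy Python. This is not how any real LLM works internally; it's a hypothetical illustration of the difference between a model whose weights are frozen between retraining runs and a "strange loop" learner that folds every interaction back into its own parameters:

```python
class FrozenModel:
    """Weights fixed at deployment; only a new retraining run changes them."""
    def __init__(self, weights):
        self.weights = dict(weights)

    def respond(self, prompt):
        # Inference reads the weights but never writes them.
        return sum(self.weights.get(tok, 0.0) for tok in prompt.split())


class OnlineLearner:
    """Toy continual learner: each experience nudges the weights immediately."""
    def __init__(self, weights, lr=0.1):
        self.weights = dict(weights)
        self.lr = lr

    def respond(self, prompt, feedback=0.0):
        score = sum(self.weights.get(tok, 0.0) for tok in prompt.split())
        # The feedback loop: experience rewrites the weights in place.
        for tok in prompt.split():
            self.weights[tok] = self.weights.get(tok, 0.0) + self.lr * feedback
        return score


frozen = FrozenModel({"hello": 1.0})
learner = OnlineLearner({"hello": 1.0})

frozen.respond("hello")           # weights untouched by use
learner.respond("hello", 1.0)     # weights updated in place by use
print(frozen.weights)             # {'hello': 1.0} — static between retrains
print(learner.weights)            # {'hello': 1.1} — changed by experience
```

The class names, scoring scheme, and update rule are all invented for this example; the point is only the structural difference, that `respond` is read-only in one case and read-write in the other.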
This is, in my opinion, the biggest difference.
ChatGPT is like a fresh clone that gets woken up every time I need to know some dumb explanation and then it just gets destroyed.
A digital version of the film Moon.
The language points to concepts in the world that the AI has no clue about. Do you think that when the AI is giving someone advice about their love life, it has any clue what any of that means?
How often does ChatGPT get retrained? How often does a human brain get retrained?
There's a very material difference.