
Comment by Terr_

7 hours ago

> Most chatbots are not trained to have/emulate emotions so pain or fear of death is non existent.

I think that framing is still falling for an illusion. (Which you do begin to disassemble in your second paragraph.)

The LLM is a document generator, and we're using it to make a document that looks like a story, where a chatbot character has dialogue with a human character.

The character can only fear death in the same sense that Count Dracula has learned to fear sunlight. There is no actual entity with the quality, we're just evoking literary patterns and projecting them through a puppet.

Not sure that I understand your position exactly.

But consciousness is also "just a story" (a complicated one) that the human body tells the human mind.

We can't know from the outside if "the story" inside an LLM is detailed enough to emulate what we might call a feeling of what it is to be the character in the story while it is telling the story.

It is similar to the fact that we can't know that other people have that subjective experience. In humans, we think we have the right to assume it because we are quite similar in build to begin with.

Jumping back to the original subject to explain where I am on this: I personally don't think the entities in the stories of today's LLMs are detailed enough to have what we call human consciousness, mostly because we are not training them to develop anything similar to that. Maybe they could have some type of weak qualia, but I suspect most insects probably have much more qualia than the characters in today's LLMs. That is quite a vague guess, though, and not based on enough data, in my mind.