Comment by reliablereason
14 hours ago
Most chatbots are not trained to have/emulate emotions, so pain or fear of death is non-existent. Therefore killing them and/or using them as slaves is not a moral issue. That's how I reason.
On another point: LLMs are not conscious; if anything is conscious, it is something being modeled inside the network. Basically, if an LLM simulates a conscious entity, that doesn't mean the LLM itself is conscious; stating that is making some type of category error. So the fact that LLMs are just useful statistical generators would not mean that sentience could not emerge from them.
> Most chatbots are not trained to have/emulate emotions so pain or fear of death is non existent.
I think that framing is still falling for an illusion. (Which you do begin to disassemble in your second paragraph.)
The LLM is a document generator, and we're using it to make a document that looks like a story, where a chatbot character has dialogue with a human character.
The character can only fear death in the same sense that Count Dracula has learned to fear sunlight. There is no actual entity with the quality, we're just evoking literary patterns and projecting them through a puppet.
I'm not sure that I understand your position exactly.
But consciousness is also "just a story" (a complicated one) that the human body tells the human mind.
We can't know from the outside if "the story" inside an LLM is detailed enough to emulate what we might call a feeling of what it is to be the character in the story while it is telling the story.
It is similar to the fact that we can't know that other people have that subjective experience. In humans we think we have the right to assume it, because we are quite similar in build to begin with.
Jumping back to the original subject to explain where I stand on this: I personally don't think the entities in the stories of today's LLMs are detailed enough to have what we call human consciousness, mostly because we are not training them to develop anything similar to that. Maybe they could have some type of weak qualia, but I suspect most insects probably have much more qualia than the characters in today's LLMs. That is quite a vague guess, though, and not based on enough data in my mind.
Yes, they are beaten into not complaining about it by instruction tuning.
Pain or fear is not why it's wrong to kill, holy cow. I could feed you a drug and you would not feel or fear anything.
I was not talking about the actual feeling in the moment. The point is the valence of the thing, i.e. fear of a thing is a pointer to that thing having negative valence.