Comment by MostlyStable
15 days ago
I'm talking about consciousness because that's what the parent comment was making claims about. Their original claim was that LLMs are definitely not conscious. I responded that we don't understand consciousness well enough to make that claim. You responded that consciousness is not necessary for language. I don't dispute that claim, but it's irrelevant to both the original comment and my reply. In fact, I agree: I said that I think LLMs are likely not conscious despite their obvious language ability, so I clearly don't think language ability necessarily implies consciousness. I just don't think that fact, alone, is enough to disprove their consciousness.
You, and the research you advise I look into, are answering a totally different question (unless you are suggesting that research has in fact solved the question of what human consciousness is, how it works, etc., in which case I would love you to point me in its direction so I can read more).
I'm explaining that there is no need to question whether language production and consciousness imply one another, in either direction; there has for some time been sufficient research to demonstrate that they do not. I'm not giving you a detailed list of citations because the ones on Wikipedia are fine. Between the links and the search terms I've provided, I feel my responsibility to inform fully discharged, inasmuch as the notional horse has now been led to water.
That much, thankfully, does not require that "the hard problem of consciousness" [1] be solved. The argument you're trying to have does require such a solution, which is why you see me so assiduously avoiding it: I know very well how far above my pay grade that is. Good luck...
[1] https://en.wikipedia.org/wiki/Hard_problem_of_consciousness
And I'm saying that the question of whether language production and consciousness imply one another is orthogonal to the argument. My argument, in its simplest form, is that in order to confidently claim a non-human is not conscious, we would indeed need to solve the hard problem. We have not solved that problem, and therefore we should make no strong claims about the consciousness, or lack thereof, of any non-human.
I may have been imprecise in my original comment if it led you to believe that I considered language production the only important evidence. If so, I apologize for the imprecision; I don't think it's really that relevant.
Oh, I see. That's too broad a claim in my view, but I would agree we can't be certain without a general solution of the 'hard problem' - no more about LLMs than about humans; in the general case, we can't prove ourselves conscious either, which is the sort of thing that tends to drive consciousness researchers gradually but definitely up a wall. (But we've discussed Hoel already. To his credit, he's always been very open about his reasons for having departed academia.)
It sounds to me as though you might be getting at a concern less mechanistic than moral or ethical, and my advice in that case would be to address that concern directly. If you try to tell me that because LLMs produce speech they must be presumptively treated as if able to suffer, I'm going to tell you that's nonsense, as indeed I have just finished doing. If you tell me instead that they must be so treated because we have no way to be sure they don't suffer, I'll see no cause to argue. But I appreciate that making a compassionate argument for its own sake isn't a very good way to convince anyone around here.