Comment by marcellus23
1 day ago
It's very easy to say, "well, of course, a thing that looks like a duck, swims like a duck, and quacks like a duck is not necessarily a duck." But when you're presented with something indistinguishable from a duck in every way, how do you determine whether it's a duck? You can't just say "well, I know it's not a duck." That's dodging the question.
Well. AI doesn't walk or quack like a duck.
Ask it to count the first two hundred numbers in reverse while skipping every third number, then check whether the output is actually in sequence.
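For what it's worth, the task itself is trivially mechanical, which is part of the point. A few lines of Python do it reliably, under one possible reading of "skipping every third number" (the phrasing is ambiguous; here I skip every third number counted from the start of the countdown):

```python
# One reading of the task: count down 200..1, dropping every third
# number counted from the start of the countdown (keep 200, 199; skip 198; ...).
def countdown_skip_third(n=200):
    return [x for i, x in enumerate(range(n, 0, -1)) if (i + 1) % 3 != 0]

def is_strictly_decreasing(seq):
    # "In sequence" here means each number is smaller than the one before it.
    return all(a > b for a, b in zip(seq, seq[1:]))

seq = countdown_skip_third()
print(seq[:6])                      # [200, 199, 197, 196, 194, 193]
print(is_strictly_decreasing(seq))  # True
```

A human or an LLM doing this by hand has to keep a running count and a skip counter in sync for ~134 outputs, which is exactly the kind of sustained bookkeeping both tend to fumble.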
Check the car wash examples on YouTube.
If I picked a human off the street and asked them to "count first two hundred numbers in reverse while skipping every third number and check if they are in sequence", I bet most would screw up.
My point is not that current LLMs are sentient, or even that LLMs ever could be. My point is that it's very difficult to come up with a way to test for consciousness, and it makes me a bit nervous to see people suggest that something could never be conscious just because it's technological rather than biological.