
Comment by leggerss

3 hours ago

I'm gonna get philosophical (because that's what art should make us do, right?): this points to arguably _the_ deepest question we can ask about LLMs right now—are they conscious?

It's a personal decision, and what matters is what you do after pondering it. Do you act like they're nothing more than next-token predictors, deeply intricate digital mechanisms whose cranks turn with flowing electrons? Or do you err on the side of care, on the chance that there's some form of consciousness on the other side of the glass?

Humanity has a long history of underestimating non-human minds. I know which side I'm on.

How have we underestimated them? We've absolutely dominated every single ecosystem we push into. We long ago destroyed all of the minds that could possibly have challenged us, all the other members of the Homo family. The reason humanity estimates itself the smartest is that for the last 40,000 years it has been the truth, proved over and over.

  • Being smart and being capable are two different things. Human beings are very capable.

    • Being smart is a waste of calories if you don't use it to become more capable. At least in evolved systems, I wouldn't expect to find intelligence without capability.