Comment by stoneyhrm1

6 months ago

I mean you can boil anything down to its building blocks and make it seem like it didn't 'decide' anything. When you as a human decide something, your brain and its neurons just made some connections, with an output signal sent to other parts that resulted in your body 'doing' something.

I don't think LLMs are sentient or any bullshit like that, but I do think people are too quick to write them off before really thinking about how a neural network 'knows' things in a way similar to how a human 'knows' things: it is trained and reacts to inputs and outputs. The body is just far more complex.

I wasn't talking about knowing (they clearly encode knowledge); I was talking about thinking/reasoning, which is something LLMs do not in fact do, IMO.

These are very different things, and knowledge is not intelligence.

  • To me all of those are so vaguely defined that arguing whether an LLM is "really" doing something is kind of a waste of time.

    It's like we're clinging to things that make us feel like human cognition is special, so we say LLMs aren't "really" doing it, while never defining what "it" actually is.