
Comment by andrewflnr

4 days ago

An agent is notably not stateless.

Yes, but the state is just the prompt and the text already emitted.
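
To make that concrete, here is a minimal sketch; call_model is a hypothetical stand-in for whatever chat-completion API the agent uses, and the DONE stop condition is arbitrary. The only state anywhere is the growing message list, i.e. the prompt plus the text already emitted:

    def call_model(messages: list[dict]) -> str:
        """Hypothetical stand-in for a real chat-completion API call."""
        return f"(model reply after {len(messages)} messages) DONE"

    def run_agent(system_prompt: str, user_goal: str, max_turns: int = 5) -> list[dict]:
        # The entire agent "state": the prompt plus everything emitted so far.
        messages = [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_goal},
        ]
        for _ in range(max_turns):
            reply = call_model(messages)  # the model itself keeps nothing between calls
            messages.append({"role": "assistant", "content": reply})
            if "DONE" in reply:  # arbitrary stop condition for the sketch
                break
        return messages  # the only state that ever existed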

You could assert that text can encode a state of consciousness, but that's an incredibly bold claim with a lot of implications.

  • It's a bold claim for sure, and not one that I agree with, but not one that's facially false either. We're approaching a point where we will stop having easy answers for why computer systems can't have subjective experience.

  • You're conflating state and consciousness. Clawbots in particular are agents that persist state across conversations in text files and optionally in other data stores (a minimal sketch of that pattern follows this sub-thread).

    • I am not sure how to define consciousness, but I can't imagine a definition that doesn't involve state or continuity across time.

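For reference, the persistence pattern mentioned above, keeping the agent's state in a plain text file between conversations, can be sketched as follows. The file name and JSON layout are illustrative assumptions, not any particular bot's actual format:

    import json
    from pathlib import Path

    STATE_FILE = Path("agent_state.json")  # illustrative location, not a real bot's path

    def load_state() -> list[dict]:
        """Reload the prior conversation, or start fresh if no file exists yet."""
        if STATE_FILE.exists():
            return json.loads(STATE_FILE.read_text())
        return []

    def save_state(messages: list[dict]) -> None:
        """Write the full message history so the next session can resume it."""
        STATE_FILE.write_text(json.dumps(messages, indent=2))

    # Across two separate sessions:
    messages = load_state()
    messages.append({"role": "user", "content": "pick up where we left off"})
    # ... call the model, append its reply ...
    save_state(messages)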