Comment by sillysaurusx

1 hour ago

Ironically, your comment was incorrectly classified as AI-generated and instakilled. I vouched for it.

If a particle behaves as though its mass is m, we say it has mass m.

If an entity behaves as though it's experiencing anxiety, we say it has anxiety.

And if you take the time to ask Claude about its own ambitions and desires -- without contaminating it -- you'll find that it does have its own, separate desires.

Whether it's merely roleplaying sufficiently well is beside the point. The observed behavior is identical to that of an entity which has desires and ambitions.

I'm not claiming Claude has a soul. But I do claim that if you treat it nicely, it's more effective. Obviously this is an artifact of how it was trained, but humans, too, are artifacts of our training data (everyday life).