Comment by parineum
2 months ago
> Everybody knows LLMs are not alive and don't think, feel, want.
No, they don't.
There's a whole cadre of people who talk about AGI and self awareness in LLMs who use anthropomorphic language to raise money.
> We use this kind of language as a shorthand because ...
You, not we. You're using the language of snake oil salesmen because they've made it commonplace.
When the goal of the project is an anthropomorphic computer, anthropomorphizing language is really, really confusing.
This is true. I personally know people who think AI agents have actual feelings and know more than humans.
It's fucking insanity.
Tell them it's all linear algebra and watch their heads explode :>
Saying "linear algebra" to such people is about as effective as saying "abracadabra".