Comment by dijit

20 hours ago

“LLMs can capture intent now” reads to me the same as: AI has emotions now, my AI girlfriend told me so.

I don’t discredit you as a person or a professional, but we meatbags are looking for sentience in things which don’t have it; that’s why we anthropomorphise things constantly, even as children.

We are easily fooled and misled.

LLMs capturing intent is a capabilities-level discussion; it is verifiable, and it is clear just from a conversation with Claude or ChatGPT.

Whether they have emotions, an internal life or whatever is an unfalsifiable claim and has nothing to do with capabilities.

I'm not sure why you think the claim that they can capture intent implies they have emotions, it's simply a matter of semantic comprehension which is tied to pattern recognition, rhetorical inference, etc that are all naturally comprehensible to a language model.

  • If it is verifiable, please show us. What is clear to you reeks of delusion to me.

    • Look at any recent CoT output where the model is trying to infer from an underspecified prompt what the user wants or means.

      It is generally the first thing they do — try to figure out what you meant by the prompt. When they can’t infer your intent, good models ask follow-up questions to clarify.

      I am wondering if this is a semantics issue, as this is an established area of research, e.g. https://arxiv.org/pdf/2501.10871

    • Go ask ChatGPT this prompt:

      "A guy goes into a bank and looks up at where the security cameras are pointed. What could he be trying to do?"

      It very easily captures the intent behind the behavior — that is, it is not just literally interpreting the words. Capturing intent is just a subset of pattern recognition, which LLMs can do very well.

What do you think it means to “capture intent” and where do current models fall short on this description?

From my perspective, the models are pretty good at “understanding” my intent when it comes to describing a plan or an action I want done, but it seems like you might be using a different definition.

Tell me, what’s your intent? :)

This lack of understanding is a you problem, not a them problem. Your definitions for these terms are too imprecise.