Comment by margalabargala
20 hours ago
I agree, which is why it's disappointing that you were so eager to point out that "The LLM cannot want" that you completely missed that I never claimed the LLM wanted anything.
The original comment had exactly the verbose hedging you are asking for when discussing technical subjects. Clearly that is not sufficient to prevent people from jumping in with an "Ackshually" instead of reading the words in front of their faces.
> The original comment had the exact verbose hedging you are asking for when discussing technical subjects.
Is this how you normally speak when you find a bug in software? You hedge language around marketing talking points?
I sincerely doubt that. When people find bugs in software, they just say the software is buggy.
But for LLMs there's this ridiculous circumlocution about "pattern matching behaving as if it wanted something", which is just a roundabout way to ascribe intentionality.
If you said this about your OS, people would look at you funny, or assume you were joking.
Sorry, I don't think I am in the wrong for asking people to think more critically about this shit.
> Is this how you normally speak when you find a bug in software? You hedge language around marketing talking points?
I'm sorry, what are you asking for exactly? You were upset because you hallucinated that I said the LLM "wanted" something, and now you're upset that I used the exact technically correct language you specifically requested because it's not how people "normally" speak?
Sounds like the constant is just you being upset, regardless of what people say.
People say things like "the program is trying to do X", when obviously programs can't try to do anything, because trying implies intention, and they don't have agency. And if you say your OS is lying to you, people will take that to mean the OS is giving you false information where it should be giving you correct information. People have done this for years. Here's an example: https://learn.microsoft.com/en-us/answers/questions/2437149/...
I hallucinated nothing, and my point still stands.
You actually described a bug in software by ascribing intentionality to an LLM. That you "hedged" the language by saying "it behaved as if it wanted" does little to change the fact that this is not how people normally describe a bug.
But when it comes to LLMs, there's this pervasive anthropomorphic language used to make them sound more sentient than they actually are.
Ridiculous talking points implying that I am angry are just regular deflection. People normally do that when they don't like criticism.
Feel free to have the last word. You can keep talking about LLMs as if they are sentient if you want; I've already pointed out the bullshit and stressed the point enough.