Comment by TimTheTinker
19 hours ago
> It's patently insane to demand that humans alter their behavior to accommodate the foibles of mere machines
Talking to chatbots is like taking a placebo pill for a condition. You know it's just sugar, but it creates a measurable psychosomatic effect nonetheless. Even if you know there's no person on the other end, the conversation still causes you to functionally relate as if there is.
So this isn't "accommodating foibles" of the machine; it's protecting ourselves from an exploit of a human vulnerability: we subconsciously tend to attribute intent, understanding, judgment, emotions, moral agency, etc. to LLMs.
Humans are wired to infer these from conversation alone, and LLMs are unfortunately able to exploit human conversation to leap compellingly over the uncanny valley. LLM engineering could hardly be better designed to clear the uncanny valley: it trains on a vast corpus of real human speech. That uncanny valley is there for a reason: to protect us from inferring agency where such inference is not due.
Bad things happen when we relate to unsafe people as if they are safe... how much more should we watch out for how we relate to machines that imitate human relationality to fool many of us into thinking they are something that they're not. Some particularly vulnerable people have already died because of this, so it isn't an imaginary threat.
> So this isn't "accommodating foibles" of the machine; it's protecting ourselves from an exploit of a human vulnerability: we subconsciously tend to attribute intent, understanding, judgment, emotions, moral agency, etc. to LLMs.
Right, I'm saying that this framing is backwards. It's not that poor little humans are vulnerable and need to protect themselves on an individual level; rather, we need to make it illegal and socially unacceptable to use AI to exploit human vulnerability.
Let me put it another way. Humans have another weakness, that is, we are made of carbon and water and it's very easy to kill us by putting metal through various fleshy parts of our bodies. In civilized parts of the world, we do not respond to this by all wearing body armor all the time. We respond to this by controlling who has access to weapons that can destroy our fleshy bits, and heavily punishing people who use them to harm another person.
I don't want a world where we have normalized the use of LLMs where everyone has to be wearing the equivalent of body armor to protect ourselves. I want a world where I can go outside in a T-shirt and not be afraid of being shot in the heart.
I think you're mixing up the laws and the implementation/enforcement. There's nothing wrong with moral laws around behavior (you shall not kill), but you're right that society-wide enforcement requires laws and repercussions. It sounds more like you agree with the laws and want them enforced.
Ah, I see, you are not American.
In the US we don't have the luxury of believing our governments will act in the interests of the voters.
I had a similar thought: the parent commenter sounded like they were in Canada or something. Interesting that their solution is to impose constraints on technological progress, rather than finding novel ways to elevate individual and collective human functioning in spite of our limitations. Ironically, it's their view that is more anti-human.
You’re committing a much older but related sin here: assigning agency and motivation to evolutionary processes. The uncanny valley is the product of evolution and thus by definition it has no “purpose”
I reject the premise that the universe, the earth, and human existence are without purpose. It's one premise among several, and not one I subscribe to.
At least 80% of people agree with me, so I'm not holding to a fringe idea.
I didn’t say anything of the sort about the universe having no purpose. Merely that, in a scientific sense, evolution has no motivation. It is an emergent phenomenon that tends to maximize reproductive fitness and cannot be said to do anything for a reason. Saying otherwise is just anti-science.
Do Hindus and Buddhists generally agree there is a purpose? Perhaps to escape suffering and reincarnation? Sounds more like a western theistic view of existence. Like the deity has a plan for everyone's life kind of thing.
Well yes because just like your earlier point, we can't help but anthropomorphise the world around us.
Just like we see a person in an LLM, it's easy to assume that because we create things with a purpose, the world around us must also be that way. But it's just as wrong and arguably far more dangerous.
>At least 80% of people agree with me, so I'm not holding to a fringe idea.
Appeal to majority much?
> is the product of evolution and thus by definition it has no “purpose”
But like most things that appeared through evolution, it perhaps helped at least some individuals survive until sexual maturity and successful procreation.
Agreed. That's far off from what the parent said, which was about what the “purpose” of the uncanny valley is.
> You know it's just sugar,
That is not the definition of a placebo.
You take the placebo (whatever it is: could be a pill; could be some kind of task or routine) and you believe it is medicine; you believe it to be therapeutic.
The placebo effect comes from your faith, your belief, and your anticipation that it will heal.
If the pharmacist hands you a pill and says, “here, this placebo is sugar!” they have destroyed the effect from the start.
Once in the E.R. I heard the physicians preparing to administer “Obecalp”, which is a perfectly cromulent “drug brand”, but also unlikely to alert a nearby patient to their true intent.
> That is not the definition of a placebo.
But, puzzlingly enough, it's the definition of open-label placebo, in which the patient is told they've been given a placebo. And some studies show a non-negligible effect as well, albeit smaller (and less conclusive) than with a blind placebo.
This is exactly what I meant. Poor specificity on my part.
One, a placebo does not need to be given blindly. A sugar pill is a placebo, even if the recipient knows about it.
An actual definition: "A placebo is an inactive substance (like a sugar pill) or procedure (like sham surgery) with no intrinsic therapeutic value, designed to look identical to real treatment." No mention of the user's belief.
Two, real hard data proves that the placebo effect remains (albeit reduced) even if the recipient knows about it. It's counter-intuitive, but real.
https://en.wikipedia.org/wiki/Placebo#Psychology
The hypotheses hinge on the beliefs of the recipients. "The placebo effect" has always been largely psychological. That's the realm of belief.
To veer even further off-tangent, isn't it hilarious how the Wikipedia illustration of old placebo bottles indicates that "Federal Law Prohibits Dispensing without a Prescription"? Wouldn't want some placebo fiend to O.D.
Rubber duck debugging, now with droughts.