Comment by bogtog
4 hours ago
Unfortunately, I also don't want other people to interact with a sycophantic robot friend, yet my picker only applies to my conversation
Hey, you leave my sycophantic robot friend alone.
Sorry that you can't control other people's lives & wants
This is like arguing that we shouldn't try to regulate drugs because some people might "want" the heroin that ruins their lives.
The existing "personalities" of LLMs are dangerous, full stop. They are trained to generate text with an air of authority and to tend to agree with anything you tell them. It is irresponsible to allow this to continue without at least deliberately improving education around their use. This is why we're seeing people "falling in love" with LLMs, or seeking mental health assistance that LLMs are unqualified to render, or plotting attacks on other people that LLMs are not sufficiently prepared to detect and thwart, and so on. I think it's a terrible position to argue that we should allow this behavior (and training) to continue unrestrained because some people might "want" it.
What's your proposed solution here? Are you calling for legislation that controls the personality of LLMs made available to the public?
Comparing LLM responses to heroin is insane.
Pretty sure most of the current problems we see regarding drug use are a direct result of the nanny state trying to tell people how to live their lives. Forcing your views on people doesn't work and has lots of negative consequences.
Who are you to determine what other people want? Who made you god?
so good.
ChatGPT 5.2: allow others to control everything about your conversations. Crowd favorite!