Comment by katabasis

2 days ago

Yeah pretty much this. One can argue that it’s idiotic to treat chatbots like they are alive, but if a bit of misplaced empathy for machines helps to discourage antisocial behavior towards other humans (even as an unintentional side effect), that seems ok to me.

As an aside, I’m not the kind of person who gets worked up about violence in video games, because even AAA titles with excellent graphics are still obviously games. But newer technology is capable of blurring the line between fantasy and reality to a much greater degree. This is already somewhat true of LLM chatbots, and I worry it will also become a problem as VR improves. People who witness or participate in violent events often come away traumatized; at a certain point, simulated experiences are going to be so convincing that we will need to worry about their impact on the user.

> People who witness or participate in violent events often come away traumatized

To be fair, it seems reasonable to entertain the possibility that the trauma stems from knowing the events are real.