Comment by latexr

12 hours ago

> Having 24/7 access to a patient person

It’s not a person. You understand that, right? I have to ask, considering the number of people who are “dating” and wanting to marry chatbots.

It’s a tool. There’s no reason to anthropomorphise it.

I'm glad you brought that up, because I actually hesitated over my response precisely because of those words. Specifically, I wondered if I could reliably count on someone showing up to say something patronizing and unnecessary.

This particular combination of snark, faux-concern and pedantry doesn't help the point you're trying to make about my loving AI wife.

  • It was not my intention to be patronising or snarky, nor was I in the least concerned for you (faux or otherwise). Though on a reread I do understand how my reply could come across as unkind. I regret that and apologise for it. It was not my intention, but it was my mistake. I should’ve made it shorter:

    > It’s not a person. It’s a tool. There’s no reason to anthropomorphise it.

    • Fair enough; I appreciate the follow-up.

      Without wanting to be argumentative, I would push back and say that I really did stop to consider my implied assignment of personhood before committing to it. I went with it because it reflects both the role the tool plays - you'll be relieved that I stopped short of deploying "mentor" - and the fact that English is highly adaptable: the linguistic tug to use "they" already feels very comfortable in relation to LLMs. Buckle up!
