Comment by freedomben
4 hours ago
> It's also not your boyfriend/girlfriend.
It loves me deeply just the same. (jk)
On a serious note, I agree this is a real problem. I know a person who understands AI at a technical level better than most people do, but he has never had an actual girlfriend in his life (he's now in his 40s, and yes, he's "straight"). He wouldn't say it "loves" him, but he would describe it as a close companion who understands him better than any human actually does, even if it's just trained to be that way. He is very socially awkward, and even having basic conversations with him can be very taxing for both of us.
I've gone back and forth internally about whether this is healthy or not for him. I truly don't know. My personal experience tells me it's probably unhealthy, but I don't want to project myself onto him. I also don't offer unsolicited advice, but I don't want to enable it either by going along with whatever he says and/or affirming it if it's actually harming him.
If someone like him can be having this problem, I can't even imagine what it might be like for non or less technical people who don't understand anything behind it.
On a related note, if there's anyone with advice (preferably from experience, not just random internet advice) I'd sure appreciate it.
> I've gone back and forth internally about whether this is healthy or not for him. I truly don't know.
On a psychological level, I don't know either. I have opinions but they haven't aged long enough for me to trust them, and AI is a moving target on the sort of time frame I'm thinking here.
However, as a sort of tiebreaker, I can guarantee that one way or another this relationship will eventually be abused by whoever owns the AI. Not necessarily in a Hollywood-esque "turn them into a hypnotized secret assassin" sort of abuse (although I'm not sure that's entirely off the table...), but think more along the lines of highly targeted advertising, and generally taking advantage of being able to direct attention and money toward another party.
Whether or not AI in the abstract can "be your friend", in the real world we live in, an AI controlled by someone else definitely cannot be your friend in the sense we mean, because there is a third party in the relationship: the AI's owner, whose interests are also being represented. Perhaps someone in the 22nd century, analyzing the data of the past in a world where "AI friendships" are routine and their use of the word comfortably encompasses that relationship, would see it differently. But that simply isn't the sort of relationship we'd call a "friendship" in the here and now, because a friendship is only between two entities.
I think you are right to treat this with sensitivity, but I do find a lot of what you say here to be at odds. Is this the framing provided to you from the fellow in question or entirely yours? Ultimately you are asking a deeply philosophical question regarding when acceptance of someone's choices becomes enabling, but this isn't really fair to pose on a fellow you respect without agreeing on the terms of analysis. Did they provide some specific examples of how this "understanding" reveals itself? Your account of their account is doing a lot of work here I suspect.
As for my highly personal advice: I could be observed as fitting a few of the qualities you've ascribed to your friend, but I would be deeply saddened if the few people who do spend time sharing meaning with me then manifested that experience in the form you've given here. I would advise you not to spend any more time agonizing over the effects of this phenomenon in isolation, and to either properly redirect the introspection to yourself (with respect to that person) or engage them in an earnest dialog or other form of communication. It may be taxing, but it will mean a lot more than the gunk I just typed out :)
> Is this the framing provided to you from the fellow in question or entirely yours?
The description of how he would describe it is (mostly) his framing, though it's filtered through me and so may have some of my biases integrated into it, albeit unintentionally. Since all of it is translated through me, I would assume it to be biased despite my attempt at accurately conveying it.
> I would advise you not to spend any more time agonizing over the effects of this phenomenon in isolation, and to either properly redirect the introspection to yourself (with respect to that person) or engage them in an earnest dialog or other form of communication.
To this point, it has been almost entirely introspective. I usually let him say what he wants to say, but I try not to give any sort of validation such as "yeah, I agree with you on all of this", nor any disagreement either, since I don't even know what I think of it. I'm not sure I'm even capable of deciding that, and even if I did conclude that it was either healthy or unhealthy, I'm not sure that conclusion would be valid for anyone other than myself. I guess I do lean toward the "unhealthy" side of it when I imagine myself in that situation, but I know there are things I do/enjoy/etc. that others would think are unhealthy (having no religious faith, for example, is something many would consider horrific), so I'm quite stuck.
I don't think I could engage in an earnest dialog either, since I don't know what I even think of it (I'm assuming dialog here means two-way; I have listened to/read what he has to say a number of times).
Much appreciate your reply, thank you
I don’t know how applicable this is for you, but if this were someone close to me, my first question would be what’s good for the other person.
In most cases, if they are happy and getting on in life, and are able to take care of themselves, I’d let things be.
That said, the tension from your framing is between “leave good enough alone” and “personal growth and a fulfilling life”.
Healthy relationships, especially with a partner, are one of the better things about life. They are also incredibly difficult to get right without practice.
So, is your friend lonely, or are they happy to be alone?
If you intuit it’s the former, then AI is palliative care which runs the risk of creating a dependency.
It is also possible that the right set of prompts, perhaps something which incorporates CBT, would help them learn more about themselves and challenge beliefs or responses that are no longer useful.
And if your friend is just happy alone, then you can disregard the rest.
Thanks, I much agree. The impression I get is that he isn't "happy" and would rather have a real relationship, but has completely given up on that at this point and is kind of trying to be happy with the little he has. He hasn't directly said that, but based on what he has said, that is what I would most bet his feelings are.
Ultimately I want him to be as happy as he can be, so if this is the way, then I'm happy for him. I guess the really hard thing for me is deciding how I should react when he talks about this sort of thing. I don't want to encourage him if I'm doing him a disservice, but I do want to encourage him if he really is better off with it. Staying neutral as I am now feels like it might be the coward's way out, but it's also truer to how I feel, since I really don't know whether it's good or not.
Really appreciate your reply, thank you