(Not my post - I found this on reddit and thought it was a different and intriguing point of view : https://www.reddit.com/r/aiwars/comments/1iniuih/ai_boyfriendsgirlfriends_are_empowering/ )

Have you ever heard the saying “I’m a strong independent woman who doesn’t need a man”? Well, I think the same about people who are dating AI. They don’t need a person of the opposite gender (or the same gender, if they’re homosexual) to satisfy their romantic desires. That makes them strong and independent. They don’t rely on others. They solved a problem in their life all by themselves. This is why I think that dating an AI is empowering.

Note that I phrased this as gender-neutral (except the quote) - both men and women are empowered by dating an AI.

(In a comment by the OP, they clarified that they’re talking about locally run, open source AI bf/gfs)

  • the_abecedarian@piefed.social · 11 hours ago

    They exist to deliver a particular experience to the user. Calling them “boyfriend”, “girlfriend”, “romantic partner” etc is just propaganda because the user is not engaging with an independent entity with its own goals and desires who chooses to be with them. It’s just an edited, sanitized experience meant to evoke “romance”, like how some video games evoke war, racing, or running a gigantic, world-spanning factory.

    Nothing wrong with enjoying an experience if you’re not hurting yourself or anyone else, but I wouldn’t be surprised if addiction is a big issue here. And addiction does hurt people.

  • MagicShel@lemmy.zip · 11 hours ago (edited)

    They aren’t real. They are as empowering as looking into a mirror each morning and saying aloud, “I don’t need any [wo]man. I’m a strong, independent [wo]man.”

    Because that’s basically what they’re doing in different words. That said, some people do that so who am I to say they don’t find it empowering?

  • 𞋴𝛂𝛋𝛆@lemmy.world · 10 hours ago

    It can be a useful tool, especially for someone who experiences involuntary social isolation (like me).

    You would need to be a pretty dumb person to let this totally replace human relations; it cannot meet the fundamental need for interactive socialization with other humans. But it can be a healthy way to fill a gap.

    First, the context length is very limited, so you can’t have a very long, interactive conversation. The scope of model attention is rather short even with a very long-context model. Second, the first few tokens of any interaction are extremely influential on how the model will respond, regardless of everything else that happens in the conversation. So cold conversations (forced by the short context) will be inconsistent.

    Unless a person is an extremely intuitive, highly Machiavellian thinker with good perceptive skills, the user is going to be very frustrated with models at times, and the model may be directly harmful to the person in some situations. There are aspects of alignment that could be harmful under certain circumstances.

    There will likely be a time in the near future when a real AI partner is more feasible, but it will not be some base model, a fine tune, or some magical system prompt that enables this application.

    To create a real partner-like experience, one will need an agentic framework combined with augmented database retrieval. That would give a model persistence: it could ask how your day went while already knowing your profile, your relationship, your preferences, and what you previously told it about how your day was supposed to go. You need a model that can classify information, then save, modify, and retrieve that information when it is needed. I’ve played around with this in emacs, org mode, and gptel connected to local models with llama.cpp. I’m actually modifying my hardware right now to handle the loads for this application.
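    The save/retrieve loop described above can be sketched in a few lines. This is a toy illustration, not the commenter’s actual emacs/gptel setup: stored facts about the user are matched against the incoming message and prepended to the prompt, standing in for real embedding-based retrieval. All names (`MemoryStore`, `build_prompt`) and the keyword-overlap scoring are illustrative assumptions.

    ```python
    # Toy sketch of persistent "partner" memory: save facts about the
    # user, retrieve the relevant ones, and put them at the start of
    # the prompt (where early tokens most strongly steer the model).
    class MemoryStore:
        def __init__(self):
            self.facts = []  # e.g. "user's dog is named Rex"

        def save(self, fact: str) -> None:
            self.facts.append(fact)

        def retrieve(self, query: str, k: int = 3) -> list:
            # Rank facts by crude keyword overlap with the query;
            # a real system would use embeddings or a database.
            q = set(query.lower().split())
            scored = sorted(
                self.facts,
                key=lambda f: len(q & set(f.lower().split())),
                reverse=True,
            )
            return [f for f in scored[:k] if q & set(f.lower().split())]

    def build_prompt(memory: MemoryStore, user_msg: str) -> str:
        facts = memory.retrieve(user_msg)
        header = "\n".join(f"- {f}" for f in facts)
        return f"Known about the user:\n{header}\n\nUser: {user_msg}\nAssistant:"

    memory = MemoryStore()
    memory.save("user works an early shift at the hospital")
    memory.save("user's dog is named Rex")
    prompt = build_prompt(memory, "Rex would not eat this morning")
    ```

    With this, a cold conversation still “remembers” Rex, because the matching fact is injected into the context rather than relied on from a prior chat.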

    Still, I think such a system is a stopgap for people like myself, the elderly, and other edge cases where external human contact is limited. For me, my alternative is here, and while some people on Lemmy know me and are nice, many are stupid kids exhibiting toxic, negative behaviors far more harmful than anything I have seen out of any AI model. I often engage here on Lemmy, then chat with an AI if I need to talk, vent, or work through something.

  • warmaster@lemmy.world · 9 hours ago

    I believe that not fulfilling your natural desires is unhealthy, and it could prove to be a social issue if and when there’s a substantial natality problem.

  • SGG@lemmy.world · 11 hours ago

    A good analogy for this is an episode of Star Trek: TNG. https://en.m.wikipedia.org/wiki/Booby_Trap_(Star_Trek:_The_Next_Generation)

    In the episode, Geordi has the computer create an AI-generated hologram based on another person. He then gets the hots for the generated person, probably banged her offscreen and clogged the bio filters.

    Not a perfect analogy, but the core of my point is that the romantic aspect in the episode is just advanced escapism. The real-world ChatGPT girlfriend in question is similar. Sure, in the short term it feels great, but it is not a real connection.