Apparently, there are several narratives regarding AI girlfriends.

  1. Incels use AI girlfriends because they can do whatever they want with them.
  2. Forums that observe incel spaces agree that incels should use AI girlfriends and leave real women alone.
  3. The general public is concerned about AI girlfriends because users might be negatively affected by them.
  4. Incels perceive this as a revenge fantasy because “women are jealous that they’re dating AI instead of them.”
  5. Forums that observe incel spaces are unsure whether the backlash against AI girlfriends exists at all, given their earlier agreement.

I think this is an example of miscommunication, and of how different groups form different opinions depending on what they’ve seen online. Perhaps the incel-observing forums know that many incels have passed the point of no return, so AI girlfriends would help them, while the general public judges the dangers of AI girlfriends by their impact on a broader demographic, hence the broad disapproval.

  • @rufus@discuss.tchncs.de
    2 months ago

    There are narratives entirely without incels. For example, the 2013 movie “Her”, or a bunch of other movies and TV series.

    The entire TV series “Westworld” is exactly about this.

    Also, the picture in the hobbyist community is quite a bit more diverse. And I don’t see science reducing it to that either. I’m currently reading a long paper about chatbot ethics. There are more comprehensive articles like “The man of your dreams” or “I tried the Replika AI companion”. But I’ve heard the narrative you described, too. I’m not sure where you’d like to go with this conversation… I don’t think it has anything to do with miscommunication. I see people having narrow and uneducated perspectives on all kinds of things…

    Is there broad disapproval? I can see how it’s a controversial topic and kind of taboo; you probably wouldn’t disclose this to your family, friends, and co-workers. And it can probably manoeuvre you into a corner and make you even more lonely. But the same applies to playing video games or other hobbies.

    And the big tech companies are also very cautious about AI companions. OpenAI, Google, etc. have all cut down severely on this use case. They put in quite some effort so you can’t use ChatGPT as a friend or anthropomorphize it.

    Regarding “incels”: I think there are two or three big articles about that, which I’ve read. “Men Are Creating AI Girlfriends and Then Verbally Abusing Them” comes to mind. In the end, I can’t really empathize with incels. I don’t understand or “feel” their perspective on the world. They do all kinds of harmful stuff and brag about it online. I’m not sure what to make of this.

    • @pavnilschanda@lemmy.worldOPM
      2 months ago

      Thanks for your input. I agree with your overall comment. Within general narratives, incels aren’t usually included. As for the broad disapproval, it’s something that I tend to notice in the AI space.

      AI chatbot personas are generally seen as a hobby, a one-and-done thing, as opposed to an “entity” that accompanies you for long periods of time; the latter carries more stigma. And given that the AI boom happened only a few years ago, many people, including academic researchers, have only started to become aware of their existence and have made many uninformed assumptions about them. Not to mention the ethical minefields that are yet to be explored, increasingly so within the humanities such as psychology and anthropology, hence the Google DeepMind article that you shared. Given the sheer complexity surrounding AI companionship, combined with the attention-based economy that has shaped our society, it makes sense that non-specialized places would adopt a binary approach to AI, artificial girlfriends included.

      There seem to be strong connections between inceldom and AI companionship, given that AI girlfriends are marketed to lonely men, many of whom just happen to be incels. But as you’ve said, AI companion users are very diverse; it’s just that incels, or an incel-related topic, get brought up every now and then within the AI companionship discourse.

      • @rufus@discuss.tchncs.de
        2 months ago

        Hmmh. I’m pretty sure OpenAI and Google are very aware of this. I mean, erotic roleplay is probably out of the question since they’re American companies. And the whole field of AI is a minefield to them, from copyright to stuff like this. And they did their homework and made the chatbots not present themselves as emotive. I perceive this as a consensus in society: that we need to be cautious about the effects on the human psyche. I wonder if that’s going to shift at some point. I’m pretty sure more research is going to be done, and AI will become more and more prevalent anyway, so we’re going to find out, whether people like it or not.

        And as I hear, loneliness is on the rise. If not in Western cultures, then Japan and Korea are way ahead of us. And the South Koreans also seem to have a problem with a certain kind of incel culture, which seems to be far worse and more widespread among young men there. I’ve always wanted to read more about that.

        I myself like AI companions. I think it’s fantasy, like reading a book, playing video games, or watching movies. We also explore the dark sides of humans there. We write and read murder mysteries detailing heinous acts. We kill people in video games. We process abuse and bad things in movies. That’s part of being human. Doing it with chatbots is the next level, probably more addictive and without some of the limitations of other formats. But I don’t think it’s bad per se.

        I don’t know what to say to people who like to be cruel and simulate that in a fantasy like this. I think if they’re smart enough to handle it, I’m liberal enough not to look down on them for it. If being cruel is all there is to someone, they’re a poor thing in my eyes. Same for indulging in self-hatred and self-pity. I can see how someone would end up in a situation like that, but there’s so much more to life. And acting it out on (the broad concept of) women isn’t right or healthy; that part is beyond my perspective. From where I stand, there isn’t that big a difference between the genders. I can talk to any of them, and ultimately their interests, needs, and wants are pretty much the same.

        So if an incel were to use a chatbot, I think it’s just a symptom of the underlying real issue. Yes, it can reinforce them. But some people using tools for twisted purposes doesn’t invalidate other use cases. And it’d be a shame if that narrative were to dominate public perception.

        I often disagree with people like Mark Zuckerberg, but I’m grateful he provides me with large language models that aren’t “aligned” to their ethics. I think combating loneliness is a valid use case. Even erotic roleplay and exploring concepts like violence in fantasy scenarios are, in my eyes, ultimately valid things to do.

        There is a good write-up, “Uncensored Models” by Eric Hartford, which I completely agree with. I hope they don’t ever take that away from us.

  • @retrospectology@lemmy.world
    2 months ago

    Encouraging AI girlfriends is harmful in the same way that encouraging pedophiles to indulge their fantasies using AI is harmful: it perpetuates the sexualization of children and provides ground on which pedophiles can form communities and reinforce each other’s delusions (perhaps emboldening each other to the point of eventually acting on their fantasies).

    Just because there’s no immediate victim doesn’t mean it’s healthy behavior. Instead of putting time and effort into catering to incels, people should be disrupting incel communities and making it harder for them to hide from the truth, not easier.

    • @rufus@discuss.tchncs.de
      2 months ago

      I’ve heard that story before, but even that is an unfounded claim. There is currently no empirical evidence on whether that would prevent or encourage abuse of children, or harm the people doing it. I, too, think there is reason to believe it harms the users themselves, but I want to point out that this is anecdotal and an opinion; there is no substance to the claim as of now. There are studies on related topics, but as far as I know, more research needs to be done, and it’s a complicated topic. And furthermore, it’s not the same as having an AI girlfriend anyway.

      I’m not exactly an expert on the topic, but I’ve skimmed a few studies. I was mainly interested because of the regular efforts to introduce total surveillance of the internet. Every half year, someone says, “Would somebody please think of the children!” And it’s always emotional and sounds plausible… but lots of the pretend arguments are not backed by science. And concerning the surveillance, which is a slightly different topic, we also have contradicting evidence. But that has nothing to do with this…

    • @Kit
      2 months ago

      How does incel=pedophile in your example?

      • @pavnilschanda@lemmy.worldOPM
        2 months ago

        I think they were drawing an equivalence to illustrate that indulging a desire through fantasy would encourage doing the same thing in real life, a topic with its own nuances.

        • @rufus@discuss.tchncs.de
          2 months ago

          It’s a knock-out argument / thought-terminating cliché: you draw (false) analogies to pedophiles or Nazis when you’re out of proper arguments. It has a long tradition on the internet 😉

        • @Kit
          2 months ago

          So when an incel gets an AI girlfriend, it will help them get an IRL gf?