• Crow · 167 points · 8 months ago

    I think the most problematic thing about this is that you’re giving out very private and intimate information about not just yourself but also your ex, almost certainly without their consent.

      • neoman4426@fedia.io · 64 points · 8 months ago

        Stalking would be one thing; this reads to me like the idea is training a language model on the ex’s texts to create an AI partner that acts like them, so you can pretend nothing happened. That’s probably safer for the other person than being stalked (as long as the training texts are never leaked, which is a big if), but it’s just sad.

        • OrnateLuna · 35 points · 8 months ago

          No way that would negatively impact the person using that app.

          • XTL@sopuli.xyz · 26 points · 8 months ago

            A more optimistic view might be that the user could use the tool to work through their feelings, questions, and lack of closure in a way that doesn’t involve the ex or other people.

            Once they’ve had their fill of ranting or stages of grief and reached catharsis, or they’ve figured out their own feelings and views, the model and all data could be destroyed and no humans would have been harmed in the process.

            A suitably smart program might work as a disposable therapist of a sort. But that’s probably quite far away from current models.

            • Monument@lemmy.sdf.org · 6 points · 8 months ago

              I would wonder how useful it could be for relationships that weren’t primarily conducted via text for their whole duration.

              Almost all of my relationships have a pattern in texting. There’s usually a month of deep and emotional conversations, followed by a few months of sexually explicit chatting as we spend more time together and work out the emotional conversations in person, and then logistical conversations once we’re emotionally and physically comfortable with each other. And sure, we’re kind and sweet to each other between the logistics, but I don’t know if that’s really enough to train an AI on, or even enough to properly represent ourselves or the relationship.

              I think that any AI would have to be very advanced to not respond to “Why did you leave me?” with ‘Because I had to take the dog to the vet/go get milk’ or whatever - especially when the bulk of the texts are in that latter relationship stage.

              • cm0002@lemmy.world · 4 points · 8 months ago

                I can confirm this. I’ve been backing up and restoring my messages in their entirety to every new phone since my very first Android phone (so about 15 years of messages), so all my relationships have complete texting records, and I’ve seen almost that exact pattern.

                Sidenote, the Google Messages app seems to handle an almost 6 GB mmssms.db file with grace lmao
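                For anyone curious what’s actually in that file: mmssms.db is a plain SQLite database, so it can be poked at with standard tools. A minimal sketch in Python, assuming the conventional AOSP `sms` table layout (`address`, `date`, `body`, `type`) — the exact schema can vary between Android versions, so treat the names as assumptions:

                ```python
                import sqlite3

                def count_messages_per_contact(db_path):
                    """Return {address: message_count}, largest count first.

                    Assumes an AOSP-style `sms` table; real mmssms.db schemas
                    may differ by Android version.
                    """
                    conn = sqlite3.connect(db_path)
                    try:
                        rows = conn.execute(
                            "SELECT address, COUNT(*) AS n FROM sms "
                            "GROUP BY address ORDER BY n DESC"
                        ).fetchall()
                        return dict(rows)
                    finally:
                        conn.close()

                if __name__ == "__main__":
                    # Demo against a throwaway database with the same layout,
                    # since a real mmssms.db is several GB and rather private.
                    conn = sqlite3.connect("demo.db")
                    conn.execute(
                        "CREATE TABLE sms "
                        "(address TEXT, date INTEGER, body TEXT, type INTEGER)"
                    )
                    conn.executemany(
                        "INSERT INTO sms VALUES (?, ?, ?, ?)",
                        [
                            ("+15550001", 1, "hey", 1),
                            ("+15550001", 2, "dog to the vet", 2),
                            ("+15550002", 3, "got milk?", 1),
                        ],
                    )
                    conn.commit()
                    conn.close()
                    print(count_messages_per_contact("demo.db"))
                    # {'+15550001': 2, '+15550002': 1}
                ```

                A per-contact count like this is also a quick sanity check on whether any one thread even has enough text to be worth feeding to a model.
                
                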