A hacked database from AI companion site Muah.ai exposes the particular kinks and fantasies people have asked their bots to engage in. It also shows that many of them are trying to use the platform to generate child abuse material.
Ain’t that what these tools are there for? I mean, I don’t like CP and I don’t want to engage in any way with people who like it. But I use those LLMs to describe fantasies that I wouldn’t even talk about with other humans.
As long as they don’t do it to real humans, nobody is hurt.
The problem with AI-generated CP is that if it’s legal, it opens a new line of defense for actual CP: you would need to prove the content is not AI-generated in order to convict real abusers. This is why it can’t be made legal; it needs to be prosecuted like real CP to make sure actual abusers can be convicted.
This is an incredibly thorny and complicated topic, so I will try not to go much further into it.
But prosecuting what is essentially a work of fiction seems bad.
This is not even a topic new to AI. CP has been widely represented in both written and graphical media, and the consensus in most free countries is not to prosecute those works, as they are fiction.
I cannot see why AI-written CP fiction would be any different from human-written CP fiction.
I suppose “AI big bad” justifies it for some. But for me, if we are going to start prosecuting works of fiction, there should be a logical explanation for why some are prosecuted and others are not. Especially when the one being prosecuted is just regurgitating the human-written stories about CP that are not being prosecuted today.
I essentially think that a work of fiction should never be prosecuted to begin with, no matter the topic. And I also think that an AI writing about CP is no worse than an actual human doing the same thing.
I’m unfamiliar with the exact situation here, but as I understand generative AI, CP image output also means CP images in the training data.
That may not strictly be true, but it is certainly worth investigating at minimum.
I’m not claiming it’s legally simple, but the difference is that this new “fiction” is very hard, if not impossible, to distinguish from reality. Nowadays AI can render a regular human hand.
Living out the fantasy of having sex with children (or other harmful sexual practices and fantasies) with AI or the like can strengthen the wish to actually do it in real life. It can weaken the resolve to abstain. If you constantly have fantasies where, for example, “the child AI wanted it too,” it can desensitize you and make it harder and harder to push that thought aside when in a tempting situation. Instead of replacing the real thing with a fantasy, you are preparing for the real thing. Some pedophiles already interpret children’s behavior as sexual when it isn’t at all, and the AI might be told to act that way and reinforce those beliefs.
This is still something that has to be studied more to be fully understood, which is of course difficult because of the stigma. There might be differences between people who are only attracted to children and those attracted to both adults and children, and there is just not enough data yet. But even in the communities where pedophiles who do not act on their attraction discuss coping strategies, this is heavily debated and controversial.
If you are interested in the subject a bit more, this is a start: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8419289/
Yeah that link is a no for me boss
Maybe the next step to actually living these fantasies is so close that they still become a problem, because the hand isn’t satisfying enough to use?
The problem with that argument is that you can apply it to games, movies, books, and basically everything. What if a person isn’t satisfied by killing people in PC games? What if they take it into real life?
That argument is only valid for people who can’t differentiate between reality and fiction. And usually those people need medical help.
So, you imagine a world where friends of yours say things like “God, I want to kill people so badly. Fuck, I just wish society would let me.” And then what, they play Call of Duty until they climax?
If that’s how it is, god damn, maybe I do agree with Jack Thompson.
People who already have a desire for the real thing usually won’t be satisfied by PC games or whatever.
The point is that fiction usually doesn’t fuel desires to be tested out in real life.
Exactly correct.
And what desire is it that 6-year-old-AI enjoyers have, again? I guess the 6-year-old part is incidental?
Wrong. Allowing these desires to fester, or, as you suggest, actively seeking out fulfillment for them, is not good for anyone. It’s not good for the pedophiles, because it will increase the need to fulfill their illegal desires, and it won’t help kids, obviously, because it emboldens pedophiles.
Have you ever experienced something you like and said to yourself, “definitely not doing more next time”?