He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
I had an idea when these first AI image generators started gaining traction: flood the CSAM market with AI-generated images (good enough that you can't tell them apart). In theory this would put the actual creators of CSAM out of business, thus saving a lot of children from the trauma.
Most people downvote the idea based on their gut reaction, though.
Looks like they might do it on their own.
It’s such an emotional topic that people lose all rationality. I remember the Reddit arguments in the comment sections about pedophiles, with some equating the term with actual child rapists, while others argued for differentiating, because the former haven’t done anything wrong and shouldn’t be stigmatized for what’s going on in their heads, but rather offered help to cope with it. The replies were typically accusations that those people were making excuses for actual sexual abusers.
I’ve always held the standpoint that I don’t really care about people’s fictional content, be it lolis, torture, gore, or whatever other weird shit. If people are getting their kicks from fictional material, I see that as better than them using real-life material, or even seeking hands-on experiences, all of which would involve actual victims.
And I think that should generally be the goal here, no? Be it pedophiles, sadists, sociopaths, whatever: in the end it shouldn’t be about them, but about saving potential victims. But people would rather throw around accusations and get hysterical so they can paint themselves as sitting on the moral high horse (while, ironically, typically also calling for things like executions or castrations).
Yeah, exact same feelings here. If there is no victim, then who exactly is harmed?
My concern is: why would it put them out of business? If we just look at legal porn, there is already a huge amount of existing content, and yet the market still drives new content to be created constantly. AI porn hasn’t noticeably decreased the amount produced.
Realistically, flooding the market with CSAM makes it easier to consume, and may end up INCREASING the number of people trying to get CSAM. That could end up encouraging more of it to be produced.
The market is slightly different, though. Most CSAM is images, whereas with porn there’s a lot of both video and images.
It would also be a victimless crime. Just like flooding the market with fake rhino horns to drop the price to the point where poaching isn’t worth it.