There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being's innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as that of a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go with the new AI being able to put anyone's face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws need to be put into effect asap.

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

  • Veraticus@lib.lgbt · 176 points · 1 year ago

    I’m only going to answer the first part of your question, not the AI/generated part.

    No one really chooses what or who they’re attracted to; it kind of just happens to you. For example, you might be watching a TV show and someone gets lightly, comically spanked… and suddenly a light bulb goes off above your head and you think, “whoa, that might actually be kinda fun.” People are wired in ways we don’t understand to want things we don’t even know we want.

    To that extent, pedophiles are themselves victims of their own desires; there’s no “logic” behind it. It’s simply an urge they experience.

    Of course that doesn’t make succumbing to this urge excusable, and any children who are impacted are of course victims and the pedophiles, predators. But no one is training pedophiles in pedophile camp. It’s just humans being human, unfortunately.

      • CaptainEffort@sh.itjust.works · 35 points · 1 year ago

        Unfortunately I think it’s probably in the same vein as any fetish or preference, so completely out of their control.

        Obviously people who act on it are the scum of the earth, but those who simply battle with the urge I have nothing but sympathy for. I can’t even imagine how horrible it is to have to deal with that daily and never be able to do anything about it, or even really talk to anyone about it.

          • CaptainEffort@sh.itjust.works · 27 points · 1 year ago

            Acting on it is NEVER out of their control.

            As someone who doesn’t have to permanently stifle my desires for the entirety of my life, I’m not about to assume that. I have no idea the toll that could take on someone mentally.

    • Fredselfish@lemmy.world · 55 points · 1 year ago

      I have heard that kids who are molested, or who end up fooling around at a super young age, can grow up wired to be attracted to young kids or teens.

      The real issue no one wants to address is that people who have these desires, and know they're wrong, have nowhere to turn for help.

      Even if they haven't abused anyone, coming out and telling a therapist or anyone else that you are attracted to kids probably can and will get you locked up. There are those who never offend, but a lot do, either because a) they accept what they are and have no moral objections to it, or b) they can't get the help needed to fight the urges and end up offending.

      As @Veraticus said, there is no easy answer, because it's not a choice. It would be like asking you why you like women, or why people are gay. They're wired that way, and unfortunately I don't think you can cure it.

      We definitely need to address access to any kind of porn of it, and if someone offends we must lock them away for their own good. Not saying prison, but somewhere they can be mentally evaluated.

      • Instigate@aussie.zone · 29 points · 1 year ago

        There is definitely a link between having experienced sexual abuse as a child without any therapy or counselling to help them make sense of it and then later on sexually abusing other children, but it’s not super clear-cut and definitely not predictable.

        • Fredselfish@lemmy.world · 18 points · 1 year ago

          Yes, the same way some girls who are raped or molested become promiscuous, but that doesn't mean all girls in that situation will. That's definitely why we need better sex education in America, so we can teach kids the signs of an adult being inappropriate and help them learn about their own bodies.

      • CeruleanRuin@lemmings.world · 4 points · 1 year ago

        Acting on it is ALWAYS a choice. I don’t really give a fuck how a person is wired if they choose to exploit children in any way, including by possessing sexual imagery of kids.

        Therapists are reachable through a simple internet search. They’re not going to lock you up if you haven’t ACTED on it. Don’t give me that shit about “they couldn’t get treatment, so of course they had to look at kiddie porn” or tell me they have “nowhere to turn”. Bullshit.

        • Fredselfish@lemmy.world · 4 points · 1 year ago

          There are countless records of just that, and even if there weren't, what do you think would happen to your life if you came out and said you are a pedophile?

          You think you'd keep your job, or your house? You'd be banned from living near a school. These people live in secret, even the ones who don't offend, for a reason.

          Therapy can't do much; it's not like there's a cure. I don't know your preferences, but let me ask: did you choose to be attracted that way? If yes, that means you could choose to be attracted to the other sex. See, your logic doesn't hold up.

    • XbSuper@lemmy.world · 7 points · 1 year ago

      This is one of the most sane responses I’ve ever seen.

      I am one of those poor souls who has these urges, but has never, and will never, act on them.

      I’m willing to open myself to an AMA for anyone interested.

    • DingoBilly@lemmy.world · 6 points · 1 year ago

      This is the most accurate answer, and the fact it’s all cultural/social is quite important as well.

      If you were born a few thousand years ago, it may have been considered completely reasonable to sleep with a kid. Hell, the kid was probably your slave, so you could literally do whatever you wanted with them.

      But just as I don’t understand certain fetishes or even just people attracted to the same sex, others won’t understand why people would be attracted to kids.

    • Hjalmar@feddit.nu · 1 point · 1 year ago

      I listened to an amazing podcast about this a while ago. It was some science dude helping people not to be attracted to children. If somebody wants to have a listen I can probably find a link, but the podcast was in Swedish

  • Deestan@lemmy.world · 47 points · 1 year ago

    To answer some of the questions:

    I cannot understand the attraction to kids.

    There was a TV interview with people who were seeking help for pedophilia. They described it as just plain horny sexual attraction that they knew they had to not act on. I guess people have different reasons, and some probably manage to rationalize it as “relationships” as you say.

    Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

    Whether it is a modified image of a real person or a purely generated picture, it will fall under the same laws as any such depiction, which is already uncontroversially illegal.

    How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

    Hard, as there are many ways to describe nudity or encourage the generator to weigh towards nudity. “Person with visible thighs, no skirt” and such.

    Easier to leave nudity out of the training data, which is already common.

    Then hard again because anyone can throw together a new image generator trained on what they want and no word filters.

  • Dame@lemmy.ml · 36 points · 1 year ago

    I see people here attempting to equate it with a natural attraction and/or a fetish. We all have internet access and can look it up: both pedophilia and pedophilic disorder are in the DSM-5 and ICD-10, especially in this more advanced age of medicine, science, and society. If it were natural, I believe the corrections would have been made, or strongly advocated for; they haven't been. What is advocated is using terminology correctly, and encouraging those who experience this to feel comfortable telling their truth and seeking help. I believe some of your comments are very dangerous, and some of the upvotes and downvotes are concerning and make it difficult to tell whether you are in support of protecting children. The point is: please don't just blanket-label it and compare it to things that are harmless, legal, and consensual.

  • VelvetStorm@lemmy.world · 30 points · 1 year ago

    Edit: also, don't be sexist; call it what it is: rape, not seduction. Just because it was a woman doesn't make it not rape. Calling it anything else does a disservice to all of the male victims of female-on-male rape.

    If the AI porn depicts real-life underage people, then it should be, and in some places is, illegal. Now, I don't like it and find it reprehensible, but if it's not depicting real-life people then it should be legal.

    I think if you are into that then you should be able to seek professional help without a fear of it ruining your professional and personal life, but if you are attracted to kids then it is your moral responsibility and obligation to not work with or be around kids.

    This is a mental issue and it should be treated like one and we should be trying to understand it and find ways to prevent and treat it.

    • SnausagesinaBlanket@lemmy.world (OP) · 9 points · 1 year ago

      I was reading this, and it made me remember how a dude, in Australia I believe, bought an underage sex doll. It ended up being flagged somehow, and the government arrested the guy when it arrived. I have no idea what happened to him.

      Was this guy trying to control his urges by using that hunk of rubber, or is that a crime too? This is a very edgy area, and I am thinking some countries won't bother with such dolls, while others might impose the death penalty.

    • MystikIncarnate@lemmy.ca · 3 points · 1 year ago

      Articulate and concise. I like it.

      I don’t really have anything more to add, except that we as a society need to make up our collective minds on cg and AI csam/CP soon.

      I definitely think that cg/AI content is less bad (still bad, but less so) because there’s no harm being done to real children from it; but those AI image engines needed to be trained on some form of content to be able to generate the images that they do, so I’m not sure how the training images factor in, or what would even be used for training that AI… I know rather little about how AI is trained at the moment, so I’m not sure if it can be done without source csam material or not… IMO, that factors into the morality of the output.

      I definitely agree that sexual preference toward minors (aka paedophilia) is just that, a sexual preference; and that, in and of itself, does not make someone a sex offender, in the same way that being heterosexual doesn't make you a rapist or other form of sexual "deviant" (or however you want to say that). It's interesting to me to think that paedophiles may have a semi-legal way of getting porn for themselves (which causes no harm to children). I feel a bit bad for paedophiles in that they're basically forced to have relationships with persons that they don't find very sexually attractive, or else they break the law. Not bad enough that I think the laws should change or anything; it's just a crap situation. It would be like having a preference towards men, as a man, in a world of heteros. The men are there and you're interested in them, but none of them are interested in you. Almost always that's not actually the case; other homosexual men exist, no matter how rarely… but in the case of pedos, there are exactly zero underage people who they can interact with sexually at all. I still don't think that should change, but at least with the internet, a gay man can go and find porn that interests him. For pedos, it is literally a crime to even look at, possess, or make any porn that appeals to them.

      I can sympathize with the impossibility of their situation, that's all. For the record, I'm just a cis male with no interest in anyone too young to date. I can recognise their attractive qualities without being attracted to them (speaking mostly here about those who have reached their sexual maturity, but are still not 18 or whatever)… I can understand it, and I'm not so hateful as to want anyone who feels attraction to young people to die or anything, but young people don't have the experience to understand the situation they're getting into, or when they're being misled or gaslit, etc. (though to be fair, a lot of "mature" people seem to not know either, but that's another discussion)… Fact is, they're shit out of luck.

      I’m sure many are forced into celibacy just to be lawful. I don’t think any grown adult wants to be forced to be celibate; so I can understand the plight. AI/cg porn, tailored to that specific preference may give pedos an outlet that they can utilize to temper their urges and keep them on the right side of the law here. Of course it won’t solve the problem entirely, the same way that rapists are still a thing, but it may severely reduce illegal activity and harm to children.

      But I agree, it's a slippery slope (so to speak) because it can easily evolve into lowering the age of consent and bringing back child marriages and such, which IMO isn't a desired outcome. I also don't think that content should intermingle with either social networks or existing porn sites; since it's so specific, it should be relegated to specific sites and not left flapping around the internet. It's also a vast minority of people that are afflicted, so segregation may be a minimum measure to keep things somewhat clean. I know I don't want AI-generated CP content mixed in with my usual porn browsing… I'm sure there's plenty of people in the same boat, so IMO that's a minimum. But I'm only one voice in society, so I don't make the decision; I'm interested to see what decision is finally made and implemented, whenever we get there.

      As a disclaimer: I'm not attracted to underage people. I'm also not a doctor, scientist, psychologist, or anything else. I'm not in favor of anything here, besides society making a decision, and I'm just positing that it could be beneficial to society as a whole. I welcome other opinions, except those from people who are heavily religious. Good day.

        • MystikIncarnate@lemmy.ca · 3 points · 1 year ago

          I’ve heard of that genetic condition. It’s fascinating, and as far as I know, extremely rare.

          I know that at least one has spoken publicly about her experience, and they touched on dating and the implication was that most of the people that are interested, are paedophiles, and that didn’t sit well for her, and I expect that wouldn’t sit well for most people, especially those with that condition.

          Fascinating information all around. I don't doubt that it's accurate.

            • MystikIncarnate@lemmy.ca · 2 points · 1 year ago

              That was the takeaway. She was rather upset about it, which is apparently good for ratings but tragic overall.

              I suppose it depends on what she really wants in life, which I won’t presume to know. I wish her the best, that’s not a fun condition to deal with.

    • Chailles@lemmy.world · 2 points · 1 year ago

      I don’t think that you can make a person into a pedophile, any more than you can make a person gay.

      If you really think about it, we've seen arguments like that before: that pornography creates rapists, that violent video games create murderers. And that's just strictly on the consumption of media.

  • Astroturfed@lemmy.world · 24 points · 1 year ago

    Don’t try to understand it. You aren’t going to get a good answer. It’s a horrible mental illness level of sexual preference.

    Anything can be sexualized with enough impulse and experience. Everyone's got some weird dark fetish shit. Some of it's illegal in practice. Normal people bury that shit, or only discuss it in therapy, talking it out so they can hopefully never think about it again.

    I’m sure there’s different answers to this just like “why are there serial killers?”. Just be glad it confuses you.

  • Izzgo@kbin.social · 24 points · 1 year ago

    To me there is a clear difference between children and teens, say 16+. It is both morally wrong and unnatural to be attracted to prepubescent children, and this is pedophilia. But basically, by definition, puberty makes people sexually attractive, and it's natural for adults to be attracted. It's still morally wrong to act on those attractions unless you're in about the same stage of puberty or early adulthood. That's where we rely on a strong moral code and laws in society to protect youngsters who have recently gone through puberty. And hopefully, even after the laws no longer apply, we have enough societal pressure to strongly discourage wide age gaps between sexual partners.

    Pedophilic disorder is characterized by recurring, intense sexually arousing fantasies, urges, or behavior involving children (usually 13 years old or younger).

  • cooopsspace@infosec.pub · 24 points · 1 year ago

    At the end of the day, art is just pixels on a flat surface. Determining whether a depicted individual is under age where it’s not obvious sets a dangerous precedent. Is the picture 17 or 18? Who knows.

    But the problem is that people have been sexualising people like Emma Watson since she first appeared on screen. That’s not okay and rather than sending AI art underground I think society needs to change to normalise education about sex, reproduction and genitalia and address the social issues to treat pedophilia like the disease that it is.

    Meanwhile, pedophiles' names are being published, risking mob violence and further isolation. Not to mention that in the US there's a lot of negative attention being put on women's reproduction, children's sex ed, and genitalia, and a push to make the whole lot illegal and taboo. Not to mention people teaching their kids pet names for their parts: "uncle Ben touched my heehaw" sounds a lot different to "uncle Ben touched my penis".

    Society is a problem; the US particularly is going in the wrong direction on many aspects of sex education.

  • Modern_medicine_isnt@lemmy.world · 20 points · 1 year ago

    My assumption on the attraction thing is that there are many things that cause attraction. Guys generally go for younger women. Well, what do younger women have? Tighter bodies, firmer breasts, more fitness, healthier-looking hair and skin, more defined hips… but as we all know, many guys specialize in being attracted to just a few of these things. Well, children usually have healthy skin and hair… so if a guy is attracted only to the attributes that don't require puberty, I can imagine that, attraction-wise, he might not feel such a difference. Now mix that in with wanting to feel superior, and some of the other things like that, and kids start to fit well. Now add in a high libido and low self-control. Disaster. Take the same guy and add in an attraction to big boobs, and he is close to average, because kids don't have those.
    Basically, it only takes a few missing screws. As for women: teenage boys have a lot of sexual energy and passion. I can imagine that being attractive. Plus there is of course the taboo of it, which appeals to some women just like any other kink. Put them in a space where they aren't getting their needs met by men, and give them access to boys. Disaster again. In the end, the diversity of humans means there will always be someone into anything you can imagine.

  • Dame@lemmy.ml · 15 points · 1 year ago

    AI CSAM should absolutely be treated as such. The models have been trained on images of real human children. I'm not sure where the issue comes from; I would imagine power. I'd need to check peer-reviewed work from those in the field, but I honestly can't stomach it.

          • Vedlt@lemmy.world · 9 points · 1 year ago

            I am not an expert in any field relating to any of this, but we can all agree that CSAM is unequivocally reprehensible. Thus, many people will have severe issues with anything that normalizes it even remotely. That would be my knee-jerk response, anyway.

            • CaptainEffort@sh.itjust.works · 12 points · 1 year ago

              Well maybe we shouldn’t base our decisions on knee jerk responses.

              Imo if nobody’s being hurt then it’s none of our business. If it helps these people to deal with their urges without actually hurting anyone then I think that’s unquestionably a good thing.

              • Slowy@lemmy.world · 5 points · 1 year ago

                If it is in fact helping them, yes. It would be ideal to do a study of how it affects their self control before going that direction though I think, as some argue it would do the opposite.

                • CaptainEffort@sh.itjust.works · 10 points · 1 year ago

                  If it is in fact helping them, yes

                  Okay so… we agree?

                  And yes, some would argue the opposite. But I don’t think we should be creating laws without any actual proof one way or the other.

                • CeruleanRuin@lemmings.world · 2 points · 1 year ago

                  It almost certainly "helps" as many of these people as it encourages. Hedonic adaptation is a phenomenon common to all humans, where a person indulging heavily in something that makes them feel good needs more and more extreme examples of it to maintain the same baseline of satisfaction. Any harmful compulsion, when indulged, will fall victim to this effect.

                  Providing virtual explicit images of children might mollify some, but it will have an inflaming effect on just as many others, who will seek out increasingly realistic or visceral imagery, up to and including looking for real photos and/or exploiting real children. That in turn ensures a market for child exploitation.

                  So no, it’s not harmless. Not remotely.

        • surewhynotlem@lemmy.world · 6 points · 1 year ago

          Yes, but it’s wrong for very different reasons and severities. Murder vs murder porn, if you will. Both are bad and gross, but different, and that matters.

          But that’s irrelevant to my question, which no one actually answered.

          I am curious about people’s take on the difference between human creativity from memory vs AI “creativity” from training. The porn aspect is only relevant in that it’s an edge case that makes the debate meaningful.

          There are laws today that you can’t copyright AI art, but we can copyright art that’s based on a person’s combined experiences. That seems arbitrary to me, and I’m trying to understand better.

      • Maeve@kbin.social · 4 points · 1 year ago

        It used to be nothing for parents to take pictures of their kids playing in the bath. Parents have been convicted and lost their children for it, though.

      • Dame@lemmy.ml · 1 point · 1 year ago

        If an artist is drawing naked children, and it isn't for the sake of a book or something of a similar nature, there is a problem. This is also a disingenuous comparison: an artist hasn't been trained on hundreds to millions of children's images and then fine-tuned. There's a lot of illegal content these models come across, which is then hopefully tuned out by human hands. So try another example.

  • scarabic@lemmy.world · 14 points · 1 year ago

    I imagine there’s some kind of vampiric quality to it. Kids are full of youth and innocence: things we are all constantly losing to time. Especially if someone’s own childhood was robbed from them, I think they will carry around a void they desperately want to fill but never can, because of course abusing a child doesn’t bring these things back to you. Still, many child abuse victims go on to abuse other children later in life, and this may be their drive: to seek the thing they lost. It’s beyond sad. Abusing children is straight up disgusting and terrible, but the convoluted desperation that causes people to do it is truly horrifying in a stranger-than-any-fiction kind of way.

  • Apepollo11@lemmy.world · 12 points · 1 year ago

    I’m only going to tackle the tech side of this…

    How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

    Easy. The most popular apps all filter for keywords, and I know that at least some then check the output against certain blacklisted criteria to make sure it hasn’t let something slip through.

    But…

    Anyone can host their own version and disable these features, allowing them to generate whatever they want, in exactly the same way that anyone can write their own story containing whatever they want. All you need is the determination to do it, and some modicum of ability.

    People have been creating dodgy doctored photos since long before computers. When Photoshop came out, it became easier, and with AI it's easier still. The current laws about creating and distributing indecent images still apply to these new images, though.
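
    As a rough illustration, the prompt-side keyword filtering described above amounts to checking a request against a blocklist before it ever reaches the model. The word list and normalization below are purely hypothetical; real services use far larger curated lists plus checks on the generated output:

```python
import re

# Hypothetical blocklist entries; real deployments use much larger lists.
BLOCKLIST = {"naked", "nude", "undressed"}

def prompt_allowed(prompt: str) -> bool:
    """Reject prompts containing any blocklisted word (case-insensitive)."""
    words = set(re.findall(r"[a-z]+", prompt.lower()))
    return words.isdisjoint(BLOCKLIST)

print(prompt_allowed("a person on the beach"))   # True
print(prompt_allowed("make this person Naked"))  # False
```

    As the comments here note, this is trivially defeated by rephrasing ("visible thighs, no skirt"), which is why services also check the output image itself.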

    • Adalast@lemmy.world · 2 points · 1 year ago

      Technically, the diffusers all have the ability to filter any material from the actual outputs, using a secondary CLIP analysis to see if it kicks out any keywords which indicate that a topic is in the image. From what I have seen, most AI generation sites use this method, as it is more reliable for picking up on naughty outputs than prompt analysis. AIs are horny; I play with this a lot. All you have to do is generate a woman on the beach and about 20% of them will be at least topless. Now, "woman on the beach" should not be flagged as inappropriate, and I don't believe the outputs should be either, because our demonization of the female nipple is an asinine holdover from a bunch of religious outcasts from Europe who were chased out for being TOO restrictive and prudish, but alas, we are stuck with it.

        • Adalast@lemmy.world · 1 point · 1 year ago

          You are correct, CLIP can misinterpret things, which is where human intelligence comes in. Having CLIP produce probabilities for the terminology that describes what you are looking for, then applying a bit of heuristics, can go a long way. You don't need to train it to recognize a nude child, because it has already been trained to recognize a child, and it has been trained to recognize nudity; so if an image scores high on both "nude" and "child", just throw it out. Granted, it might be a picture of a woman breastfeeding while a toddler looks on, which is inherently not child pornography, but unless that is the specific image being prompted for, it is not that big of a deal to just toss it. We understand the conceptual linking, so we can set the threshold parameters and adjust as needed.
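
          A minimal sketch of that thresholding heuristic, assuming a CLIP-style classifier has already produced per-label probabilities (the label names and threshold values here are illustrative, not from any particular model):

```python
# Given per-label scores from a CLIP-style classifier, discard any image
# that scores high on both "nude" and "child". Labels and thresholds are
# hypothetical; a real system would tune them against validation data.
def should_reject(scores: dict[str, float],
                  nude_threshold: float = 0.5,
                  child_threshold: float = 0.5) -> bool:
    """Return True if the image should be thrown out."""
    return (scores.get("nude", 0.0) >= nude_threshold
            and scores.get("child", 0.0) >= child_threshold)

print(should_reject({"nude": 0.9, "child": 0.1}))  # False: nudity alone
print(should_reject({"nude": 0.8, "child": 0.7}))  # True: both score high
```

          Because both conditions must hold before rejecting, ordinary nudity passes through, while ambiguous cases like the breastfeeding example get tossed, which is the deliberate err-on-the-side-of-discarding behavior described above.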

          As for the companies, it is a tough world surrounding it. The argument that a company that produced a piece of software is culpable for the misuse of said software is a very tenuous one. There have been attempts to make gun manufacturers liable for gun deaths (especially handguns, since they really only have the purpose of killing humans). That one I can see, as a firearm killing a person is not a "misuse"; indeed, it is the express purpose of its creation. But this would be more akin to wanting to hold Adobe liable for the child pornography that is edited in Lightroom, or Dropbox liable for someone using the Dropbox API to set up a private distribution network for illicit materials. In reality, as long as the company did not design a product with the illegal activity expressly in mind, it really shouldn't be culpable for how people use it once it is in the wild.

          I do feel like more needs to be done to make public the training data for public inspection, as well as forensic methods for interrogating the end products to figure out if they are lying and hiding materials that were used for training. That is just a general issue though that covers many of the ethical and legal issues surrounding AI training.

  • BastianAI@lemmy.world · 11 points · 1 year ago

    AI generated CSAM and CP should be treated the same, because you can’t generate anything without first having trained the model on the kind of content you want to generate.

  • Wugmeister@lemmy.dbzer0.com · 11 points · 1 year ago

    There are two parts to this problem.

    For kids who haven’t hit puberty, there is a diagnosable pedophilia disorder. This is mostly genetics. (I’m pretty sure I’ve met an alpaca that was a pedophile once.) The molester’s brain is wired wrong. Nothing to do about that. IMHO, they deserve pity as long as they keep their hands off the children.

    For teenagers, the attraction is the power dynamic. Teens have a rather distorted view on what is attractive, and they tend to be naive and easily manipulated. On top of this, almost all teenagers have next to no impulse control, and many will make very very bad decisions (even knowing that the decision is bad) if doing so might result in some form of dopamine hit via sex/adrenaline rush/video games/peer approval/etc. Adults that seek out teenagers for sexual relationships are bad people who chose to be a groomer. There is no genetic component to being a groomer, and they don’t deserve pity.

    Btw, I can flesh out my claim about the alpaca if you want, but it will have to have a tw for adorable fluffy animals suffering a horrifically slow and painful death.

    • Adalast@lemmy.world · 5 points · 1 year ago

      Info link: https://pubmed.ncbi.nlm.nih.gov/18686026/

      The DSM-5 specifies two types of pedophilia: pedophilic (victim age <11) and hebephilic (victim ages 11-14). What you are describing with the grooming is generally not pedophilia, because "children" older than 15 are generally considered post-pubescent and thus anatomically adults. Their frontal lobes still have a LOT of time needed to cook to completion, but they have those impulse-control issues for a reason, from an evolutionary standpoint. Yes, in modern society, "adults" who take advantage of the still-developing prefrontal cortex of a post-pubescent adolescent are shit human beings who don't deserve to be members of society, but they are technically not pedophiles, at least not clinically. Legally is a different story, but that is not a pertinent area of discussion right now.

      Pedophilic and hebephilic individuals generally do not ever take their impulses into the realm of reality. Many of them actually end up feeling so much shame and remorse over even having the thoughts that they commit suicide. They definitely deserve pity and treatment, not stigmatization and ostracization.

      As to the OP asking about AI art that depicts underage individuals in states of undress or sexual situations: ALL depictions of underage individuals in those contexts are illegal. By the letter of the law, if you draw stick figures on a piece of paper having sex, then label them as children, you have created child pornography. No depiction is legal, no matter the medium. AI-generated, hand-drawn, sculpted, watercolors, photos: under the law in (I believe) every state, they are all identical. Personally, I believe that this is asinine and 100% indicates that the purpose of these laws is to adjudicate morality, not to "protect the children" as all of the people who push for them claim, but that is just my opinion. Hand-drawn artwork that has no photographic source material and does not depict real people has virtually zero chance of having caused harm to any children, and AI just knows what the keywords mean in the context of reversing the noising of an image. These models weren't trained on kiddy porn; they were trained on pictures of children, and on pictures of adults doing their porny thing, so they are able to synthesize the two concepts together.

  • Ganbat@lemmyonline.com · 8 points · 1 year ago

    Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

    That’s where things get difficult. An episode of Law & Order: SVU tried to tackle this question a long time ago (but with Photoshopped fake CSAM) and the answer was a resounding “I dunno.”

    On the one hand, it’s disgusting, deplorable, etc. On the other, a fake image means no one was victimized for it.

    Does the content further radicalize these people, creating further risk of them victimizing a child, or does it sate their desires, helping to prevent them from victimizing a child? These questions are incredibly difficult to actually answer, and no answer can ever really be definitive, as you can’t really predict how any one person might react.