Not only does it have problems with CSEM (the real-life stuff), but there are now bots spamming it, and Twitter has made reporting it a chore.

“Please provide more context”? WTF, it’s literally just CSEM plus a link I won’t click even if my life depends on it!

I’m quite sad, since a lot of the creators I follow are still only there, or on BlueSky, the boneless fediverse app (which is worse in some ways), and I still need to keep the account around just to protect my user handle and to look things up from time to time.

Once I’m home from work, I’m locking my account and putting up a farewell message for whoever might miss me.

I’m not saying the fediverse is perfect, far from it (especially certain segments of Lemmy), but it’s a way better experience than whatever Xitter (or Reddit, for that matter) is trying to be. I even get more reach there, especially since the whole paid blue checkmark thing.

  • UraniumBlazer@lemm.ee · 15 points · 8 months ago

    I still don’t get why we stopped calling it CP and switched to CSAM or CSEM or whatever it is now.

    • thrawn@lemmy.world · 13 points · 8 months ago

      Yeah same. I first remember hearing it when Apple was planning that amazingly invasive local scanning of user images. Now it seems to be everywhere.

      I’m not against it, though. CP could’ve referred to multiple things, and this term is a lot less ambiguous once you know it. CP wasn’t particularly intuitive either; it was no easier to decipher, it’s just that years of use meant many people knew it. So it’s an upgrade overall, I think.

      Another benefit is that it includes “abuse” in the name. That’s important, and it makes it much harder for the people who seek that stuff out to co-opt the term the way they did with CP.

      • damon@lemmy.world · 15 points · 8 months ago

        They stopped calling it CP because porn is typically consensual, and that’s what people think of when they hear “porn”: two legal adults, not a situation where one party cannot consent. To avoid any confusion, or any possibility of downplaying its severity, they changed the term.

      • Ashe · 2 points · 8 months ago

        I’ve also seen it used as a niche industry abbreviation, which made me very uncomfortable at first, even though that sense has fallen out of use.

    • ZILtoid1991@lemmy.world (OP) · 2 points · 8 months ago

      I think it’s partly because of the definition of porn: pedos were like “but what about erotica?”, and there was CSEM that used such loopholes to get around bans, essentially by claiming the images were artistic nudes.

    • Melmi · 2 points · edited · 8 months ago

      CSAM is supposed to be more explicit that the images are essentially crime scene photographs, and to emphasize that it is Abuse first and foremost and not merely pornography.

      CP is a morally neutral term, or at least its component words are. CSAM is not; it is explicitly negative.

      • UraniumBlazer@lemm.ee · 2 points · 8 months ago

        Hmm… I mean, I’m not challenging this explanation, but I’m just a little curious about this, I suppose? So starting from when I was like 13-14, I regularly sent and received nudes with other people my age I met on gay forums n shit. Uk… sexting n stuff. Now I know this could’ve gotten incredibly ugly had I been deanonymized n stuff. But I mean… I had fun at the time, and I’m still in contact (not that regularly, tho) with some of these guys (and I’m an adult now).

        I had fun at the time and was not coerced into anything by anyone. I was just a horny teen with an outlet, and so were they. How’s this abuse? Like, who’s the abuser? I’m sure it wasn’t us, since no one coerced anyone into doing anything.

        • Melmi · 3 points · 8 months ago

          That’s, I guess, why CSEM is used: if the images are being shared around, exploitation has clearly occurred. I can see where you’re coming from, though.

          What I will say is that there are some weird laws around it, and there have even been cases where kids have been convicted of producing child pornography… of themselves. It’s a bizarre situation; if anything, it seems like abuse of the court system at that point.

          Luckily, a lot of places have been patching the holes in their laws.