It not only has problems with CSEM (the real-life stuff), but there are now bots spamming it, and Twitter has made reporting it a chore.

“Please provide more context”? WTF, it’s literally just CSEM plus a link I won’t click even if my life depends on it!

I’m quite sad, since a lot of the creators I follow are still only there, or on the boneless fediverse app BlueSky (which is worse in some ways). I still need to keep my account around just to protect my user handle and to look things up from time to time.

Once I’m home from work, I’m locking my account and putting up a farewell message for whoever might miss me.

I’m not saying that the fediverse is perfect, far from it (especially certain segments of Lemmy), but it’s a way better experience than whatever Xitter (or Reddit, for that matter) is trying to be. I even have more reach, especially since the whole paid blue checkmark thing.

  • Melmi · 7 months ago

    CSAM is supposed to make it more explicit that the images are essentially crime scene photographs, and to emphasize that it is Abuse first and foremost, not merely pornography.

    CP is a morally neutral term, or at least the component words themselves are. CSAM is not, and is explicitly negative.

    • UraniumBlazer@lemm.ee · 7 months ago

      Hmm… I mean I’m not challenging this explanation, but I’m just a little curious about this I suppose? So starting from when I was like 13-14, I regularly sent and received nudes of other people my age I met on gay forums n shit. Uk… Sexting n stuff. Now I know that this could’ve gone incredibly ugly had I been deanonymized n stuff. But I mean… I had fun at the time and am in contact (not that regular tho) with some of these guys (and I’m an adult now).

      I had fun at the time and was not coerced into anything by anyone. I was just a horny teen with an outlet, and so were they. How’s this abuse? Like, who’s the abuser? I’m sure it wasn’t us, as no one coerced anyone into doing anything.

      • Melmi · 7 months ago

        That’s why CSEM is used, I guess: if the images are being shared around, exploitation has clearly occurred. I can see where you’re coming from, though.

        What I will say is that there are some weird laws around it, and there have even been cases where kids have been convicted of producing child pornography… of themselves. It’s a bizarre situation. If anything, it seems like abuse of the court system at that point.

        Luckily a lot of places have been patching the holes in their laws.