• RobotToaster@mander.xyz

Doesn’t Instagram claim messages are E2E encrypted? How can this work without them having access to all messages?

  • RmDebArc_5@lemmy.ml

One question: if they know those are minors, and they know the pictures are nudes, why the hell don’t they just ban the accounts that try to send nudes to minors? Also, who the hell thinks it is a good idea to send nudes to Meta?

    • Quik@infosec.pub

I would suspect it’s because there is probably room for error in the detection system.

    • TimeSquirrel@kbin.social

Also, who the hell thinks it is a good idea to send nudes to Meta?

It was eye-opening when I realized I’m the only one in my circle who gives a shit about online privacy. You and I and most of the Fediverse are a rare minority. This is normal to people now. If you told people in the 90s about this they’d rightfully call it a dystopia. I remember my mother being super paranoid about me going online back then. Boiling frog situation here.

      • sugar_in_your_tea@sh.itjust.works

Which is kinda funny, because Lemmy is really bad for privacy since pretty much everything is open. If you want to see how people vote, just make your own instance and collect it all.

        Lemmy is relatively anonymous, but not private. It’s still way better than anything Meta does.
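        For anyone wondering what “collect it all” would look like in practice: Lemmy federates votes as ActivityPub Like/Dislike activities delivered to other instances’ inboxes, so an instance operator can simply log them as they arrive. The sketch below shows only the receiving side; it skips HTTP-signature verification and everything else a real instance must do, and the endpoint and field handling are simplified assumptions, not Lemmy’s actual code.

```python
# Toy ActivityPub inbox that records who voted on what.
# Simplified sketch: a real Lemmy-compatible instance must verify HTTP
# signatures, dereference actors, and implement the rest of the protocol.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

vote_log = []  # (actor URL, vote type, voted-on object URL) tuples

class Inbox(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        activity = json.loads(body)
        if activity.get("type") in ("Like", "Dislike"):
            # Federated votes arrive with the voter's actor URL attached.
            vote_log.append((activity.get("actor"),
                             activity.get("type"),
                             activity.get("object")))
        self.send_response(202)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), Inbox).serve_forever()
```

        Nothing stops an instance operator from keeping that log forever, which is the sense in which Lemmy voting is pseudonymous rather than private.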

    • andrew_bidlaw@sh.itjust.works

      why the hell don’t they just ban the accounts that try to send nudes to minors?

Because they are sent from minors to minors too? Teens are horny; they copy adults making nudes, or sometimes just share porn. Recently there were problems with classmates using AI porn generators to “undress” their peers. The abuse problem is harsher, but I feel it accounts for a minority of the nudes minors receive. Honestly, I’d change the EULA to forbid it on a public service like Insta, because unlike messengers, it exposes everything an abuser needs to deanonymize and explicitly target someone, including stalking and threats IRL. For Insta, there could be a rule banning <18 y.o. users from uploading images to Direct, allowing only reposts, meaning the images are publicly available and can be reported by other users and taken down under existing policies without breaking E2EE.

      • RmDebArc_5@lemmy.ml

        Because they are sent from minors to minors too?

This could differ depending on the country, but in Germany that would still be illegal. I don’t think a rule like the one you suggest would ever happen unless forced by law.

        • andrew_bidlaw@sh.itjust.works

I haven’t heard of that law being strictly enforced, though. For one reason: teens are stupid and don’t know the laws, even though the laws apply to them. But yeah, most civilized places have laws against producing porn of minors regardless of the producer’s age, though some carve out exceptions when it’s produced by a consenting party, of themselves, and without a big age difference.

          • Quik@infosec.pub

The “problem” here (if you see it as one) is that if law enforcement in Germany learns about a case like this, they cannot choose not to act on it.

    • General_Effort@lemmy.world

Because not everyone lives in ~~Saudi Arabia or Texas~~ a country dominated by religious conservatives?

      ETA: I’m sorry. I shouldn’t have made it about specific countries.

        • Borkdornsorkpor@lemmy.ml

If an 18-year-old girl sends a spicy picture to her 17-year-old boyfriend, that probably shouldn’t warrant the same reaction as a 15-year-old receiving porn in her DMs from some 40-year-old stranger. Black-and-white thinking typically doesn’t lead to fair policies, and implying that someone wants to creep on random children because they recognize that nuance exists seems a little disingenuous.

          • General_Effort@lemmy.world

If an 18-year-old girl sends a spicy picture to her 17-year-old boyfriend

            If the 17-year-old sends one back, that’s already a crime in many places. It could get him prosecuted for production and her for possession. If he uploads the picture to a chat group, the real fun starts.


The article says this is about nudity, not selfies. IDK how kids use Instagram, but I’d guess that the senders are mainly kids sharing horny pics from somewhere else.

    • Liz@midwest.social

      From an ethical standpoint I would say teenagers should be allowed to send each other nudes, but from a corporate liability standpoint I don’t wanna have anything to do with that.

  • Red_October@lemmy.world

    So… they can identify when someone in a conversation is a minor. And they can identify when nudes are being sent. But when these two are combined, they figure just blurring the image is the appropriate solution?

    • UraniumBlazer@lemm.ee

      Perhaps to avoid false positives? I think it’s telling the minor, “hey, this might be a dick. Open only if you trust the person”.

  • EvilBit@lemmy.world

    Yeah, this is definitely gonna work, as if I haven’t been over 18 years old since I was 12 years old, according to every birthdate question ever.

  • BearOfaTime@lemm.ee

    Wouldn’t not permitting minors to use the service at all make this issue moot?

  • qprimed@lemmy.ml

    lots of comments about e2e encryption (or the potential lack thereof)

    even if it is e2e encrypted (and I mostly believe it is), once it's decrypted on your device (in their app) it's in the clear. there is nothing technical preventing the app from then inspecting the data or forwarding it to another party for analysis - that's a "terms and conditions" issue.

    the article claims they are doing some on-device recognition - that's likely computationally non-trivial, with variable accuracy (false positives/negatives, anyone?) and probably at least partially circumventable, perhaps even exploitable (more app surface area to attack).

    so, ok… it's a lead-in to classifying content on your device. I have no idea what comes next, but I am pretty sure there will be a next, and this is why I don't intentionally use any Meta products.
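
    To make the "in the clear on your device" point concrete, here is a minimal sketch of what client-side scanning after E2E decryption could look like. It is not Meta's implementation; the message type, classifier stub, and threshold are all made up for illustration.

```python
# Hypothetical sketch of client-side scanning after E2E decryption.
# Names and the 0.8 threshold are illustrative only; the classifier is a
# stub, not Meta's actual model or API.

from dataclasses import dataclass

@dataclass
class DecryptedMessage:
    recipient_is_minor: bool
    image_bytes: bytes  # plaintext, already decrypted on this device

def nudity_score(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML classifier returning a confidence score."""
    return 0.0  # a real implementation would run model inference here

def handle_incoming(msg: DecryptedMessage) -> str:
    # E2EE only protects the ciphertext in transit. Once the app holds the
    # plaintext it can inspect it, blur it, warn the user, or (policy
    # permitting) send derived signals elsewhere.
    if msg.recipient_is_minor and nudity_score(msg.image_bytes) > 0.8:
        return "blur_and_warn"   # roughly what the article describes
    return "show_normally"

print(handle_incoming(DecryptedMessage(recipient_is_minor=True, image_bytes=b"...")))
```

    The decision happens entirely on the plaintext side of the E2EE boundary, which is why a feature like this doesn't technically break the encryption while still classifying the content.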

    • Lutra@lemmy.world

Which is an end-run around E2E. Saying ‘the message is encrypted’ while still looking at every message before and/or after encryption violates the expectation of E2E.

      • BearOfaTime@lemm.ee

I’ve said this from the start, and people called me names or said “prove it”. Sigh.

        If the capability is there, that’s a problem.

  • Justin@lemmy.jlh.name

    Honestly seems like a healthy feature. Everything is supposedly on-device, so it’s not like the AI police are banning anything, just smartly giving tools and advice to vulnerable people.

  • h_ramus@lemm.ee

    What about images sent from Japan? Aren’t they all pixelated by default? /s

  • werefreeatlast@lemmy.world

    Daddy, what’s noodity? Oh! So peepee pictures is nudity? How come I can’t see all the peepee in my phone?

Anyway, to prevent this conversation, maybe label the images “content not appropriate or not allowed”. It works for Mastodon. We literally can’t see a tit or dick unless we double-click on the fuzzy image. So why only minors? Just add a switch for everyone.