BDSM, LGBTQ+, and sugar dating apps have been found exposing users’ private images, with some of them even leaking photos shared in private messages.

  • Balder@lemmy.world · 113 points · 2 days ago

    Brace yourselves, because this is only going to get worse with the current “vibe coding” trend.

      • Vendetta9076@sh.itjust.works · 79 points · 2 days ago

        Vibe coding is the current trend of having an LLM build your codebase for you, then shipping it without attempting to understand it.

        Most developers use LLMs to some extent to speed up their coding, since tools like Cursor and Claude are really good at removing toil. But vibe coders have the LLM build the entire thing and don’t even know how it works.

        • Elle@lemmy.world · 43 points · 2 days ago

          In other words, vibe coders are today’s technologically accelerated script kiddie.

          That’s arguably worse, as the produced scripts may largely work while demanding even less understanding than a script kiddie’s cobbling-together of code ever did.

          • TeddE@lemmy.world · 19 points · 2 days ago

            Large language models (LLMs) are the product of neural networks, a relatively recent innovation in machine intelligence.

            Since these systems are surprisingly adept at producing natural-sounding language, and are good at creating answers that sound correct (and sometimes actually happen to be), marketers have seized on this innovation, called it AI (a term with a complicated history), and started slapping it onto every product.

          • qaz@lemmy.world · 4 points · 2 days ago (edited)

            A machine learning model that can generate text.

            It works by converting pieces of text into “tokens”, which are mapped to numbers in a way that reflects their association with other pieces of text. The model is fed input tokens and predicts the next tokens from them, which are then converted back into text.
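
            A minimal sketch of that token step, using OpenAI’s tiktoken library (one concrete tokenizer; other models ship their own vocabularies):

            ```python
            import tiktoken  # pip install tiktoken

            enc = tiktoken.get_encoding("cl100k_base")
            ids = enc.encode("Vibe coding is a trend")  # text -> token numbers
            print(ids)              # a short list of integers
            print(enc.decode(ids))  # numbers -> back to the original text
            ```

            The model itself only ever sees and predicts those integers.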

          • qyron@sopuli.xyz · 4 points · 2 days ago (edited)

            Large Language Model

            To the extent of my understanding, it’s a slightly more sophisticated bot, i.e. an automated response algorithm, trained on a set of data so that it “understands” the mechanics that make that set cohesive to us humans.

            With that background, it’s supposed to produce similar new outputs when given new raw data to run through the mechanics it acquired during training.

          • spooky2092 · 8 points · 2 days ago

            Boring/repetitive work. For example, I regularly use an AI coding assistant to stub out our basic loop templates with the variables filled in, or to quickly finish repetitive case statements or assign values to an object with a bunch of properties.

            In little things like that, it’s great. But once you get past a medium-sized function, it goes off the rails. I’ve had it make up parameters in stock library functions based on what I asked it for.
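
            To make that concrete, here’s a hypothetical example of the kind of repetitive mapping an assistant will finish from a line or two of context (all names made up):

            ```python
            # Hypothetical DTO mapping: classic assistant-friendly boilerplate.
            def user_to_dto(user):
                return {
                    "id": user.id,
                    "name": user.name,
                    "email": user.email,
                    "created_at": user.created_at,
                    # ...the assistant happily fills in the remaining fields
                }
            ```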

      • Little8Lost@lemmy.world · 7 points · 2 days ago

        It’s going to be 1 GB of node_modules handled by garbage AI code.
        AI is only good at smaller scripts; it loses track of connections and understanding in larger codebases. Combine that with people who can’t program well (and I mean not only coding but debugging, etc.), also called vibe programmers, and it’s going to be a mess.

        If a product claims it was vibe coded: find an alternative!

        • msage@programming.dev · 9 points · 2 days ago

          I’m losing my will to live lately at an alarming rate.

          I used to love IT, way back at the start of 00s.

          Soon after the ’10s started, I noticed bullshit trends replacing one another… like crypto or clouds or SaaS… but now with AI I just feel alienated. Like we’re all going to hell, and I hate having a front-row seat.

          • Balder@lemmy.world · 4 points · 2 days ago

            At this point, I think it’s necessary to have a sort of alternate identity online and keep anything private (photos of yourself and other personal information) offline entirely. Except for government stuff, which requires your real identity.

            • msage@programming.dev · 2 points · 1 day ago

              I mean, yeah, I self-host everything, but I hate that I have to learn and support the most useless shit ever just to earn a living.

              It used to be fun being a dev, now I’m just repeating the same warning phrases about technologies.

  • MissGutsy · 164 points · 2 days ago (edited)

    Cybernews researchers have found that BDSM People, CHICA, TRANSLOVE, PINK, and BRISH apps had publicly accessible secrets published together with the apps’ code.

    All of the affected apps are developed by M.A.D Mobile Apps Developers Limited. Their identical architecture explains why the same type of sensitive data was exposed.

    What secrets were leaked?

    • API Key
    • Client ID
    • Google App ID
    • Project ID
    • Reversed Client ID
    • Storage Bucket
    • GAD Application Identifier
    • Database URL

    […] threat actors can easily abuse them to gain access to systems. In this case, the most dangerous of leaked secrets granted access to user photos located in Google Cloud Storage buckets, which had no passwords set up.

    In total, nearly 1.5 million user-uploaded images, including profile photos, public posts, profile verification images, photos removed for rule violations, and private photos sent through direct messages, were left publicly accessible to anyone.

    So the devs were inexperienced in secure architecture and put a bunch of stuff on the client side that should have been on the server side. That left their API open for anyone to access every picture on their servers. They then made multiple dating apps on this faulty infrastructure by copy-pasting it everywhere.
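
    To illustrate the failure mode: once the bucket name has shipped inside the app, an unprotected Cloud Storage bucket can be read by anyone, with no credentials at all. A minimal sketch using the google-cloud-storage Python client (the bucket name is a made-up placeholder):

    ```python
    from google.cloud import storage  # pip install google-cloud-storage

    # Anonymous client: no API key, no login.
    client = storage.Client.create_anonymous_client()
    bucket = client.bucket("some-dating-app.appspot.com")  # hypothetical name

    # Listing (and downloading) succeeds only because the bucket
    # was left without any access controls.
    for blob in bucket.list_blobs(max_results=10):
        print(blob.name)
    ```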

    I hope they are registered in a country with strong data privacy laws, so they have to feel the consequences of their mismanagement.

      • sugar_in_your_tea@sh.itjust.works · 11 points · 2 days ago

        No, it’s lack of experience. When I was a junior dev, I had a hard enough time understanding how things worked, much less understanding how they could be compromised by an attacker.

        Junior devs need senior devs to learn that kind of stuff.

        • PumaStoleMyBluff@lemmy.world · 2 points · 1 day ago

          It does help if services that generate or store secrets and keys display a large warning that they should be kept secret, every time they’re viewed, no matter the experience level of the viewer. But yeah, understanding why and how isn’t something that should be assumed for new devs.

    • taiyang@lemmy.world · 19 points · 2 days ago

      I’ve met the type who run businesses like that, and they likely do deserve punishment for it. My own experience involved someone running gray-legality betting apps; the owner was a cheapskate who used unpaid interns and outsourced Filipino workers to build his app. The guy didn’t even pay ’em sometimes.

      Granted, you could also hire inexperienced people if you’re a good person with no financial investor, but I’ve mostly seen that with education apps and other low-profit endeavors. Sex stuff is definitely someone trying to score cash.

    • azalty@jlai.lu · 6 points · 2 days ago

      The illusion of choice

      A lot of “normal” dating apps are also owned by the same companies

    • Rexios@lemm.ee · 1 point · 2 days ago

      Every single one of those “secrets” is publicly available information for every single Firebase project. The real issue is the developers didn’t have proper access control checks.
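
      For context, this is roughly the shape of a Firebase client config (all values hypothetical). Every install of an app carries these identifiers by design, so they can’t act as credentials; the protection has to come from server-side checks:

      ```python
      # Roughly what ships inside every Firebase client app (values made up).
      # These identifiers route requests to the right project; they don't gate access.
      firebase_config = {
          "api_key": "AIzaSy-EXAMPLE",
          "project_id": "example-dating-app",
          "storage_bucket": "example-dating-app.appspot.com",
          "app_id": "1:123456789012:android:0123456789abcdef",
      }
      # Access control belongs server-side, e.g. in Firebase Security Rules
      # or Cloud Storage IAM, not in trying to hide these strings.
      ```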

  • CheeseToastie@lazysoci.al · 68 points · 2 days ago

    This is devastating. The LGBT community are often hiding their true selves because of family, colleagues, culture etc. People will be destroyed.

  • PumaStoleMyBluff@lemmy.world · 13 points · 2 days ago

    Anyone who uses Grindr, please be aware that any photos you send are cached and stored unencrypted in plain old folders on the receiver’s phone, regardless of whether they were expiring or in an album that you later revoked. It’s nearly trivial to grab any photo someone sends you, with no watermark or screenshot notification.

  • azalty@jlai.lu · 16 points · 2 days ago (edited)

    Use Signal or SimpleX for more private stuff like this 👀

  • thatradomguy@lemmy.world · 3 points · 2 days ago

    Just don’t send nudes… why do people think other people won’t figure out how to screenshot or just keep photos forever? Even if you trust the person, the person could get hacked… the pwned guy got pwned, for Jehovah’s sake. Just stop sending that shit.