• backgroundcow@lemmy.world · 2 points · 13 hours ago

    I very much understand wanting to have a say against our data being freely harvested for AI training. But this article’s call for a general opt-out of interacting with AI seems a bit regressive. Many aspects of this and other discussions about the “AI revolution” remind me about the Mitchell and Web skit on the start of the bronze age: https://youtu.be/nyu4u3VZYaQ

  • snooggums@lemmy.world · 117 points · 2 days ago

    I disagree with the base premise that being able to opt out needs to be a right. That implies that having data be harvested for companies to make profits should be the default.

    We should have the right to not have our data harvested by default. Requiring companies to have an opt-in process, with no coercion or other tactics that make people feel obligated to opt in, is our right.

    • ItsComplicated@sh.itjust.works · 44 points · 2 days ago

      > being opt out needs to be a right. That implies that having data be harvested for companies to make profits should be the default.

      As the years have passed, it has become the accepted consensus that all of your personal information, thoughts, and opinions should be freely available to anyone, at any time, for any reason, so that companies can profit from it.

      People keep believing this is normal and companies keep taking more. Unless everyone is willing to stand firm and say enough, I only see it declining further, unfortunately.

    • taladar@sh.itjust.works · 3 points · 2 days ago

      > We should have the right to not have our data harvested by default.

      I would maybe not go quite that far but at the very least this should apply to commercial interests and living people.

      I think there are some causes where it should be acceptable to have your data usable by default, e.g. statistical analysis of health threats (think those studies about the danger of living near a coal power plant or similar things).

      • sugar_in_your_tea@sh.itjust.works · 4 points · 2 days ago

        I disagree. Yes, there are benefits to a lot of invasions of privacy, but that doesn’t make it okay. If an entity wants my information, they can ask me for it.

        One potential exception is for dead people: I think it makes sense for a lot of information to be released on death, and preventing that should be opt-in by the estate/survivors, depending on the will.

        • taladar@sh.itjust.works · 1 point · 2 days ago

          But they literally can’t ask you for it when it’s about high volumes of data that only become useful if you have all, or close to all, of it, like statistical analysis of rare events. It would be prohibitively expensive to ask hundreds of thousands of people just to figure out that there is an increase in e.g. cancer or some lung disease near coal power plants.

          • sugar_in_your_tea@sh.itjust.works · 1 point · 1 day ago

            They don’t need most of the data; they need a statistically significant sample to have high confidence in the result. And that’s a small percentage of the total population.

            And you could have something on file where you opt in to such things, just like you can opt in to being an organ donor. Maybe make it opt out if numbers are important. But it cannot be publicly available without a way to say no.
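The sample-size point above can be sketched with a back-of-envelope calculation. This is a minimal illustration using Cochran’s standard formula for estimating a proportion; the confidence level, margin of error, and population figures are hypothetical, not taken from the thread.

```python
import math

def sample_size(z: float, margin: float, p: float = 0.5) -> int:
    """Cochran's formula: respondents needed to estimate a proportion p
    to within +/- margin, at the confidence level implied by z."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# 95% confidence (z ~ 1.96), +/-1% margin, worst-case p = 0.5:
n = sample_size(1.96, 0.01)
print(n)  # on the order of 9,600 respondents
```

Notably, the result barely depends on population size: roughly the same few thousand consenting respondents suffice whether the population is one million or one hundred million, which is why an opt-in registry (like organ-donor status) could plausibly supply enough data.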

      • snooggums@lemmy.world · 3 points · 2 days ago

        > That implies that having data be harvested for companies to make profits should be the default.

        I sure hope those studies are not being done by for profit companies!

    • General_Effort@lemmy.world · 3 points · 2 days ago

      > We should have the right to not have our data harvested by default.

      How would that benefit the average person?

      • FourWaveforms@lemm.ee · 1 point · 21 hours ago

        By giving us the choice of whether someone else should profit by our data.

        Same as I don’t want someone looking over my shoulder and copying off my test answers.

        • General_Effort@lemmy.world · 1 point · 8 hours ago

          > By giving us the choice of whether someone else should profit by our data.

          What benefit do you expect from that?

          > Same as I don’t want someone looking over my shoulder and copying off my test answers.

          Why not?

          • FourWaveforms@lemm.ee · 1 point · edited · 1 hour ago

            I prefer that the benefits of those things accrue to me, or to others, or to no one, in accordance with my choice.

            In this way, I would decide who gains the economic or social benefits of these activities of mine; and I also, in the case of personal data, would decide who gets to make my business, their business.

      • snooggums@lemmy.world · 8 points · 2 days ago

        Send me your name, birthdate, web browsing history, online spending history, real time location, and a list of people you know and I will explain it to you.

    • T156@lemmy.world · 9 points · 2 days ago

      Remind me in 3 days.

      Although poison pills are only so effective, since it’s a cat-and-mouse game: they only really work against a specific version of a model, and other models work around them.

    • Loduz_247@lemmy.world · 8 points · 2 days ago

      But do Glaze, Nightshade, and HarmonyCloak really work to prevent that information from being used? They may be effective at first, but then ways around those barriers will be found, the software will have to be updated, and only the side with the most money will win.

        • Loduz_247@lemmy.world · 3 points · 2 days ago

          AI has been around for many years, dating back to the 1960s. It’s had its AI winters and AI summers, but now it seems we’re in an AI spring.

          But the amount of poisoned data is minuscule compared to the data that isn’t poisoned. As for data, what data are we referring to: everything in general or just data that a human can understand?

    • Zenith@lemm.ee · 5 points · 2 days ago

      I’ve deleted pretty much all social media; I’m down to only Lemmy. I only use my home PC for gaming, like Civ or Cities: Skylines, or search engines for things like travel plans. I’m trying to be as offline as possible because I don’t believe there’s any other way to opt out, and I don’t believe there ever will be. Opting out of the internet is practically impossible, and AI will get to that point as well.

  • KeenFlame@feddit.nu · 7 points · 1 day ago

    Ah yes. The “freedom” the USA has spread all over its own country and other nations… Yes, of course we must protect that freedom, which is of course a freedom for people to avoid getting owned by giant corporations. We must protect the freedom of giant corporations to not give us AI if they want to. I don’t disagree, but I think people are more important.

  • Oxysis/Oxy · 14 points · 2 days ago

    Is it really though? I haven’t touched it since the very early days of AI slop. That was before I learned how awful it is to real people.

  • RvTV95XBeo@sh.itjust.works · 2 points · 1 day ago

    If AI is going to be crammed down our throats, can we at least be able to hold it (aka the companies pushing it) liable for providing blatantly false information? At least then they’d have an incentive to provide accurate information instead of just authoritative-sounding information.

    • Womble@lemmy.world · 1 point · 12 hours ago

      As much as you can hold a computer manufacturer responsible for buggy software.

  • fxdave@lemmy.ml · 5 points · 2 days ago

    The problem is not the tool. It’s the inability to use the tool without a third party provider.

  • smarttech@lemmy.world · 1 point · 2 days ago

    AI is everywhere now, but having the choice to opt out matters. Sometimes, using tools like Instant Ink isn’t about AI; it’s just about saving time and making printing easier.

    • NotASharkInAManSuit@lemmy.world · 2 points · 2 days ago

      Yes. That is actually an ideal function of ethical AI. I’m not against AI in regard to things it is actually beneficial toward, where it can be used as a tool for understanding; I just don’t like it being used as a thief’s tool pretending to be a paintbrush or a typewriter. There are good and ethical uses for AI; art is not one of them.