• BeautifulMind ♾️@lemmy.world · 1 year ago

    I’ll agree that ISPs should not be in the business of policing speech, buuuut

    I really think it’s about time platforms and publishers were held responsible for the content on their platforms, particularly when, in their quest to monetize that content, they promote antisocial outcomes like the promulgation of conspiracy theories, hate, and straight-up crime.

    For example, Meta is not modding down outright advertising and sales of stolen credit cards at the moment. Meta is also selling information with which to target voters… to foreign entities.

    • skymtf · 1 year ago

      The issue with this is that holding tech companies liable for every possible infraction will mean that platforms like Lemmy and Mastodon can’t exist, because they could be sued out of existence.

      • BeautifulMind ♾️@lemmy.world · 10 months ago

        The issue with this is holding tech companies liable for every possible infraction

        That concern was the basis for Section 230 of the 1996 Communications Decency Act, which is in effect in the USA but is not the law in places like, say, the EU. It made sense at the time, but today it is desperately out of date.

        Today we understand that in absolving platforms like Meta of their duty of care to take reasonable steps not to harm their customers, we guaranteed that their profit motive would guide them to look the other way when their platforms are used to disseminate vaccine disinformation that gets people killed, that the money would have them protecting Nazis, and that algorithms intended to promote engagement would become a tool not just for advertisers but for propagandists and information-warfare operations.

        I’m not particularly persuaded that reform of Section 230 of the Communications Decency Act in the US would doom nonprofit social media like most of the fediverse. If you look around at all, most of it already follows a well-considered duty-of-care standard that gives its operators substantial legal protection from liability for what third parties post to their platforms. And if you consider even briefly, that is the standard in effect in much of Europe, and social media still exists there; it’s just less profitable and has fewer Nazis.

        • skymtf · 1 year ago

          I feel like you can’t really change 230; you need to legislate differently instead. There is room for more criminal liability when things go wrong, I think. But civil suits in the US can be really bogus. Like, without Section 230, someone could likely sue a Mastodon instance for turning their kid trans, and win.

          • BeautifulMind ♾️@lemmy.world · 1 year ago

            I’m with you on the legislate differently part.

            The background of Section 230(c)(2) is an unfortunate 1995 court ruling that held that if you moderate any content whatsoever, you should be regarded as its publisher (and therefore ought to be legally liable for whatever awful nonsense your users put on your platform). This perversely created an incentive for web-forum operators (and a then-fledgling social media industry) not to moderate content at all in order to gain immunity from liability, and that in turn transformed broad swathes of the social internet into an unmoderated cesspool full of Nazis and conspiracy theories and vaccine disinformation, all targeting people without the critical-thinking faculties to really process it responsibly.

            The intent of 230(c)(2) was to make platform operators feel safe to moderate harmful content, but it also protects them if they don’t. The result is a wild west, if you will, in which it’s perfectly legal for social media operators in the USA to look the other way when known unlawful use of their platforms (like advertising stolen goods, or sex trafficking, or coordinating a coup attempt, or making porn labeled ‘underage’ searchable) goes on.

            It was probably done in good faith, but in hindsight it was naïve and carved out the American internet as a magical zone of no-responsibility.

            • skymtf · 1 year ago

              This is not really what 230 does; sites still face criminal liability where needed. If I made a site that hosted illegal content, I could still be arrested and have my server seized. Repealing 230 would legit just let Ken Paxton launch a multi-state lawsuit suing a long list of queer Mastodon instances for transing minors. Without 230 it would be lawsuit land, and sites would censor anything that wasn’t cat photos in an effort to avoid getting sued. Lawsuits are expensive even when you win. If you wanna make social media companies deal with something, you gotta set up criminal liability, not repeal 230. Section 230 just protects sites from civil suits, not criminal ones.

        • skymtf · 1 year ago

          I think generally we need to regulate how algorithms work, if this is the case. We need actual legislation and not just lawsuit buttons. Also, Meta can slither its way out of any lawsuit; this would really only affect small Mastodon instances.

    • CeeBee@lemmy.world · 1 year ago

      The problem is that your definitions are incredibly vague.

      What is a “platform” and what is a “host”?

      A host, in the technical sense, could mean a hosting company that you “host” a website with. If it’s a private website, how would the hosting company moderate that content?

      And that’s putting aside the legality and ethics of one private company policing not only another private company, but also one that’s a client.

      • BeautifulMind ♾️@lemmy.world · 1 year ago

        Fair point about hosts; I’m talking about platforms as if we held them to the standards we hold publishers to. Publishing is protected speech so long as it’s not libelous or slanderous, and the only reason we don’t hold social media platforms to that kind of standard is that they demanded (and received) complete unaccountability for what their users put on them. That seemed okay as a choice to let social media survive as a new form of online media, but the result is that for-profit social media, being the de facto public square, have all the influence they want over speech but no responsibility to use that influence in ways that aren’t corrosive to democracy or the public interest.

        Big social media already censor content they don’t like; I’m not calling for censorship in an environment that has none. What I’m calling for is some sort of accountability to nudge them in the direction of maybe not looking the other way when offshore troll farms and botnets spread division and disinformation.