• HandsHurtLoL · 19 points · 11 months ago

    The reason left wingers don’t have the presence the right does on Facebook, YouTube, etc. isn’t a lack of voices or audiences - it’s deliberate manipulation of what is put in front of people.

    I recently have gotten into wasting tons of hours on YouTube Shorts, and I was very surprised that after a grand total of maybe 12 hours of using the platform, Andrew Tate content was just shoehorned into the shorts being presented to me. Up to that point I had been watching cosmetics, baking cookies, comedy, cooking, just funny hot takes, but then one day, completely out of the blue, that guy’s ugly ass monkey face was on my phone. Even though it flashed by so quickly that I couldn’t even think of his name, my lizard brain already recognized that he is very dangerous to women, so I opened the menu and used the YouTube feature that prevents a channel from ever being promoted to me again.

    There is a 0% chance that the content I had previously been watching links up in the algorithm to Ben Shapiro, Jordan Peterson, Joe Rogan, or Andrew Tate. This leads me to believe that YouTube intentionally carves out space for these content creators and promises to get their content in front of everybody’s eyeballs, regardless of a viewer’s interest in that type of content.

    • @LostMyRedditLogin@lemmy.world · 9 points · 11 months ago

      Once an algorithm is understood, it can be manipulated. Russia and the right wing figured out how to tie Trump videos to other popular videos so they get recommended to unsuspecting users. I get recommended right-wing videos while watching unrelated videos. It’s algorithm manipulation, and it’s an endless battle for Google.

    • Rashnet · 4 points · 11 months ago

      I watch a lot of YouTube content and almost none of it is political either way; mostly it’s how-to videos, science videos, or music videos. I’ve noticed two things lately concerning alt-right recommendations. First, I get alt-right videos recommended when I visit a creator who, in addition to their “normal” content, has one or more videos espousing right-wing commentary. I recently fell down a rabbit hole of watching reaction videos of people reacting to a band I like, and watched a reaction from a channel called LFR Family. Unknown to me, since I was only there for one music reaction, this channel is run by very pro-Trump people and has several political videos. After I finished the video and went back to the YT home page, every other video was a right wing propaganda piece. The second way I’ve started to get right wing recommendations is by watching several stand-up comedy videos in a row.

      • HandsHurtLoL · 1 point · 11 months ago

        What’s kind of weird to me is that the very little political content I watch is ostensibly left wing.

        I will echo what another user in this thread said and assume that it’s because of the stand-up comedy I’ve seen. However, I usually swipe past the clearly right wing comedians or the people whose jokes are just bitching about wokism.

        • Rashnet · 3 points · 11 months ago

          What’s kind of weird to me is that the very little political content I watch is ostensibly left wing.

          Same for me. I assume the propagandists have found a correlation between stand-up and people they can influence. I don’t watch right wing stand-up either; in fact, I purposely click “don’t recommend” on anything right leaning when I do see it.

          On a semi-related note, I’ve just noticed in the last 4 days that there are a bunch of rap and black reaction channels giving a favorable reaction to that country song that was just taken off CMT. I actually took the time and watched some of them in an incognito tab, and the reactions are suspiciously similar. The hundreds of comments on these videos are all the same and come from users with names like suziwpgt3 or other bot-type names. I can’t put into words the dismay I feel about the danger of people feeling validated by these obviously faked or paid-for reviews.

          • HandsHurtLoL · 1 point · 11 months ago

            Wow, that’s on another level.

            So since starting this thread, I had to change my login to a professional account on my phone, and then after I was done I switched back to my personal account. All of my “do not recommend” settings have completely reset. I’m seeing videos in Shorts that were shown to me two months ago. I’m getting recommendations for Ben Shapiro, his fucking sister, people doing interviews with Jeffree Star and his problematic ass. I can’t believe that the algorithm totally resets just because you switch accounts! I have to start all over again trying to take the trash out.

            • Rashnet · 1 point · 11 months ago

              I think something changed in the last few days overall with YT. I have been getting recommendations for videos I watched 4 or 5 years ago and they show as unwatched.