TL;DR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, changing their location to a random US city each time.

Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (with either AI Jesus talking to you, or an AI narrator narrating verses from the Bible). After this, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was in the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed this in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came a short where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He was going on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and therefore rank higher in the algorithm. They say the algorithm isn’t necessarily left wing or right wing, but that alt-right creators have better understood how to capture and grow an audience on the platform.
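
A minimal sketch of what that hypothesis looks like in code, assuming nothing about YouTube’s real system: the fields, weights, and example titles below are all invented for illustration. The only point is that a ranker optimizing raw engagement has no political dial at all; whichever content provokes the strongest reactions wins.

```python
# Hypothetical engagement-weighted ranker, purely to illustrate the hypothesis
# above. Field names and weights are made up; this is not YouTube's algorithm.
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    avg_watch_seconds: float   # how long viewers stick around
    likes_per_view: float
    comments_per_view: float   # heated comment sections count as engagement too

def engagement_score(s: Short) -> float:
    # The ranker never sees "left" or "right", only engagement signals,
    # which emotionally charged content tends to maximize.
    return 0.6 * s.avg_watch_seconds + 25 * s.likes_per_view + 40 * s.comments_per_view

def rank(feed: list[Short]) -> list[Short]:
    return sorted(feed, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Short("woodworking tips", 22.0, 0.04, 0.002),
        Short("outrage bait", 31.0, 0.06, 0.015),
    ]
    print([s.title for s in rank(feed)])  # "outrage bait" ranks first
```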

  • sugar_in_your_tea@sh.itjust.works · 7 hours ago

    Yeah, I don’t think I’ve ever seen alt-right nonsense without actively looking for it. Occasionally I’ll get recommended some Joe Rogan or Ben Shapiro nonsense, but that’s about it.

    I consider myself libertarian and a lot of my watch time is on Mental Outlaw (cybersecurity and dark web stuff), Reason (love Remy and Andrew Heaton videos), and John Stossel, but other than that, I largely avoid political channels. I watch a fair amount of gun content as well.

    If I get recommended political stuff, it’s usually pretty mainstream news entertainment, like CNN or Fox News. Even the crypto nonsense is pretty rare, even though I’m pretty crypto-positive (not interested in speculation though, just its use as a currency and the technical details).

    If you’re seeing alt-right crap, it’s probably because you’ve watched a lot of other alt-right crap.

    • Captain Aggravated@sh.itjust.works · 4 hours ago

      My watch history would peg me as NOT a Republican. YouTube’s Shorts feed will serve me

      • excerpt from a YouTuber’s longer video
      • TikTok repost from like, the truck astrology guy or “rate yer hack, here we go” guy, etc.
      • Artificial voice reading something scraped from Reddit with Sewer Jump or Minecraft playing in the background
      • Chris Boden
      • Clip from The West Wing
      • Clip from Top Gear or Jeremy Clarkson’s Farm
      • “And that’s why the Bible tells us that Jesus wants you to hate filthy fucking liberals.”

      “Do not recommend channel.” “The downvote button doesn’t even seem to be a button anymore but I clicked it anyway.” “Report video for misinformation and/or supporting terrorism.” But the algorithm keeps churning it up.

      • AngryRobot@lemmy.world · 3 hours ago

        Guy you replied to is trying to pretend his individual experience is representative of the whole.

        • Captain Aggravated@sh.itjust.works · 2 hours ago

          I’m not sure there is a “representative of the whole” here; I think the YouTube algorithm is modal.

          I think it’s an evolution of the old spam bots: if you had an email address that in any way indicated you were male, you’d get “v1agra” and “c1alis” ads nonstop, and I’m sure a woman’s inbox would get makeup and breast-enlargement spam or some shit, whatever they can make you feel insecure enough to buy.

    • gdog05@lemmy.world · 5 hours ago

      I have had the opposite experience. I watch a few left-leaning commentary channels: Sam Seder, my boy Jesse Dollomore. If I watch a single video about guns (with no apparent ideological divide), within a single refresh I’m getting Shapiro and Jordan Peterson videos. I’m in a red Western state. My subscriptions are mostly mental health, tech, and woodworking. I have to delete my history if I stray even a little bit.