YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories, and otherwise problematic content.

  • dexa_scantron@lemmy.world · 7 months ago

    You might be in a different test group. They always have a few different groups with different settings to check how well the algorithm is meeting their goals. That’s how they know they make less money if they don’t radicalize people.
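
    For anyone wondering what “a different test group” means in practice: platforms typically bucket users deterministically, e.g. by hashing the user ID together with an experiment name, so the same person always lands in the same variant while different experiments split the population independently. A minimal sketch of that idea follows; the function and group names are made up for illustration, not YouTube’s actual setup.

    ```python
    import hashlib

    def assign_test_group(user_id: str, experiment: str, groups: list[str]) -> str:
        """Deterministically bucket a user into one experiment group.

        Hashing the user ID together with the experiment name means a given
        user always gets the same group for that experiment, while separate
        experiments divide the population independently of each other.
        """
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        bucket = int(digest, 16) % len(groups)
        return groups[bucket]

    # Hypothetical example: different users may get different recommendation settings.
    groups = ["control", "more_exploration", "engagement_tuned"]
    print(assign_test_group("user_123", "rec_algo_test", groups))
    print(assign_test_group("user_456", "rec_algo_test", groups))
    ```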