Author Topic: YouTube Algorithms Don't Turn Unsuspecting Masses Into Extremists, New Study Suggests


Offline Kamaji
YouTube Algorithms Don't Turn Unsuspecting Masses Into Extremists, New Study Suggests

A new study casts doubt on the most prominent theories about extremism-by-algorithm.

LIZ WOLFE
4.26.2022

"Over years of reporting on internet culture, I've heard countless versions of [this] story: an aimless young man—usually white, frequently interested in video games—visits YouTube looking for direction or distraction and is seduced by a community of far-right creators," wrote Kevin Roose for The New York Times back in 2019. "Some young men discover far-right videos by accident, while others seek them out. Some travel all the way to neo-Nazism, while others stop at milder forms of bigotry."

Never one to dial back alarmism, The Daily Beast put out a headline in 2018 calling YouTube's algorithm a "far-right radicalization factory" and claimed that an "unofficial network of fringe channels is pulling YouTubers down the rabbit hole of extremism." Even MIT Technology Review sounded the alarm in 2020 about how "YouTube's algorithm seems to be funneling people to alt-right videos."

A new study by City University of New York's Annie Y. Chen, Dartmouth's Brendan Nyhan, University of Exeter's Jason Reifler, Stanford's Ronald E. Robertson, and Northeastern's Christo Wilson complicates these popular narratives. "Using paired behavioral and survey data provided by participants recruited from a representative sample (n=1,181), we show that exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment," write the researchers. "These viewers typically subscribe to these channels (causing YouTube to recommend their videos more often) and often follow external links to them. Contrary to the 'rabbit holes' narrative, non-subscribers are rarely recommended videos from alternative and extremist channels and seldom follow such recommendations when offered."

*  *  *

Basically, the narrative that hordes of unwitting YouTube browsers are suddenly stumbling across far-right extremist content and becoming entranced by it does not hold much water.

*  *  *

Source:  https://reason.com/2022/04/26/youtube-algorithms-dont-turn-unsuspecting-masses-into-extremists-new-study-suggests/