
Stanford Seminar - Algorithmic Extremism: Examining YouTube's Rabbit Hole of Radicalization

Mark Ledwich

January 8, 2020
YouTube's recommendation algorithm is frequently characterized by journalists and researchers as radicalizing users toward the far-right, but the evidence to date has been weak. We used data collected from the YouTube website to analyze the balance of recommendation impressions and determine whether the algorithm favors more extreme content. 768 US political channels were categorized into culturally relevant orientations and sub-cultures, and 23M recommendations for recent videos were recorded during November-December 2019. We found that the late-2019 recommendation algorithm actively discourages viewers from being presented with fringe content. The algorithm is shown to favor mainstream media and cable news content over independent YouTube channels, with a slant toward partisan political channels like Fox News and Last Week Tonight.
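The impression-balance analysis described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the category names and impression counts here are hypothetical, and the real dataset covers 768 channels and 23M recommendations.

```python
from collections import Counter

# Each record is (source_category, destination_category, impression_count).
# These categories and counts are hypothetical illustrations only.
recommendations = [
    ("mainstream_news", "mainstream_news", 120),
    ("mainstream_news", "fringe", 5),
    ("fringe", "mainstream_news", 60),
    ("fringe", "fringe", 15),
]

# Tally impressions received by and sent from each category.
received = Counter()
sent = Counter()
for src, dst, count in recommendations:
    sent[src] += count
    received[dst] += count

# A category is advantaged when it receives more impressions than it sends:
# a received/sent ratio above 1.0 means the algorithm directs net traffic
# toward that category, and below 1.0 means traffic is directed away.
for category in sorted(sent):
    ratio = received[category] / sent[category]
    print(f"{category}: received/sent = {ratio:.2f}")
```

With these toy numbers, "mainstream_news" receives more impressions than it sends (ratio above 1.0) while "fringe" receives fewer, mirroring the direction of the study's finding.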
