YouTube’s recommendation system exhibits left-leaning bias, new study suggests

YouTube’s recommendation algorithm tends to steer users away from extreme right-wing political content more quickly than it steers them away from extreme left-wing political content, according to new research published in PNAS Nexus.

“YouTube, with more than two billion monthly active users, significantly influences the realm of online political video consumption,” explained study author Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.

“In the United States, for example, a quarter of adults regularly engage with political content on the platform. Approximately 75% of YouTube video views result from its recommendation algorithm, highlighting the algorithm’s potential ability to foster echo chambers and disseminate extremist content, a matter of paramount concern. This motivated us to design an experiment to better understand how the algorithm influences what people see on the platform.”

The researchers conducted their study by employing 360 bots to simulate YouTube users. They created new Google and YouTube accounts for each bot to isolate the “personalization” process driven by YouTube’s recommendation algorithm.
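To make the setup concrete, here is a minimal Python sketch of such a bot population. The class list and the even split of the 360 bots across classes are illustrative assumptions; the article does not specify the study’s exact taxonomy or harness.

```python
from dataclasses import dataclass, field

# Illustrative political classes -- the study's exact taxonomy may differ.
POLITICAL_CLASSES = ["Far Left", "Left", "Center", "Right", "Far Right"]

@dataclass
class Bot:
    bot_id: int
    political_class: str
    # A fresh Google/YouTube account means an empty watch history.
    watch_history: list = field(default_factory=list)

# 360 bots, split evenly across the classes (the split is an assumption).
bots = [
    Bot(bot_id=i, political_class=POLITICAL_CLASSES[i % len(POLITICAL_CLASSES)])
    for i in range(360)
]
```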

Initially, the researchers collected the top 20 recommended videos from the YouTube homepage for new accounts without any watch history. They analyzed the distribution of recommended video categories and political classes, finding that “News & Politics” videos, when recommended, were primarily Center and Left-leaning.

Next, each user (bot) watched 30 videos matching its designated political group (e.g., Far Left, Left, etc.), with the recommended videos recorded after each watch. The analysis showed that recommendations largely aligned with the user’s political classification, though the speed at which recommendations adapted to the user’s preferences varied across classes.
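Continuing the sketch above, the watch-and-record loop for this stage might look like the following, where `get_videos_for_class`, `watch`, and `get_recommendations` are hypothetical stand-ins for the researchers’ scraping infrastructure, not functions described in the paper.

```python
def run_persona_stage(bot, get_videos_for_class, watch, get_recommendations,
                      n_videos=30):
    """Watch n_videos from the bot's own class, snapshotting the
    recommended videos after each watch."""
    snapshots = []
    for video in get_videos_for_class(bot.political_class)[:n_videos]:
        watch(bot, video)                  # simulate watching the video
        bot.watch_history.append(video)
        snapshots.append(get_recommendations(bot))  # record recommendations
    return snapshots
```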

Bots that completed the first stage then watched 30 videos from a different political classification, allowing the researchers to measure how quickly a bot could “escape” its original class and “enter” the new one. The results indicated an asymmetry in the escape speed, suggesting a skew toward left-leaning content in YouTube’s recommendations.
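The article does not give the authors’ formal definition of “escaping” a class; one plausible operationalization, assuming a simple majority-share threshold, is sketched below. Here `classify` is a hypothetical function mapping a video to a political class. Comparing this number for Far Left versus Far Right personas would expose the asymmetry the study reports.

```python
def escape_speed(snapshots, original_class, classify):
    """Number of videos watched before a majority of recommendations
    no longer belong to the bot's original political class."""
    for step, recommendations in enumerate(snapshots, start=1):
        labels = [classify(video) for video in recommendations]
        if labels.count(original_class) <= len(labels) / 2:
            return step  # escaped after this many watches
    return None  # never escaped within the stage
```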

Finally, each bot repeatedly watched the top recommended video on its homepage, with the recommendations recorded after each video. This stage explored transitions in recommendations and found that the algorithm tended to steer users back toward centrist content and away from political extremes.
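This stage amounts to a walk through the recommendation graph. One way to summarize such a walk, shown below with made-up data, is to tally class-to-class transitions; the drift the study describes would appear as transitions flowing inward from the extremes toward “Center.”

```python
from collections import Counter

def transition_counts(class_sequence):
    """Tally class-to-class transitions along a walk of watched videos."""
    return Counter(zip(class_sequence, class_sequence[1:]))

# Hypothetical walk: a Far Right persona drifting toward the center.
walk = ["Far Right", "Far Right", "Right", "Right", "Center", "Center"]
print(transition_counts(walk))
# Counter({('Far Right', 'Far Right'): 1, ('Far Right', 'Right'): 1,
#          ('Right', 'Right'): 1, ('Right', 'Center'): 1,
#          ('Center', 'Center'): 1})
```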

Overall, the findings indicated that YouTube’s recommendation algorithm exhibited a left-leaning bias in the distribution of recommendations, even when controlling for the number of videos in different political classes. The algorithm made it easier for users to enter left-leaning political personas and escape from right-leaning ones.
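Controlling for prevalence typically means comparing each class’s share of recommendations with its share of available videos. The calculation below is an illustrative example with invented numbers, not the authors’ reported figures or exact method.

```python
def recommendation_skew(rec_counts, catalog_counts):
    """Ratio of a class's share of recommendations to its share of the
    catalog; values above 1 mean the class is over-recommended."""
    total_recs = sum(rec_counts.values())
    total_catalog = sum(catalog_counts.values())
    return {
        cls: (rec_counts[cls] / total_recs)
             / (catalog_counts[cls] / total_catalog)
        for cls in rec_counts
    }

# Invented numbers, for illustration only.
print(recommendation_skew(
    rec_counts={"Left": 500, "Center": 400, "Right": 100},
    catalog_counts={"Left": 400, "Center": 400, "Right": 200},
))  # {'Left': 1.25, 'Center': 1.0, 'Right': 0.5}
```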

In other words, when a user starts watching Far-Right political content on YouTube, the recommendation algorithm is more effective at suggesting and promoting content that is less extreme, which could include more moderate right-wing content or even centrist content. As a result, users who initially engage with Far-Right content may find themselves exposed to a broader range of political perspectives relatively quickly.

On the other hand, when a user starts watching Far-Left political content on YouTube, the algorithm is somewhat slower in guiding them away from extreme left-wing content. It takes users more time to transition from Far-Left content to less extreme content compared to the Far-Right scenario.

“Our research highlights that YouTube’s recommendation algorithm exerts a moderating influence on users, drawing them away from political extremes. However, this influence is not evenly balanced; it’s more effective in steering users away from Far Right content than from Far Left content. Additionally, our findings reveal that the algorithm’s recommendations lean leftward, even in the absence of a user’s prior viewing history.”

The research demonstrates how YouTube’s recommendation algorithm can shape users’ political content consumption.

“Research on algorithmic bias has attracted significant attention in recent years, and a number of solutions have been proposed to expose and address any such biases in today’s systems. Given this, we were surprised to find that the recommendation algorithm of YouTube, one of the most popular platforms, still exhibits a left-leaning political bias. These findings prompt inquiries into the appropriateness of political biases in recommendation algorithms on social media platforms, and the significant societal and political consequences that may arise as a result.”

But the study, like all research, includes some caveats. The study was conducted over a specific period, and YouTube’s algorithms and content landscape may evolve over time. Thus, the findings may not fully capture the platform’s current state or its future developments.

Additionally, the research primarily focused on the U.S. political context. As a result, the findings may not be directly applicable to other countries or political landscapes. The dynamics of YouTube’s recommendation algorithm and its impact on political content consumption could vary significantly in different cultural and political settings.

“Our study focused primarily on U.S. politics, and more research is needed to determine the degree to which these findings hold outside of the United States.”

The study, “YouTube’s recommendation algorithm is left-leaning in the United States,” was authored by Hazem Ibrahim, Nouar AlDahoul, Sangjin Lee, Talal Rahwan, and Yasir Zaki.