Study sheds light on how deplatforming after the Jan. 6 riot impacted the larger online ecosystem

In a new study published in the Journal of Quantitative Description: Digital Media, researchers examined the effects of deplatforming on online discourse and alternative social media platforms. The study focused on the aftermath of the January 6 Capitol riot and analyzed the responses of users who expected to be deplatformed from mainstream social media sites such as Twitter and Facebook.

“In my prior work on content moderation and these kinds of interventions, heavy-handed bans can improve the specific platform, but how these interventions impact the larger ecosystem is an open question that needs an answer if we are going to improve the online information space,” said study author Cody Buntain, an assistant professor at the University of Maryland.

“It is possible that suppressing a particular audience segment might improve Twitter for the remaining population, which would be a good outcome, but large-scale de-platforming might increase polarization or push audiences to more extreme spaces where these audiences and their messages are more welcome.”

“If the latter is true, and de-platforming makes the overall information space worse, that finding has important implications for how we govern the broader information space, especially when we know these extreme spaces are generally quite hateful, racist, and toxic platforms.”

“Prior work in this space, by Eshwar Chandrasekharan et al., on Reddit shows such bans to be useful on the specific platform, but more recent work by Ribeiro et al. finds similar deplatforming doesn’t suppress negative behavior in the broader space.”

The study analyzed data from several sources to examine how users who expected to be deplatformed moved their communication to alternative spaces after the “Great Deplatforming.” The researchers used Brandwatch, Google Trends, and Meta’s CrowdTangle tool to describe time trends in social media users’ interest in alternative platforms.
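For readers curious how this kind of time-trend tracking can be done, the short Python sketch below pulls search-interest data for two of the platforms discussed in the article using pytrends, an unofficial Google Trends client. It is an illustration only, not the authors’ actual pipeline, and the keyword list, timeframe, and geography are assumptions chosen to mirror the article’s framing.

```python
# Illustrative sketch (not the study's own code): fetch Google Trends search
# interest for alternative platforms around January 6, 2021, via the
# unofficial pytrends client.
from pytrends.request import TrendReq

# Platforms whose interest trajectories the article discusses (assumed keywords)
keywords = ["Parler", "Gab"]

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(keywords, timeframe="2020-10-01 2021-03-01", geo="US")

# Weekly interest scores (0-100, normalized by Google) for each keyword
interest = pytrends.interest_over_time()
print(interest[keywords].head())

# A spike in the rows around 2021-01-06 would mirror the surge in interest
# in alternative platforms that the researchers describe.
```

A plot of this series would show the kind of pre-election growth and early-January spike the study reports; the researchers combined such search-interest trends with link-sharing data from Brandwatch and CrowdTangle.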

They found that interest in Parler grew around the time of the U.S. presidential election in November 2020, followed by a dramatic spike in January 2021. However, sharing of links to Parler virtually ceased once the Parler.com website was taken offline in the wake of the storming of the U.S. Capitol.

The study also found that the communities directly engaged in discussions supportive of the January 6 events responded to platform actions in strategic ways, providing signposts to where the discussion could be continued.

“One major takeaway for the average person is that, when you are on Twitter or Facebook or some mainstream platform, and you see someone sharing links to their profiles on other platforms, you should be hesitant to go join that space, especially if you are unfamiliar with it,” Buntain told PsyPost.

“In the political context, these non-mainstream spaces are often worse places than the mainstream platform. While this takeaway is not always true (e.g., the movement to Mastodon following Musk’s acquisition of Twitter), one should be careful in the platforms they visit following content moderation interventions.”

Many users who anticipated being deplatformed turned to Gab, which gained significantly more engagement across multiple spaces after January 6 than other alternative platforms. With the flood of new users, however, Gab became much more toxic, with hate speech rising well above the levels seen in previous months.

“I was surprised to see how consistent the increase in engagement with Gab, as a platform, was across the three ways we measured (Twitter, Reddit, and Google Trends),” Buntain said. “This result suggests a broad push to this platform after the more mainstream spaces made it clear voter-fraud-style discussions weren’t welcome.”

The study also found that hate speech, particularly anti-Black hate speech, spiked dramatically on Twitter in the week of January 6, 2021, compared with the month before. A similar pattern was observed on Reddit. While overall hate speech on these platforms eventually returned to baseline levels, many specific categories remained elevated on Twitter.

These findings suggest that deplatforming can have complex effects. In prior work, one of the authors argued that deplatforming can increase the prevalence and distribution of content produced by individuals who have been deplatformed. The current study’s results are consistent with a displacement-diffusion dynamic, in which deplatforming pushes users toward niche platforms, worsening the tone of content on those platforms while leaving expression on mainstream platforms largely unchanged.

“My research here should not be construed as saying that content moderation is wrong,” Buntain explained. “Rather, we need a better understanding of how and when content moderation and these kinds of interventions work and what their unintended consequences might be.”

“Relatedly, when YouTube announced they would remove potentially harmful content from recommendation but allow it to remain on the platform, engagement with these videos decreased on Twitter and Reddit (see my paper on this from 2021), which is likely a good outcome. That said, subtle ‘shadow banning’ can look similar to that de-recommendation and is problematic.”

The study, “Cross-Platform Reactions to the Post-January 6 Deplatforming,” was authored by Cody Buntain, Martin Innes, Tamar Mitts, and Jacob Shapiro.
