Academic: Social media needs rules | Opinion

Social media platforms have imploded with misinformation and hate. Linda Stamato makes a case for some sort of regulation and a restoration of trust in digital content for these platforms to truly function as public squares.

By Linda Stamato

“The rise of social media has unwittingly dissolved the mortar of trust, belief in institutions, and shared stories that had held a large and diverse secular democracy together.” ― Jonathan Haidt, The Atlantic, May 2022.

How is it possible, then, to imagine a social media platform functioning as the digital equivalent of the public square?

The platforms do provide spaces for people to achieve much that can be positive, but, with their destructive influence growing, social media platforms continue to fall far short as digital public squares.

Reforms and regulations are sorely needed. And they may get a prod from the U.S. Supreme Court as it appears poised to reconsider the rules governing online speech, not least those limiting social media’s liability.

Bringing attention to the need is none other than Elon Musk, as his acquisition of the major player Twitter implodes: content moderators have been fired or have abandoned ship, suspended accounts have been reinstated, and extremists and promoters of misinformation and conspiracy theories test Twitter’s boundaries.

The impact of the daily focus on Musk wreaks havoc beyond Twitter, however, creating an opportunity to take a serious look at social media platforms — all of them. It’s about time!

The YouTube subculture, for example, with its racist, anti-gay and anti-Semitic messaging, has radicalized far-right extremists around the globe. With millions of followers, high-profile extremists push their content to Twitter and other sites, reveling in anti-gay venom and vile anti-Semitic tropes.

And then there is Gab, the extremist, far-right social media platform that specializes in COVID misinformation while trafficking in white supremacy and various conspiracy theories that it does not even attempt to monitor.

No list of social media’s big players would be complete without Facebook, Instagram and Google. Sites such as Google’s Play Store offer space to voices that spread hate and disinformation that fuel, and are fed by, political discord.

False tweets by impersonators and “shares” of these messages can do damage to corporate brands, too (e.g., Eli Lilly). No wonder advertisers — including the pharmaceutical companies that previously plowed cash into Twitter — are in full flight from the platform.

Bad actors, including racists and everyday hate-mongers, have virtually free rein on these platforms to post and forward their bile. The malevolent attacks on LGBTQ communities, moreover, reveal the concentration of “dark world ideology” in the spaces where those inclined to commit acts of violence dwell, troll and lure others.

Jonathan Haidt, the noted social psychologist, and others have captured attention as they issued research reports and observations on the dangers of social media. (A valuable collaborative review continues to provide access to additional research and multiple criticisms as well.)

Conclusions, in short, are these: Social media platforms amplify political polarization, foment populism — especially right-wing populism — and are associated with the spread of misinformation and acts connected to violence.

The impact of social media use on the mental health of its users, particularly adolescents and young adults, is striking. Researchers were able to establish a significant link, for example, between the presence of Facebook and the deterioration in mental health among college students.

As more people come to believe the world is full of fake news, moreover, trust, an essential element in successful democracies, wanes. Other critical elements of our national well-being are unraveling as well: faith in the nation’s institutions; support for pluralistic values, now threatened by anti-immigrant vitriol and conspiracy theories; history, revised to privilege some citizens and damage others; and belief in the benefit of diversity to the nation’s social and economic good. America itself is under siege by enemies within.

The struggle of print newspapers to attract and retain readers as their presence shrinks and the increasing fragmentation of news in our digital era make matters worse.

Given the scope and quality of the research on social media platforms that we have now — the distortions of truth that anonymous voices post, forward, tweet and re-tweet to maximize the damage — and acknowledging the fact that the platforms are largely unconstrained by commitments to protect the public interest, we can’t fail to act.

So what do we do? Regulating state by state, nation by nation, when the challenges facing us are national and global, might seem to make little sense, but regulation within smaller units can have broad impact. Consider California’s vehicle-emission requirements: because they were stricter than any other state or federal standard, they prompted automakers to produce vehicles that met the higher standard.

A proposal, in Regulatory Review, to combine government regulation with pressure on companies to self-regulate is a sound and savvy approach that includes both incentives and liabilities:

  • Vest enforcement power over the “terms of service” of each social media company in the FCC, the FTC and the SEC, and force adherence through government litigation that involves fines not only against the companies but also against their CEOs, other executives and members of their boards of directors.

  • Revise Section 230 of the Communications Decency Act of 1996 to hold the online platforms accountable for profits they earn from deliberately disseminating harmful content, making companies open to civil lawsuits or specific FCC, FTC, SEC, or Justice Department actions that force them to identify profits gained from advertisements tied to these viral posts, generating fines that can’t be easily absorbed by the wealthy companies.
  • Regulate to incentivize self-regulation. Fears of intrusive government regulation can prompt companies to regulate their own operations as, for example, the threat of government intervention prompted coalitions of movie and video game firms to establish content standards and codes of conduct.

Public Interest Technology — technology used to serve the public good — given its scope, content and funding, could prove consequential. The Ford Foundation, along with partners, is making a substantial investment in this work; it aims to deliver an impact on society that parallels, if not exceeds, that of public interest law. Public interest technology could reduce the damage generated by social media platforms and create viable alternatives.

In conclusion, we need virtual platforms to function more like physical public squares. When citizens gather on social media to identify social problems and to influence political action, the platforms can become a vital part of democracy.

Society will benefit, moreover, from digital public space when we manage, with the cooperation of social media, to harness the technology to serve the constructive ends of institutions, agencies, organizations, governments and individuals. We may come to see, then, what society and the platforms also require: user trust in digital content.

Linda Stamato is a senior policy fellow at the Edward J. Bloustein School of Planning and Public Policy at Rutgers University in New Brunswick.

Here’s how to submit an op-ed or Letter to the Editor. Follow us on Twitter @NJ_Opinion and on Facebook at Opinion.

© Advance Local Media LLC.