Facebook unveils new policy for fighting 'social harm'

Facebook has a new policy for fighting “co-ordinated social harm”.

The social media giant has introduced a new policy that allows it to shut down networks of accounts taking part in what it describes as “co-ordinated social harm”.

The co-ordinated social harm policy gives the company a framework for addressing harmful actions that come from legitimate Facebook accounts, unlike its existing policy on “co-ordinated inauthentic behaviour”, which targets only networks of fake accounts.

The company said the new policy would help it fight harmful behaviour that it could not otherwise fully address under its existing rules.

Facebook’s head of security policy, Nathaniel Gleicher, said in a conference call with reporters: “We are seeing groups that pose a risk of significant social harm, that also engage in violations on our platform, but don’t necessarily rise to the level for either of those where we’d enforce against for inauthenticity under CIB [co-ordinated inauthentic behaviour] or under our dangerous organisations policy.

“So this protocol is designed to capture these groups that are sort of in between spaces.”

Facebook said it could take “a range of actions” in enforcing its new rules around co-ordinated social harm, including banning accounts or putting limits on their reach to prevent their content from spreading across the platform.

During the press call, Gleicher said that the “work on this policy started well before January 6”, but noted recent events have informed the company’s decision making.

He added: “If you think about our enforcement against QAnon-related actors, if you think about our enforcement against ‘Stop the Steal,’ if you think about our enforcement against other groups - we learned from all of them.”

© BANG Media International