EU Commission scrutinizes big platforms over election disinformation


The European Commission on Tuesday published guidelines for how it expects large online platforms like X, formerly Twitter, and Meta to tackle disinformation during elections this year.

The European Union's Digital Services Act (DSA) requires large platforms to manage the risk of their services harming the electoral process, among other risks.

The DSA allows the commission to fine platforms up to 6% of their global revenue if they fail to address such risks adequately.

Elections to the European Parliament are due to be held in June, and national elections are scheduled this year in Slovakia, Lithuania, Belgium, Croatia, Austria and Romania.

The new guidelines say platforms should pay particular attention to the use of artificial intelligence (AI) to create fake content.

They call on platforms to set up in-house teams to monitor local risks and to tailor their efforts to each specific election and country.

Although the DSA requires that platforms mitigate the risk of "negative effects" on elections, it does not say they have to remove disinformation.

Neither do the new guidelines. Instead, they recommend softer measures, such as "prompts and nudges urging users to read content and evaluate its accuracy and source before sharing it."

Most large platforms have signed a voluntary code of practice on disinformation, which emphasizes ensuring their systems don't promote misleading content or reward it with advertising revenue.

X withdrew from the code in May last year. The commission opened an investigation into X in December for possible breaches of the DSA.

The DSA's risk mitigation rules apply to platforms with more than 45 million monthly users in the EU and took effect in August 2023.


© Deutsche Presse-Agentur GmbH