
Most comments deleted from social media platforms in Germany, France and Sweden are lawful speech – why this should raise concerns about freedom of expression online

In the era of ubiquitous social media, the power to shape public discourse lies in the hands of a few digital giants. But recent European laws aimed at curbing “torrents of hate” online could stifle free speech. As policymakers tout these measures as necessary for a safer Internet, a critical question arises: Is legally permissible speech being removed from social media platforms?

A growing regulatory network in Europe: Digital Services Act and NetzDG

European digital laws, in particular the German Network Enforcement Act (NetzDG) and the EU Digital Services Act (DSA), were developed to tackle the spread of illegal content online. The NetzDG, passed in 2017, required large social media platforms to remove manifestly illegal content, such as defamation and hate speech, within 24 hours of notification or face heavy fines. The Act is now being repealed and replaced by the DSA.

In 2018, French President Emmanuel Macron warned against “torrents of hate flowing over the Internet.” EU Internal Market Commissioner Thierry Breton, the bloc’s chief digital enforcer, declared in 2020 that “the Internet cannot remain the Wild West.” The DSA, which became fully effective in February 2024, aims to ensure a “safe, predictable and trustworthy online environment.” In 2023, both Breton and Macron raised the possibility of using the DSA to shut down social media platforms during periods of social unrest. Fortunately, civil society organizations quickly rejected this suggestion.

Just this month, European Commission President Ursula von der Leyen warned that the “foundational tenets of our democracy” were at risk as she revealed plans to establish a “European Democracy Shield” to counter disinformation and foreign interference online. This would undoubtedly expand the DSA’s powers to regulate broader forms of speech online.

The prospect of the DSA becoming a tool for broader regulation of online speech, up to and including wholesale platform shutdowns, makes it imperative that civil society critically examine the rationale behind these regulations and their impact on online discourse.

Content takedown study in France, Germany and Sweden

In a new report published by The Future of Free Speech, we sought to determine whether the basic premise of these laws is true: are social media platforms overrun with illegal content? And if so, how do platforms and users moderate this content in response to existing digital regulations?

According to our report, a staggering majority of the content removed from platforms like Facebook and YouTube in France, Germany and Sweden was legally permissible. Our study examined deleted comments from 60 of the largest Facebook pages and YouTube channels across the three countries and found that, depending on the platform and country, between 87.5% and 99.7% of the deleted content was legal.

Although the DSA was not in full force during the period of our analysis, every country we examined had laws in place defining illegal speech, and Germany’s NetzDG had the greatest impact on content moderation: in Germany, 99.7% of deleted comments on Facebook and 98.9% on YouTube were legally permissible. This suggests that platforms are being overly cautious, likely because of the high penalties imposed for non-compliance.

While the study was unable to determine whether deleted comments were removed by platforms, channel administrators, or users, reports published by Meta suggest that a high percentage of content moderation occurs before any user reports the content. In its Community Standards Enforcement report covering January to March 2024, Meta reported that of the hate speech content it took action on, almost 95% was detected by the company itself and just 5% was reported by users.

If companies, pages and channels overcorrect in response to sweeping digital regulations, removing legally permitted content in order to comply and avoid steep fines, the consequences for free speech online could be serious.

The cost of overzealous moderation

The consequences of this over-removal are far-reaching. Removing legally permissible speech to comply with digital regulations undermines the fundamental right to freedom of expression and erodes public trust in social media platforms as spaces for open discussion. Consider general expressions of opinion, such as support for a controversial candidate: these statements contained no verbal attacks, hate speech or illegal content, and they often violated no platform rules or community standards. Yet we found that over 56% of deleted comments fell into this category. This points to a disturbing trend in which platforms sacrifice free speech on the altar of excessive caution.

Excessive moderation may stem from attempts to avoid steep penalties under current laws, from the steadily expanding scope of platforms’ own hate speech policies, or from cultural pressure exerted by civil society and the media. Platforms may also instinctively adopt strict moderation policies to protect their reputations or to avoid being associated with controversial content.

Whether content is removed by page administrators, channel owners, or the platforms themselves, our findings show that these pressures have a chilling effect, reducing the diversity of viewpoints necessary for a vibrant democratic society.

To justify drastic online regulations like the DSA, European policymakers claim that social media platforms are flooded with illegal hate speech. Our report provides empirical evidence that only a small percentage of the comments removed from platforms are actually illegal. These findings highlight the urgent need for digital regulations and content moderation policies to place far greater emphasis on protecting freedom of expression and access to information.

Respecting the fundamental right to freedom of expression

Policymakers and content moderators must recognize that overly stringent regulations can backfire, stifling legal speech and undermining the principles of free expression. Such restrictive policies can also silence the very minority voices they are intended to protect.

Instead, we must create a digital environment in which diverse perspectives can coexist and truly harmful content is moderated without restricting legitimate political discussion, even when it contains controversial or offensive ideas. Regulators and platforms should refine and narrow their content removal criteria so that they target genuinely harmful content more precisely.

To its credit, the DSA requires platforms to provide clear content moderation guidelines and more robust appeals procedures for users. This could help alleviate the chilling effect and restore public trust in social media as a space for free and open dialogue. European policymakers should reassess the impact of existing digital regulations and watch for the unintended consequences that may emerge now that the DSA is fully in force.

The findings of this report are a stark reminder that, in moving toward a safer online environment, we must not lose sight of the fundamental human right to freedom of expression. While platforms are not directly bound by international human rights law, policymakers, platforms and civil society should ensure that content moderation policies protect users without silencing the voices that make our democracies strong.