How the EU Digital Services Act (DSA) Affects Online Free Speech in 2025

The EU Digital Services Act (DSA) and its impact on online free speech in 2025.

Nicknamed the ‘Digital Surveillance Act’, the EU’s key online platform legislation hits its one-year mark in February 2025

Dr. Adina Portaru

Senior Counsel, Europe, ADF International


The EU Digital Services Act (DSA), which took effect last February, has been hailed as a landmark law designed to bring order to the digital world. Yet beneath the surface of its stated aim of protecting democracy lies a framework fraught with overreach, ambiguity, and the erosion of fundamental freedoms.

The EU Commission claims that the Digital Services Act is needed to “protect democracy” by tackling so-called “misinformation”, “disinformation” and “hate speech” online. It promises to create a safer online space by holding digital platforms—particularly “Very Large Online Platforms” (VLOPs) such as Google, Amazon, Meta and X—accountable for removing such content.

However, its implementation raises grave concerns. By mandating the removal of broadly defined “harmful” content, this legislation sets the stage for widespread censorship, curtailing lawful and truthful speech under the guise of compliance and safety. The result will be a sanitized and tightly controlled internet where the free exchange of ideas is stifled.

Ultimately, the EU Digital Services Act will allow the silencing of views online that are disfavoured by those in power.

Freedom of speech is the cornerstone of a democratic society and includes the right to voice unpopular or controversial opinions. For this reason, ADF International is committed to ensuring that the right to freedom of speech is firmly upheld.

The Implications for Free Speech

The Digital Services Act’s regulatory framework has profound implications for free speech. 

Under the DSA, tech platforms must act against “illegal content”, removing or blocking access to such material within a certain timeframe. However, the definition of “illegal content” is notably broad, encompassing vague terms like “hate speech”—a major part of the DSA’s focus.

The DSA relies on the EU Framework Decision of 28 November 2008, which defines “hate speech” as incitement to violence or hatred against a protected group of persons or a member of such a group. This circular definition of “hate speech” as incitement to hatred is problematic because it fails to specify what “hate” entails. 

Due to their vague and subjective nature, “hate speech” laws lead to inconsistent interpretation and enforcement, relying more on individual perception rather than clear, objective harm. Furthermore, the lack of a uniform definition at the EU level means that what is considered “illegal” in one country might be legal in another.

Given all this, tech platforms face the impossible task of enforcing uniform standards across the EU.

The effects of the DSA will not be confined to Europe. There are legitimate worries that the DSA could censor the speech of citizens worldwide, as tech companies may impose stricter content regulations globally to comply with European requirements.

Big Tech Platforms

Tech platforms aren’t just removing clear violations—they’ve also started removing speech that could be flagged as “harmful”. If you post a political opinion or share a tweet that some might find offensive, it might get flagged by an algorithm. To avoid massive fines or penalties, platforms will err on the side of caution and remove your post, even if it’s perfectly lawful.

Platforms rely on automated tools to detect and remove “harmful” information. These tools are widely known to be inaccurate, often fail to consider context, and therefore flag important and legal content. And if it’s not the algorithms that flag your content, it may be regular users who disagree with what you’re saying.

Alleged “Hate Speech” Case

There are many instances in which “hate speech” laws have targeted individuals for peacefully expressing their views online, even before the DSA came into effect. ADF International is supporting the legal defence of Päivi Räsänen, a Finnish Parliamentarian and grandmother of 12, who stands criminally charged for “hate speech”.

Päivi shared her faith-based views on marriage and sexual ethics in a 2019 tweet, a radio show, and in a 2004 pamphlet that she wrote for her church, centred on the Biblical text “male and female he created them”.

Päivi endured two trials and years of public scrutiny before she was unanimously acquitted of “hate speech” charges by both the Helsinki District Court and the Court of Appeal. Despite her acquittal, the state prosecutor has appealed the case, taking it to the Finnish Supreme Court.

It’s obvious that these laws aren’t only about combatting hate and violence; rather, they may target any speech deemed controversial or that challenges the status quo.


Penalties for Non-Compliance with the EU Digital Services Act

The penalties for failing to comply with the EU Digital Services Act are severe.

Non-compliant platforms with more than 45 million monthly active users in the EU could be fined up to 6% of their global annual turnover. For tech platforms like Google, Amazon, Meta, and X, this means billions of euros. So, even the biggest tech companies can’t afford to fall short of the DSA regulations.

If a platform repeatedly fails to comply with the DSA, the EU Commission can impose a temporary or permanent ban, which could result in the platform’s exclusion from the EU market entirely. For platforms that rely heavily on this market, this would mean losing access to one of the world’s largest digital markets.

The risks are high, and tech platforms will scramble to ensure they comply—sometimes at the expense of your fundamental right to free speech.

“Shadow Content Banning”

In the digital age, we rely increasingly on digital technology to impart and receive information. And it’s essential that the free flow of information is not controlled by unaccountable gatekeepers policing what can and cannot be said.

ADF International’s stance is clear: this legislation will result in dangerous overreach that threatens the very freedoms it claims to protect.

In January, our legal team attended a plenary session and debate at the EU Parliament in Strasbourg regarding the enforcement of the DSA. The discussion brought to light significant concerns across the political spectrum about how the DSA may impact freedom of speech and expression, and rightfully so.


Several members of the EU Parliament (MEPs), who initially favoured the legislation, raised serious objections to the DSA, with some calling for its revision or annulment. A significant point of contention was the potential for what they termed “shadow content banning”—removing content without adequate transparency.

This includes cases where users might be unaware of why their content was banned, on what legal basis, or how they can appeal such decisions. Most of the time, they’re left with nothing but a generic AI response and no explanation. 

Some MEPs, like French MEP Virginie Joron, referred to the DSA as the “Digital Surveillance Act”.

Despite intense opposition, the EU Commission representative and the Council of the EU representative promised to enforce the DSA more rigorously. They vowed to double down on content policing through more thorough fact-checking and anti-“hate speech” enforcement, “so that ‘hate speech’ is flagged and assessed [within] 24 hours and removed when necessary”.

They failed to provide comprehensive responses to the concerns raised about the DSA’s potential to erode fundamental rights, leaving critical questions about its implementation and implications unresolved.

Conclusion: EU Digital Services Act or “Digital Surveillance Act”?

The EU Digital Services Act’s enforcement mechanisms are riddled with ambiguity. Terms like “misinformation,” “disinformation,” and “hate speech” are too broad and vague to serve as a proper basis for silencing speech. These terms are too easily weaponized, enabling those in power to police dialogue and suppress dissent in the name of safety.

By placing excessive pressure on platforms to moderate content, the DSA risks creating an internet governed by fear—fear of fines, fear of bans, and fear of expressing one’s views. If the DSA is allowed to stifle open dialogue and suppress legitimate debate, it will undermine the very democratic principles it claims to protect.

Policymakers must revisit this legislation, ensuring that efforts to regulate the digital sphere do not come at the cost of fundamental freedoms.

Europe’s commitment to freedom of speech demands better. Through our office in Brussels, we at ADF International are challenging this legislation because it’s not up to governments or unaccountable bureaucrats to impose a narrow view of acceptable speech on society.