
Given the impact of digital services on the online and offline world, states, and in this case a supranational union with delegated powers, are increasingly seeking to regulate this domain. We live in an age in which Big Tech holds unprecedented power: the annual revenues of these giants exceed the annual budgets of many states. The DSA is the EU's first comprehensive and binding regulation of digital service providers in more than twenty years.
What is the Digital Services Act?
Although it purports to create “a safe online environment,” the DSA is among the most dangerous censorship regimes of the digital age.
The DSA is a legally binding regulatory framework that gives the European Commission authority to enforce “content moderation” on very large online platforms and search engines (those with at least 45 million average monthly active users) that are established, or offer their services, in the EU.
Most of its provisions came into force in February 2024. Platforms that fail to comply with the regulation face massive financial penalties and even suspension. As platforms comply with the DSA, individuals can suffer censorship, suspension from online platforms, and even criminal prosecution under national law.
The stated objective of the DSA is “ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights enshrined in the Charter [of Fundamental Rights of the EU] are effectively protected, and innovation is facilitated”.
The Commission claims that the DSA creates “legal certainty,” “greater democratic control,” and “mitigation of systemic risks, such as manipulation or disinformation”—but, in reality, it is an authoritarian censorship regime antithetical to democracy.
Why is the DSA an extreme threat to fundamental freedoms?
The DSA requires platforms to censor “illegal content,” which it broadly defines as anything not in compliance with EU law or the law of any Member State (Article 3(h)). This risks reducing speech across the whole EU to the lowest common denominator: the most restrictive national law could determine what is removed everywhere. Furthermore, authoritarian governments could adopt the blueprint, claiming that Western liberal states endorse it.
The DSA is deeply flawed. It is built on the idea that “bad speech” is best countered by censorship rather than robust discussion. Furthermore, the DSA gives the European Commission broad power over how platforms handle speech, which undermines the free expression essential to democratic societies.
If a censorship law such as the DSA is the “gold standard,” as the Commission has praised its own construct, authoritarian governments of the world will readily adopt the model.
Allowing “illegal content” to potentially be determined by one country’s vague and overreaching laws pits the DSA against international law standards that require any restrictions on speech to be precisely defined and necessary. This is extremely problematic given the increasing number of absurd so-called “hate speech” laws potentially criminalizing peaceful speech throughout Europe.
- Example 1: Germany’s highly controversial NetzDG Law, enacted in 2017, forces digital service providers to enforce sweeping online restrictions on certain kinds of content, linking to provisions of the criminal code and including the broad offence of “insult”. A person in Germany could see something “insulting” online that they claim is illegal under German law, file a complaint under the DSA, and trigger a take-down of the content for all countries in the EU, including countries where “insult” is not a criminal offense.
- Example 2: The DSA forces digital service providers to block specific people or messages, even those originating outside the EU, from reaching European audiences. A Latin American president says something that a German believes violates German law. Under the DSA, that speech could be blocked (“content moderated”) across all EU countries.
How does the DSA censor speech?
The DSA is at the heart of Europe’s censorship industrial complex, consisting of a number of interwoven regulations and codes that give an unaccountable bureaucracy broad power to censor speech. Censorship occurs through vast “content moderation” networks coupled with a powerful enforcement mechanism to force platforms to comply.

“Content Moderation”
The unelected and largely unaccountable Commission has positioned itself under the DSA to enable sweeping censorship in the name of “public safety” and “democracy”. It does this through a complicated mega-structure that allows the Commission to pull the strings of censorship, making private enterprises complicit and forcing them to comply under the threat of draconian fines.
The DSA creates a censorship industrial complex consisting of an expansive web of outsourced content flaggers, national coordinators, monitoring reporters, and other authorities, with the European Commission at its head. This is a business model dependent on finding content to censor and inconsistent with the standards of the rule of law.
The structure is intentionally unnavigable: a regular citizen cannot determine from it what speech is allowed. Because platforms bear the obligation to moderate content, the Commission can hide behind the DSA and claim that it is not itself censoring speech.
The DSA applies directly in all Member States without requiring national implementation. National regulators work within existing legal frameworks and create new structures to apply the DSA alongside domestic laws. In the event of a conflict, the DSA overrides national law.
Content is policed by so-called “trusted flaggers,” including NGOs and private entities, and may even include law enforcement agencies like Europol. This deputizes organizations with their own agendas to enforce censorship at scale.
This system of “flaggers” reports content they deem “illegal” to the platform, which must treat flagged content with priority. If the platform deems the content illegal, it must quickly remove it or disable access (for example, by geo-blocking it or hiding its visibility).
Very large platforms are also obligated to proactively prevent “illegal content” by conducting regular risk assessments of how their services may spread it. Under Article 34, the risks to be assessed include “negative effects on civic discourse and electoral processes, and public security” and “effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being”. Mitigation efforts include adapting their design, terms and conditions, algorithmic systems, advertising, and content moderation (including for “hate speech”), as well as awareness-raising measures.
Enforcement
A powerful enforcement mechanism ensures compliance. Under the threat of enormous financial penalties and suspension, digital service providers are forced to censor and potentially suspend individuals, and individuals may even be criminally prosecuted.
Penalties for Individual Users
- If, after content is flagged, the platform deems it illegal upon its own review, it must remove it or disable access and notify the user.
- If individuals persistently post “illegal content,” platforms can suspend their accounts (after issuing a warning, and only in a proportionate manner and for a reasonable period of time).
- Every Member State has a designated Digital Services Coordinator to enforce compliance with the DSA. The Coordinator can seek court orders ruling on the “illegal” nature of content on platforms and then fine and potentially suspend online platforms.
- If a user posts content that the platform suspects violates criminal law “involving a threat to the life or safety of a person or persons” (Article 18(1)), the platform is required to notify the police, triggering potential domestic prosecution.
- This could happen under one of the many over-broad “hate speech” criminal laws in Europe. If “hate speech” were subjectively determined to threaten the life or safety of a person or persons, even peaceful speech posing no real threat could be prosecuted (e.g., if, in the case of Päivi Räsänen, someone argued that her Bible post on Twitter endangered those who identify as LGBT).
Penalties for Platforms
- Platforms evaluate content under the threat of crippling fines with every incentive to censor and none to uphold free speech. They face little to no punishment for unjustly banning content and enormous penalties if they refuse to censor.
- If a platform refuses to remove or restrict access to “illegal content” after it has been flagged—especially by a “trusted flagger” or regulatory authority—the platform may face serious repercussions.
- Digital Services Coordinators have broad powers to investigate platforms, issue orders, impose fines, and escalate cases to the European Commission. When dealing with very large platforms, the Commission can override the Coordinators at any time, giving it direct control over censorship enforcement. For these platforms, the Commission has the same powers as the Coordinators but is not subject to the requirement of “independence” that binds the Coordinators (Article 50(2)).
- The Commission or national regulators can impose fines of up to 6% of a platform’s global annual turnover for non-compliance, which for the largest platforms amounts to billions of euros (see the illustration below). If non-compliance persists, platforms may face periodic penalty payments. Ultimately, regulators can restrict access to the platform within the EU or suspend its operations.
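To illustrate the scale of the 6% cap with purely hypothetical figures: a platform with a global annual turnover of €100 billion could face a fine of up to 6% × €100 billion = €6 billion for a single finding of non-compliance, with periodic penalty payments added on top if the violation persists.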
Enhanced Enforcement
- The planned “European Democracy Shield” will strengthen the DSA and impose even stricter regulations on online speech. Its stated aim is to protect the EU from foreign information manipulation and interference, particularly in the digital realm, focusing on the integrity of elections and political processes. Together with the DSA, it can be weaponized to target peaceful expression, further empowering unelected bureaucrats to censor.
- The DSA grants emergency powers that allow the European Commission to demand additional censorship measures from online platforms during times of crisis, without sufficiently precise definitions or limitations.
- Crisis is defined as “where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it” (Article 36(2)); “Such crises could result from armed conflicts or acts of terrorism, including emerging conflicts or acts of terrorism, natural disasters such as earthquakes and hurricanes, as well as from pandemics and other serious cross-border threats to public health” (para 91).
- The Commission may adopt a decision requiring very large platforms to take certain actions in response to the crisis: 1) assess how their services contribute to a serious threat, 2) apply measures to prevent, eliminate, or limit the threat, 3) report back to the Commission on those measures.
- The potential extraordinary measures it identifies are: “adapting content moderation processes and increasing the resources dedicated to content moderation, adapting terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness-raising measures and promoting trusted information and adapting the design of their online interfaces”. (para 91)
- In a worst-case scenario, the European Commission could crack down on speech at will whenever it decrees a crisis and force platforms to “mitigate risks”. This would prevent citizens from accessing information and sharing views, handing extraordinary power to bureaucrats to control narratives in times of upheaval.

Is there recourse for a censored individual or platform forced to comply with the DSA?
The DSA severely limits the power of national courts to protect citizens’ free speech rights; national courts become the long arm of the Commission’s censorship. International appeal is possible but costly and onerous.
Appeal Options for Individuals
A censored individual can try to appeal directly to the platform, use a certified out-of-court dispute resolution mechanism, or appeal to the Digital Services Coordinator. While the out-of-court dispute settlement bodies offer a relatively easy appeal option (a 5-euro submission fee for the individual), their decisions are not binding, and platforms are only required to engage in good faith. If a platform does not, the individual user is left with only more expensive and lengthy judicial recourse. Faced with that reality, many are likely simply to submit to censorship or preemptively self-censor.
Judicial Recourse
Individuals or the platform can technically challenge censorship in national courts, but the courts are required to comply with Commission decisions. Article 82 states: a “national court shall not take any decision which runs counter to that Commission decision. National courts shall also avoid taking decisions which could conflict with a decision contemplated by the Commission in proceedings”.
Individuals or platforms can take their cases to the Court of Justice of the European Union (CJEU), but this is a complex and costly process with strict requirements. The CJEU system takes 1-2 years for a ruling, sometimes longer, and rarely grants interim relief measures.
Is the DSA a problem only for Europe?
The DSA is a digital gag order with global consequences: it can censor you no matter where you live. Because the DSA applies to “Very Large Online Platforms” and search engines that are accessed within the EU but have a global presence, DSA censorship impacts the entire world.
Extraterritorial Applicability
The DSA explicitly states its extraterritorial applicability as it covers platforms used by people “that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services [the platforms] have their place of establishment”. (Article 2(1))
While the DSA states in Article 9(2)(b) that takedown orders should be “limited to what is strictly necessary to achieve its objective,” there remain grave extraterritorial concerns.
De Facto Global Censorship Standards
Platforms may be inclined to adapt their international content moderation policies to EU censorship. If platforms deem something “illegal” under EU rules, that content may be banned everywhere, even in countries with strong free speech protections.
In its letter to European Commissioner Henna Virkkunen, the U.S. House Judiciary Committee wrote: “Though nominally applicable to only EU speech, the DSA, as written, may limit or restrict Americans’ constitutionally protected speech in the United States. Companies that censor an insufficient amount of ‘misleading or deceptive’ speech—as defined by EU bureaucrats—face fines up to six percent of global revenue, which would amount to billions of dollars for many American companies. Furthermore, because many social media platforms generally maintain one set of content moderation policies that they apply globally, restrictive censorship laws like the DSA may set de facto global censorship standards.”
Europe in the Dark
Individuals outside of Europe could find themselves censored within Europe. This could happen even to a head of state or an individual with enormous international reach. In the worst case, blocking content from reaching the roughly 450 million inhabitants of the European Union could cut an entire continent out of the conversation, a draconian move with world-changing impact.
What is ADF International doing to challenge the DSA?
The DSA is irreconcilable with the human right to free speech. It must be repealed or substantially reformed to protect open discourse and fundamental freedoms in the EU and across the world. We cannot allow the DSA to become the global model for digital speech control.
ADF International is committed to challenging violations of free speech resulting from the DSA and building critical momentum to repeal or substantially reform this censorial framework. We are working to amend or strike down the parts of the DSA that undermine freedom of expression.
There is no disagreement that certain expression is illegal (e.g., child exploitation, incitement to terrorism), and every social media platform has a legal obligation to restrict such content. The DSA goes far beyond this: it has created a censorship mega-structure to ban “illegal content” without itself specifying what that content is. Over time, this mega-structure could censor speech that any person in any EU country considers “illegal” according to whatever law is either in force now or may be passed in the future. Behind the 100+ pages of complex legislation hides a blank cheque for censorship.
What can be done to challenge the DSA at the European level?
- Equip Member States to initiate an action for annulment before the CJEU – Articles 277 and 263 of the Treaty on the Functioning of the EU (TFEU): Grounds to invoke include the lack of competence of the Commission, an infringement of the Treaties and the EU Charter (free speech), and a misuse of powers. This could result in having the DSA or parts of it declared “inapplicable”.
- Mobilize Member States in the Council to repeal the DSA through a political decision: Repealing legislation once adopted is very difficult, and the procedure is similar to that for adopting the legislation. The Commission could initiate the repeal, but that appears politically unlikely. Instead, Member States in the Council can build a critical mass and take action.
- Preliminary reference procedure before the CJEU – Article 267 TFEU: In the course of national litigation, any party, or the judge ex officio, can raise a question of EU law, particularly on its interpretation. Such questions could include the conformity of the DSA (e.g., the definition of illegal content under Article 3(h) and the obligation to act against illegal content under Article 9(2)(b)) with Article 11 of the EU Charter (freedom of expression and information). The decision to submit a reference to the CJEU rests entirely with the national judge, except where the case is before a court of last instance and the interpretation of EU law is necessary to decide the legal question at issue, in which case a reference is mandatory.
- Engage in the DSA review process: According to Article 91 of the DSA, by 17 November 2025, the Commission shall evaluate and report to the European Parliament, the Council, and the European Economic and Social Committee. The scope of this first review is limited, and it will be followed by another review in 2027 and then every five years.