Unpacking the EU Digital Services Act

Given the impact of digital services on the online and offline world, states, or, in this case, a supranational union with delegated powers, are increasingly seeking to regulate this domain. We live in an age where Big Tech holds unprecedented power—the annual revenue of these giants exceeds the annual budgets of many states. The DSA is the EU’s first comprehensive and binding regulation of digital service providers in more than twenty years.

What is the Digital Services Act?

Although it purports to create “a safe online environment,” the DSA is among the most dangerous censorship regimes of the digital age.

The DSA is a legally binding regulatory framework that gives the European Commission authority to enforce “content moderation” on very large online platforms and search engines (those with more than 45 million average monthly active users in the EU) that are established, or offer their services, in the EU.

Most of its provisions became applicable in February 2024. Platforms that fail to comply with the regulation face massive financial penalties and even suspension. As platforms comply with the DSA, individuals can suffer censorship, suspension from online platforms, and even criminal prosecution under national law.

The stated objective of the DSA is “ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights enshrined in the Charter [of Fundamental Rights of the EU] are effectively protected, and innovation is facilitated”.

The Commission claims that the DSA creates “legal certainty,” “greater democratic control,” and “mitigation of systemic risks, such as manipulation or disinformation”—but, in reality, it is an authoritarian censorship regime antithetical to democracy.

Why is the DSA an extreme threat to fundamental freedoms?

The DSA requires platforms to censor “illegal content,” which it broadly defines as anything that is not in compliance with EU law or the law of any Member State (Article 3(h)). This could result in the lowest common denominator for censorship across the whole EU. Furthermore, authoritarian governments could adopt the blueprint, claiming that Western liberal states endorse it.

The DSA is deeply flawed. It is built on the idea that “bad speech” is best countered by censorship rather than robust discussion. Furthermore, the DSA gives the European Commission broad power over how platforms handle speech, which undermines the free expression essential to democratic societies.

If a censorship law such as the DSA is the “gold standard,” as the Commission has praised its own construct, authoritarian governments of the world will readily adopt the model.

Allowing “illegal content” to potentially be determined by one country’s vague and overreaching laws pits the DSA against international law standards that require any restrictions on speech to be precisely defined and necessary. This is extremely problematic given the increasing number of absurd so-called “hate speech” laws potentially criminalizing peaceful speech throughout Europe.

  • Example 1: Germany’s highly controversial NetzDG Law, enacted in 2017, forces digital service providers to impose sweeping online restrictions on certain kinds of content, linking to provisions of the criminal code and including the broad offence of “insult”. A person in Germany could see something “insulting” online that they claim is illegal under German law, file a complaint under the DSA, and trigger a take-down of the content for all countries in the EU, including countries where “insult” is not a criminal offence.

  • Example 2: The DSA forces digital service providers to block specific people or messages, even those originating outside the EU, from reaching European audiences. A Latin American president says something that a German believes violates German law. Under the DSA, that speech could be blocked (“content moderated”) across all EU countries.

How does the DSA censor speech?

The DSA is at the heart of Europe’s censorship industrial complex, consisting of a number of interwoven regulations and codes that give an unaccountable bureaucracy broad power to censor speech. Censorship occurs through vast “content moderation” networks coupled with a powerful enforcement mechanism to force platforms to comply.

“Content Moderation”

The unelected and largely unaccountable Commission has positioned itself under the DSA to enable sweeping censorship in the name of “public safety” and “democracy”. It does this through a complicated mega-structure that allows the Commission to pull the strings of censorship, making private enterprises complicit and forcing them to comply under the threat of draconian fines.

The DSA creates a censorship industrial complex consisting of an expansive web of outsourced content flaggers, national coordinators, monitoring reporters, and other authorities, with the European Commission at its head. This is a business model dependent on finding content to censor and inconsistent with the standards of the rule of law.

The structure is intentionally unnavigable, making it nearly impossible for the ordinary citizen to determine what speech is allowable. Because platforms bear the obligation to moderate content, the Commission can hide behind the DSA and claim that it is not itself censoring speech.

The DSA applies directly in all Member States without requiring national implementation. National regulators work within existing legal frameworks and create new structures to apply the DSA alongside domestic laws. In the event of a conflict, the DSA overrides national law.

Content is policed by so-called “trusted flaggers,” including NGOs and private entities, and may even include law enforcement agencies like Europol. This deputizes organizations with their own agendas to enforce censorship at scale.

These “flaggers” report content they deem “illegal” to the platform. The platform must give priority to flagged content. If the platform deems the content illegal, it must quickly remove it or disable access (by geo-blocking or hiding its visibility).

Very large platforms are also obligated to proactively prevent “illegal content” by conducting regular risk assessments to identify how their services may spread it. Under Article 34, these risks include “negative effects on civic discourse and electoral processes, and public security” and “effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being”. The required mitigation efforts include adapting their design, terms and conditions, algorithmic systems, advertising, and content moderation (including for “hate speech”), as well as awareness-raising measures.

Enforcement

A powerful enforcement mechanism ensures compliance. Under the threat of enormous financial penalties and suspension, digital service providers are forced to censor and potentially suspend individuals, and individuals may even be criminally prosecuted.

Penalties for Individual Users:

  • If, after content is flagged, the platform deems it illegal following its own review, it must remove the content or disable access and notify the account holder.

  • If individuals persistently post “illegal content,” platforms can suspend their accounts (after issuing a warning, and only for a reasonable period of time and in a proportionate manner).

  • Every Member State has a designated Digital Services Coordinator to enforce compliance with the DSA. The Coordinator can seek court orders to rule on the “illegal” nature of content on platforms and then fine and potentially suspend online platforms. If a user posts content that the platform suspects violates criminal laws “involving a threat to the life or safety of a person or persons” (Article 18(1)), the platform is required to notify the police, triggering potential domestic prosecution.

    • This could happen under one of the many over-broad “hate speech” criminal laws in Europe. If the “hate speech” was subjectively determined to threaten the life or safety of a person or persons, it is possible that even peaceful speech posing no real threat could be prosecuted (e.g., if, in the case of Päivi Räsänen, someone argued that her Bible post on Twitter endangered those who identify as LGBT).

Penalties for Platforms

  • Platforms evaluate content under the threat of crippling fines with every incentive to censor and none to uphold free speech. They face little to no punishment for unjustly banning content and enormous penalties if they refuse to censor.

  • If a platform refuses to remove or restrict access to “illegal content” after it has been flagged—especially by a “trusted flagger” or regulatory authority—the platform may face serious repercussions.

  • The Digital Services Coordinators have broad powers to investigate platforms, issue orders, impose fines, and escalate cases to the European Commission. When dealing with very large platforms, the Commission can override the Coordinators at any time, giving it direct control over censorship enforcement. For these platforms, the Commission has the same powers as the Coordinators but is not subject to the requirement of “independence” that binds the Coordinators (Article 50(2)).

  • The Commission or national regulators can impose fines of up to 6% of the platform’s global annual turnover for non-compliance, which for the largest platforms amounts to billions. If non-compliance persists, platforms may face periodic penalty payments. Finally, regulators can restrict access to the platform within the EU or suspend its operations.

Enhanced Enforcement

  • The planned “European Democracy Shield” will strengthen the DSA and impose even stricter regulations on online speech. Its stated aim is to protect the EU from foreign information manipulation and interference, particularly in the digital realm, focusing on the integrity of elections and political processes. Together with the DSA, it can be weaponized to target peaceful expression, further empowering unelected bureaucrats to censor.

  • The DSA grants emergency powers that allow the European Commission to demand additional censorship measures from online platforms during times of crisis, without sufficiently precise definitions or limitations.

    • Crisis is defined as “where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it” (Article 36(2)); “Such crises could result from armed conflicts or acts of terrorism, including emerging conflicts or acts of terrorism, natural disasters such as earthquakes and hurricanes, as well as from pandemics and other serious cross-border threats to public health” (para 91).

    • The Commission may adopt a decision requiring very large platforms to take certain actions in response to the crisis: 1) assess how their services contribute to a serious threat, 2) apply measures to prevent, eliminate, or limit the threat, 3) report back to the Commission on those measures.

    • The potential extraordinary measures it identifies are: “adapting content moderation processes and increasing the resources dedicated to content moderation, adapting terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness-raising measures and promoting trusted information and adapting the design of their online interfaces”. (para 91)

    • In a worst-case scenario, the European Commission could crack down on speech at will whenever it decrees a crisis and force platforms to “mitigate risks”. This would prevent citizens from accessing information and sharing views, handing extraordinary power to bureaucrats to control narratives in times of upheaval.

Is there recourse for a censored individual or platform forced to comply with the DSA?

The DSA severely limits the power of national courts to protect citizens’ free speech rights. National courts become the long arm of the Commission’s censorship. International appeal is possible but costly and onerous.

Appeal Options for Individuals

A censored individual can try to appeal directly to the platform, use a certified out-of-court dispute resolution mechanism, or appeal to the Digital Services Coordinator. While the out-of-court dispute settlement bodies offer a relatively easy appeal option (5 euros for the individual to submit a complaint), their decisions are not binding, and platforms are only required to engage with them in good faith. If a platform does not, the individual is left with only more expensive and lengthy judicial recourse. Faced with that reality, many are likely simply to submit to censorship or preemptively self-censor.

Judicial Recourse

Individuals or the platform can technically challenge censorship in national courts, but the courts are required to comply with Commission decisions. Article 82 states: a “national court shall not take any decision which runs counter to that Commission decision. National courts shall also avoid taking decisions which could conflict with a decision contemplated by the Commission in proceedings”.

Individuals or platforms can take their cases to the Court of Justice of the European Union (CJEU), but this is a complex and costly process with strict requirements. A CJEU ruling typically takes one to two years, sometimes longer, and the Court rarely grants interim relief measures.

Is the DSA a problem only for Europe?

The DSA is a digital gag order with global consequences because it can censor you no matter where you live. Because the DSA applies to “Very Large Online Platforms” and search engines accessed within the EU but with a global presence, DSA censorship impacts the entire world.

Extraterritorial Applicability

The DSA explicitly states its extraterritorial applicability as it covers platforms used by people “that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services [the platforms] have their place of establishment”. (Article 2(1))

While the DSA states in Article 9(2)(b) that takedown orders should be “limited to what is strictly necessary to achieve its objective,” there remain grave extraterritorial concerns.

De Facto Global Censorship Standards

Platforms may be inclined to adapt their international content moderation policies to EU censorship. If platforms deem something “illegal” under EU rules, that content may be banned everywhere, even in countries with strong free speech protections.

In its letter to European Commissioner Henna Virkkunen, the U.S. House Judiciary Committee wrote: “Though nominally applicable to only EU speech, the DSA, as written, may limit or restrict Americans’ constitutionally protected speech in the United States. Companies that censor an insufficient amount of ‘misleading or deceptive’ speech—as defined by EU bureaucrats—face fines up to six percent of global revenue, which would amount to billions of dollars for many American companies. Furthermore, because many social media platforms generally maintain one set of content moderation policies that they apply globally, restrictive censorship laws like the DSA may set de facto global censorship standards.”

Europe in the Dark

Individuals outside of Europe could find themselves censored within Europe. This could happen even to a head of state or an individual with enormous international reach. In the worst case, blocking content from reaching the roughly 450 million inhabitants of the European Union has the potential to cut an entire continent out of the conversation—a draconian move with world-changing impact.

What is ADF International doing to challenge the DSA?

The DSA is irreconcilable with the human right to free speech. It must be repealed or substantially reformed to protect open discourse and fundamental freedoms in the EU and across the world. We cannot allow the DSA to become the global model for digital speech control.

ADF International is committed to challenging violations of free speech resulting from the DSA and building critical momentum to repeal or substantially reform this censorial framework. We are working to amend or strike down the parts of the DSA that undermine freedom of expression.

There is no disagreement that certain expression is illegal (e.g., child exploitation, incitement to terrorism), and every social media platform has a legal obligation to restrict such content. The DSA goes far beyond this. It has created a censorship mega-structure to ban “illegal content” without meaningfully defining what “illegal content” is. Over time, this mega-structure could censor speech that any person in any EU country considers “illegal” under whatever law is in force now or may be passed in the future. Behind the 100+ pages of complex legislation hides a blank cheque for censorship.

What can be done to challenge the DSA at the European level?

  • Equip Member States to initiate an action for annulment before the CJEU – Articles 277 and 263 of the Treaty on the Functioning of the EU (TFEU): Grounds to invoke include the lack of competence of the Commission, an infringement of the Treaties and the EU Charter (free speech), and a misuse of powers. This could result in having the DSA or parts of it declared “inapplicable”.

  • Mobilize Member States in the Council to repeal the DSA through a political decision: Repealing legislation once adopted is very difficult, and the procedure is similar to that for adopting the legislation. The Commission could initiate the repeal, but that appears politically unlikely. Instead, Member States in the Council can build a critical mass and take action.

  • Preliminary reference procedure before the CJEU – Article 267 TFEU: In the course of national litigation, any party, or the judge ex officio, can raise a question of EU law, particularly on its interpretation. Such questions could include the conformity of the DSA (e.g., the definition of illegal content under Article 3(h) and the obligation to act against illegal content under Article 9(2)(b)) with Article 11 of the EU Charter (freedom of expression and information). The decision to submit a reference to the CJEU rests entirely with the national judge, except where the case is before a court of last instance and a question of interpretation of EU law is necessary to decide the issue, in which case a reference is mandatory.

  • Engage in the DSA review process: According to Article 91 of the DSA, by 17 November 2025, the Commission shall evaluate and report to the European Parliament, the Council, and the European Economic and Social Committee. The scope of this first review is limited, and it will be followed by another review in 2027 and then every five years.

How the EU Digital Services Act (DSA) Affects Online Free Speech in 2025

Nicknamed the ‘Digital Surveillance Act’, the EU’s key online platform legislation hit its one-year mark in February 2025

Dr. Adina Portaru

Senior Counsel, Europe, ADF International

The European Digital Services Act (DSA), which took effect last February, has been hailed as a landmark law designed to bring order to the digital world. Yet beneath the surface of its promise to protect democracy lies a framework fraught with overreach, ambiguity, and the erosion of fundamental freedoms.

The EU Commission claims that the Digital Services Act is needed to “protect democracy” by tackling so-called “misinformation”, “disinformation” and “hate speech” online. It promises to create a safer online space by holding digital platforms—particularly “Very Large Online Platforms” (VLOPs) such as Google, Amazon, Meta and X—accountable for policing such content.

However, its implementation raises grave concerns. By mandating the removal of broadly defined “harmful” content, this legislation sets the stage for widespread censorship, curtailing lawful and truthful speech under the guise of compliance and safety. The result will be a sanitized and tightly controlled internet where the free exchange of ideas is stifled.

Ultimately, the EU Digital Services Act will allow the silencing of views online that are disfavoured by those in power.

Freedom of speech is the cornerstone of a democratic society and includes the right to voice unpopular or controversial opinions. For this reason, ADF International is committed to ensuring that the right to freedom of speech is firmly upheld.

The Implications for Free Speech

The Digital Services Act’s regulatory framework has profound implications for free speech. 

Under the DSA, tech platforms must act against “illegal content”, removing or blocking access to such material within a certain timeframe. However, the definition of “illegal content” is notably broad, encompassing vague terms like “hate speech”—a major part of the DSA’s focus.

The DSA relies on the EU Framework Decision of 28 November 2008, which defines “hate speech” as incitement to violence or hatred against a protected group of persons or a member of such a group. This circular definition of “hate speech” as incitement to hatred is problematic because it fails to specify what “hate” entails. 

Due to their vague and subjective nature, “hate speech” laws lead to inconsistent interpretation and enforcement, relying more on individual perception rather than clear, objective harm. Furthermore, the lack of a uniform definition at the EU level means that what is considered “illegal” in one country might be legal in another.

Given all this, tech platforms face the impossible task of enforcing uniform standards across the EU.

The effects of the DSA will not be confined to Europe. There are legitimate worries that the DSA could censor the speech of citizens worldwide, as tech companies may impose stricter content regulations globally to comply with European requirements.

How will the EU DSA impact your freedom of speech in 2025?

Big Tech Platforms

Tech platforms aren’t just removing clear violations—they’ve also started removing speech that could be flagged as “harmful”. If you post a political opinion or share a tweet that some might find offensive, it might get flagged by an algorithm. To avoid massive fines or penalties, platforms will err on the side of caution and remove your post, even if it’s perfectly lawful.

Platforms rely on the automated removal of “harmful” information. These tools are widely known to be inaccurate, often fail to consider context, and therefore flag important and legal content. And if it’s not the algorithms that flag your content, it may be regular users who disagree with what you’re saying.

Alleged “Hate Speech” Case

There are many instances in which “hate speech” laws have targeted individuals for peacefully expressing their views online, even before the DSA came into effect. ADF International is supporting the legal defence of Päivi Räsänen, a Finnish Parliamentarian and grandmother of 12, who stands criminally charged for “hate speech”.

Päivi shared her faith-based views on marriage and sexual ethics in a 2019 tweet, a radio show, and in a 2004 pamphlet that she wrote for her church, centred on the Biblical text “male and female he created them”.

Päivi endured two trials and years of public scrutiny before she was unanimously acquitted of “hate speech” charges by both the Helsinki District Court and the Court of Appeal. Despite her acquittal, the state prosecutor has appealed the case, taking it to the Finnish Supreme Court.

It’s obvious that these laws aren’t only about combatting hate and violence; rather, they may target any speech deemed controversial or that challenges the status quo.

Penalties for Non-Compliance with the EU Digital Services Act

The penalties for failing to comply with the EU Digital Services Act are severe.

Non-compliant platforms with more than 45 million active users could be fined up to 6% of their global annual turnover. For tech platforms like Google, Amazon, Meta, and X, this means billions of euros. So, even the biggest tech companies can’t afford to fall short of the DSA regulations.

If a platform repeatedly fails to comply with the DSA, the EU Commission can impose a temporary or permanent ban, which could result in the platform’s exclusion from the EU market entirely. For platforms that rely heavily on this market, this would mean losing access to one of the world’s largest digital markets.

The risks are high, and tech platforms will scramble to ensure they comply—sometimes at the expense of your fundamental right to free speech.

Section 230, the DSA, and the UK Online Safety Act

The US, the EU, and the UK take different approaches to regulating online speech. While Section 230 protects platforms from liability in the US, the Digital Services Act and the UK Online Safety Act enforce stricter content moderation rules, requiring platforms to remove “illegal” and “harmful” content or face severe penalties.

Below is a comparison of how each framework handles platform liability, free speech, and government oversight:

| Feature | USA (Section 230) | EU (Digital Services Act) | UK (Online Safety Act) |
| --- | --- | --- | --- |
| Legal Basis | First Amendment protects free speech; Section 230 shields platforms from liability. | EU regulation on transparency and accountability, resulting in content moderation. | UK law regulating online content to prevent harm, with strict enforcement. |
| Platform Liability | Section 230 protects platforms from liability for most user-generated content. | Large platforms must remove illegal content or face penalties. | Platforms must remove harmful but legal content or face fines. |
| “Hate Speech” | Protected unless it incites imminent violence. | Platforms must remove illegal “hate speech”. | Requires platforms to remove content deemed harmful, even if legal. |
| “Misinformation” | Generally protected under free speech laws. | Platforms must take action against “systemic risks” like “disinformation”. | Platforms must mitigate risks from “misinformation”, especially for children. |
| Government Censorship | The government cannot censor speech except in rare cases (e.g., incitement to violence). | “Trusted flaggers” can flag content for removal, but independent oversight applies. | The regulator (Ofcom) enforces rules, and platforms must comply. |

“Shadow Content Banning”

In the digital age, we rely increasingly on digital technology to impart and receive information. And it’s essential that the free flow of information is not controlled by unaccountable gatekeepers policing what can and cannot be said.

ADF International’s stance is clear: this legislation will result in dangerous overreach that threatens the very freedoms it claims to protect.

In January, our legal team attended a plenary session and debate at the EU Parliament in Strasbourg regarding the enforcement of the DSA. The discussion brought to light significant concerns across the political spectrum about how the DSA may impact freedom of speech and expression, and rightfully so.

Several members of the EU Parliament (MEPs), who initially favoured the legislation, raised serious objections to the DSA, with some calling for its revision or annulment. A significant point of contention was the potential for what they termed “shadow content banning”—removing content without adequate transparency.

This includes cases where users might be unaware of why their content was banned, on what legal basis, or how they can appeal such decisions. Most of the time, they’re left with nothing but a generic AI response and no explanation. 

Some MEPs, like French MEP Virginie Joron, referred to the DSA as the “Digital Surveillance Act”.

Despite intense opposition, the EU Commission representative and the Council of the EU representative promised to enforce the DSA more rigorously. They vowed to double down on enforcement through more thorough fact-checking and anti-“hate speech” measures, “so that ‘hate speech’ is flagged and assessed [within] 24 hours and removed when necessary”.

They failed to provide comprehensive responses to the concerns raised about the DSA’s potential to erode fundamental rights, leaving critical questions about its implementation and implications unresolved.

Conclusion: EU Digital Services Act or “Digital Surveillance Act”?

The EU Digital Services Act’s enforcement mechanisms are riddled with ambiguity. Terms like “misinformation,” “disinformation,” and “hate speech” are too broad and vague to serve as a proper basis for silencing speech. These terms are too easily weaponized, enabling those in power to police dialogue and suppress dissent in the name of safety.

By placing excessive pressure on platforms to moderate content, the DSA risks creating an internet governed by fear—fear of fines, fear of bans, and fear of expressing one’s views. If the DSA is allowed to stifle open dialogue and suppress legitimate debate, it will undermine the very democratic principles it claims to protect.

Policymakers must revisit this legislation, ensuring that efforts to regulate the digital sphere do not come at the cost of fundamental freedoms.

Europe’s commitment to freedom of speech demands better. Through our office in Brussels, we at ADF International are challenging this legislation because it’s not up to governments or unaccountable bureaucrats to impose a narrow view of acceptable speech on society.

EU doubles down on social media censorship that ‘will not be confined to Europe’ following concerns about Musk’s free speech policy on X

  • Members of the European Parliament debated the controversial Digital Services Act on Tuesday, which censors free speech both within and outside the EU, and could affect America
  • EU’s censorship stance in marked contrast with US, where President Trump this week signed Executive Order to end government censorship

STRASBOURG (24 January) – The European Union this week doubled down on social media censorship to “protect democracy” from “foreign interference”, following concerns about Elon Musk’s free speech policy on X.

The Digital Services Act (DSA), which came into full force in February 2024, is an EU regulation that aims to tackle “misinformation”, “disinformation”, and “hate speech” online.

By requiring the removal of so-called “illegal content” on social media platforms, it censors free speech both within and outside the EU and could even affect the speech of US citizens online.

Members of the European Parliament (MEPs) debated enforcement of the controversial act on Tuesday. 

MEP Iratxe García, leader of the Progressive Alliance of Socialists and Democrats, commented:

“In recent months, we have seen how Elon Musk and his social network X have become the main promoter for the far right by supporting Donald Trump and Alice Weidel’s AfD party through fake news and hate messages.

We have also witnessed Mark Zuckerberg’s decision to remove fact-checking programs on Meta as an act of complicity with lies and manipulation… We must ensure the effective application of our rules and we must sanction those who break the rules.”

The European Commissioner in charge of enforcing the DSA, Henna Virkkunen, announced a number of measures to further crack down on speech, including doubling the number of staff working on enforcement from 100 to 200 by the end of 2025. 

This puts the EU’s online free speech stance in stark opposition to that of the US, following President Trump’s signing of an Executive Order this week to end government censorship.

Although Virkkunen claimed the DSA “does not censor content”, MEPs from across the political spectrum voiced well-founded concerns that, in fact, it does.

Hungarian MEP Schaller-Baross Ernő said:

“Let’s call a spade a spade! In its current form, the DSA can also serve as a tool for political censorship…

“I’m afraid that in Europe the left… is not learning again. But this DSA must be abolished in this form. We don’t need more officials in Europe who censor…

“Freedom of expression and equal conditions must be ensured. This is the foundation of our democracy. Let’s say no to political censorship!”

Polish MEP Ewa Zajączkowska-Hernik said:

“For you, democracy is when people think, write and speak directly and say what you tell them to with your leftist way of thinking.

“Right-wing and conservative views are ‘thought crime’ and today’s debate should be called ‘The need to strengthen censorship to protect the trough of those who govern the European Union’.”

In addition to institutionalising censorship, the DSA also lays the ground for shadow banning, which was highlighted in this week’s debate.

Paul Coleman, executive director of ADF International, a global organisation dedicated to the protection of fundamental freedoms, including at the EU institutions, stated:

“On Monday, President Trump signed an executive order to end the weaponisation of the US government to promote censorship.

“On Tuesday, the European Commission made clear that it will be increasing its efforts to suppress speech, arguing that the Digital Services Act is needed to ‘protect democracy’ from so-called ‘misinformation’, ‘disinformation’ and ‘hate speech’ online.

“We are living in a new bipolar order of speech. On the one hand, Europe is doubling down on censorship, while the US is recommitting to its free speech heritage.

“This will usher in an unprecedented era of tension within the West itself over this most basic of human rights, and it is the responsibility of all who value freedom to side with the protection of free speech.

“As we saw clearly from Thierry Breton’s letter to Elon Musk this summer, warning him not to breach the DSA ahead of his interview with Trump, the DSA will be used to censor views disfavoured by those in power.

“The DSA poses a grave threat to the fundamental right to freedom of expression, guaranteed to every person under international law. It is not the place of any authority to impose a narrow view of acceptable speech on the rest of society.

“The effects of the DSA will not be confined to Europe. There are legitimate worries that the DSA could censor the speech of citizens across the world, as social media companies could regulate their content globally to comply with European standards.”

US Response to DSA

In response to former Commissioner Thierry Breton’s letter to Musk this summer, Congressman Jim Jordan, chairman of the US House Judiciary Committee, wrote a strongly worded letter to Mr Breton.

In it, he said:

“We write to reiterate our position that the EU’s burdensome regulation of online speech must not infringe on protected American speech…

“Your threats against free speech do not occur in a vacuum, and the consequences are not limited to Europe. The harms caused by EU-imposed censorship spill across international borders, as many platforms generally maintain one set of content moderation policies that they apply globally.

“Thus, the EU’s regulatory censorship regime may limit what content Americans can view in the United States. American companies also have an enormous incentive to comply with the DSA and public threats from EU commissioners like you.”

Increasing Censorship Efforts

Other measures announced by Virkkunen this week include making a previously voluntary code of conduct on “illegal hate speech online” legally binding and advancing a framework called the European Democracy Shield (EDS).

The EDS uses fact checkers and NGOs to combat so-called “foreign information manipulation, interference, and disinformation”.

Anyone, be it an individual or an entity, can flag content they believe to be illegal.

Under the DSA, social media platforms can face massive fines of up to 6% of global annual turnover for failing to remove so-called “misinformation”, “disinformation” and “hate speech”.

The concept of “hate speech” has no basis in international human rights law.

Because of their loose and vague nature, prohibitions on “hate speech” rely on the subjective perception of offended parties rather than objective harm.

Further, the definition of “hate speech” is not harmonised at the EU level, meaning that what is deemed illegal in one country may not be in another.
