Chile’s Congress Calls for Immediate Suspension of “Gender Transition” Programs for Children 

  • Chamber of Deputies adopts groundbreaking report calling for the immediate suspension of “gender transition” programs for minors and a full legislative overhaul in Chile 
  • Investigatory commission found off-label use of puberty blockers, lack of parental consent, and public funding for unapproved medical interventions in children 

SANTIAGO (16 May 2025) – In a landmark move, Chile’s Chamber of Deputies has adopted the findings of a Special Investigatory Commission calling for the immediate suspension of government programs that promote the medicalized transition of minors.

The Commission’s report was adopted on Thursday by the majority of the deputies present in the Chamber. It details systemic medical, legal, and ethical failings in the state’s handling of children and adolescents who experience gender-related distress. 

“Chile has become the first country in Latin America to confront the harms of the gender-affirming model through a democratic process.”

“Chile has become the first country in Latin America to confront the harms of the gender-affirming model through a democratic process. Congress has taken a courageous step in protecting children from the irreversible dangers of so-called ‘gender transition’,” said Tomás Henríquez, Director of Latin America Advocacy for ADF International. 

“The Commission found that programs like PAIG – Crece con Orgullo and the Trans Health Program (PST) have operated as a pipeline to irreversible medical interventions, including puberty blockers and cross-sex hormones for children as young as ten, without scientific basis, regulatory oversight, or parental consent.”

The report highlights the following major findings: 

  • Children as young as three years old were referred for gender identity programs. 
  • 1,716 minors were identified as receiving, or in line to receive, hormone therapies in 2023 alone.
  • None of the drugs used—including GnRH analogues—have been approved by Chile’s public health regulator for gender dysphoria in children. 
  • Parental consent procedures were absent or inconsistent, and some interventions proceeded without it. 

The report states: 

“It is clear that the current programs, under the guise of accompaniment, have operated as a gateway to irreversible medical and hormonal transition for children, without the necessary scientific, ethical, or legal safeguards.” 

 It further states: 

“The therapeutic indication of these treatments in minors lacks adequate evidence and carries high risks. The principles of medicine—primum non nocere (first, do no harm)—have been disregarded.” 

 The report calls for: 

  • Immediate suspension of the PAIG and PST programs 
  • A ban on hormonal and surgical interventions for all minors 
  • Legislative reform of Chile’s Gender Identity Law to restore parental rights and restrict access 
  • Referral to the Public Prosecutor’s Office for possible criminal violations 

It is widely expected that the Congress will now move to legislatively bar the use of puberty blockers, cross-sex hormones, and so-called surgical transitioning for minors.

The report follows last year’s vote introducing an amendment to ban the use of public funds for “gender transitioning” of children. That amendment was later struck down by the Constitutional Court on separation-of-powers grounds, but this vote reveals a consolidated majority of lawmakers in favor of restricting “gender transition” for minors.

Henríquez added:

“This is a turning point—not only for Chile, but for the entire region, in the disavowal of the lie of gender ideology. Lawmakers have listened to the evidence, the science, and the voices of parents. The so-called gender-affirming model is collapsing globally, and Chile is now leading Latin America toward a more responsible and ethical approach to gender dysphoria in youth.”

The report mirrors international developments such as the UK’s Cass Review, which concluded that the so-called “gender-affirming approach” lacks an evidence base and places children at risk. It also follows the release of the US Department of Health and Human Services Gender Dysphoria Report in April. 

ADF International has urged Chilean authorities to implement the Commission’s recommendations without delay and to ensure that all children receive compassionate, evidence-based psychological support without being steered toward dangerous life-altering medical procedures. 

Images for free use in print or online in relation to this story only

PICTURED: Tomás Henríquez

As Albanese claims electoral victory, U.S. State Dept warns Australia: Don’t censor free speech on 𝕏

  •  U.S. State Dept. “deeply concerned” about foreign censorship on U.S. social media platforms – including Australia’s censorship of Canadian campaigner “Billboard Chris” (Chris Elston)
  • Elston legally challenged the Australian eSafety Commissioner for censoring his post on gender ideology last month. ADF International supported the case

MELBOURNE (5 May 2025) – As the Labor Party claims victory in Australia’s election, the U.S. State Department’s Bureau of Democracy, Human Rights, and Labor has issued a warning to the government not to censor free speech on U.S. social media platforms.

Listed as an example of such “concerning” behaviour is the decision of the Australian eSafety Commissioner to require Musk’s 𝕏 to censor Canadian campaigner “Billboard Chris” (Chris Elston), who posted a criticism of gender ideology, and used biologically accurate pronouns to describe an Australian “transgender” activist, in a now “geo-blocked” post in 2024. 

Elston is a public campaigner against puberty blockers being given to children.

“If our free speech can't be protected when we speak out against the greatest child abuse scandal in the world right now, when can it be?”

The State Department’s statement, released on social media, reads:

“The Department of State is deeply concerned about efforts by governments to coerce American tech companies into targeting individuals for censorship. Freedom of expression must be protected – online and offline.

“Examples of this conduct are troublingly numerous. EU Commissioner Thierry Breton threatened X for hosting political speech; Türkiye fined Meta for refusing to restrict content about protests; and Australia required X to remove a post criticizing an individual for promoting gender ideology.

“Even when content may be objectionable, censorship undermines democracy, suppresses political opponents, and degrades public safety. The United States opposes efforts to undermine freedom of expression. As @SecRubio said, our diplomacy will continue to place an emphasis on promoting fundamental freedoms.”

Reacting to the news of the State Department’s intervention, Chris Elston (“Billboard Chris”) said: 

“It’s tremendous to have the State Department support what we all know is true: free speech is a fundamental right, critical to a democratic society. 

If our free speech can’t be protected when we speak out against the greatest child abuse scandal in the world right now, when can it be?” 

Both 𝕏 and Billboard Chris, who was supported by ADF International and the Australian Human Rights Law Alliance, legally challenged the decision in Melbourne last month. The result is expected in the second half of this year. 

Australia censored post using biologically accurate pronouns to describe "transgender" activist

The Australian eSafety Commissioner defended the decision to censor Elston’s post before a Tribunal in Melbourne last month by arguing that a post using the biologically accurate pronouns of a transgender activist was “likely …intended to have an effect of causing serious harm” and should therefore be subject to state-enforced censorship, in accordance with Australia’s Online Safety Act.

The post in question, which was subject to a “removal notice” at the hands of the eSafety Commissioner in April 2024, shared a Daily Mail article headlined “Kinky secrets of UN trans expert REVEALED: Australian activist plugs bondage, bestiality, nudism, drugs, and tax-funded sex-change ops – so why is he writing health advice for the world body?” and which included pictures posted on social media by transgender activist, and WHO expert panel appointee, Teddy Cook.  

In February 2024, Canadian internet sensation and children’s safety campaigner “Billboard Chris” (Chris Elston) took to U.S. social media platform “X” to share the article, adding the comment:

“This woman (yes, she’s female) is part of a panel of 20 ‘experts’ hired by the @WHO to draft their policy on caring for ‘trans people.’”

“People who belong in psychiatric wards are writing the guidelines for people who belong in psychiatric wards.” 

The takedown order was legally challenged by Elon Musk’s platform “X”, and by Elston. ADF International and the Human Rights Law Alliance are supporting Elston’s legal case.  

In his evidence, Elston told the Tribunal that while the first sentence of the tweet was a specific comment on the Daily Mail’s story about Teddy Cook, his second sentence was intended more broadly as a political comment on the ideological bias among those in positions of power and influence who write gender policy around the world.

Speaking on the witness stand, Elston added: 

“It’s damaging to teach children they are born in the wrong body…children are beautiful just as they are. No drugs or scalpels needed.” 

Asked further about why he chose to post on this matter, Elston explained: “Because the World Health Organisation has global influence. We should have evidence-based care.” 

Freedom of political communication is protected as an implied right under the Australian Constitution. 

Robert Clarke, Director of Advocacy for ADF International, which is backing Elston’s legal defence, said: 

“The decision of Australian authorities to prevent Australian citizens from hearing and evaluating information about gender ideology is a patronizing affront to the principles of democracy.  

“The confidence of the Australian eSafety Commissioner to censor citizens of Canada on an American platform shows the truly global nature of the free speech crisis.

“Speaking up for free speech is critical at this juncture, and we’re proud to be backing Billboard Chris as he does just that.”  

Members of the public are invited to support Chris’s legal case here: https://adfinternational.org/campaign/support-billboard-chris   

Images for free use in print or online in relation to this story only

Pictured: (1,2) “Billboard Chris” (Chris Elston) engaging in street activism; (3) Chris Elston with the ADF International team supporting his case; (4) Chris Elston with Lois McLatchie Miller (ADF International) in Sydney

U.S. State Department calls out Australian government for “coercing” Musk’s 𝕏 to censor truth on gender

  • “Censorship undermines democracy, suppresses political opponents, and degrades public safety”, reads U.S. State Dept. statement calling out Australia’s censorship of Canadian campaigner “Billboard Chris” (Chris Elston)
  • Elston legally challenged the Australian eSafety Commissioner for censoring his post on gender ideology last month. ADF International supported the case

Washington, D.C. (3 May 2025) – The U.S. State Department’s Bureau of Democracy, Human Rights, and Labor has called out foreign governments that “coerce” U.S. social media platforms into censoring users for speaking the truth.

Listed as an example of such “concerning” behaviour is the decision of the Australian eSafety Commissioner  to require Musk’s 𝕏 to censor Canadian campaigner “Billboard Chris” (Chris Elston), who posted a criticism of gender ideology, and used biologically accurate pronouns to describe an Australian “transgender” activist, in a now “geo-blocked” post in 2024. 

Elston is a public campaigner against puberty blockers being given to children.

“If our free speech can't be protected when we speak out against the greatest child abuse scandal in the world right now, when can it be?”

Reacting to the news of the State Department’s intervention, Chris Elston (“Billboard Chris”) said: 

“It’s tremendous to have the State Department support what we all know is true: free speech is a fundamental right, critical to a democratic society. 

If our free speech can’t be protected when we speak out against the greatest child abuse scandal in the world right now, when can it be?” 

Both 𝕏 and Billboard Chris, who was supported by ADF International and the Australian Human Rights Law Alliance, legally challenged the decision in Melbourne last month. The result is expected in the second half of this year. 

The State Department’s statement, released on social media, reads:

“The Department of State is deeply concerned about efforts by governments to coerce American tech companies into targeting individuals for censorship. Freedom of expression must be protected – online and offline.

“Examples of this conduct are troublingly numerous. EU Commissioner Thierry Breton threatened X for hosting political speech; Türkiye fined Meta for refusing to restrict content about protests; and Australia required X to remove a post criticizing an individual for promoting gender ideology.

“Even when content may be objectionable, censorship undermines democracy, suppresses political opponents, and degrades public safety. The United States opposes efforts to undermine freedom of expression. As @SecRubio said, our diplomacy will continue to place an emphasis on promoting fundamental freedoms.”

Australia censored post using biologically accurate pronouns to describe "transgender" activist

The Australian eSafety Commissioner defended the decision to censor Elston’s post before a Tribunal in Melbourne last month by arguing that a post using the biologically accurate pronouns of a transgender activist was “likely …intended to have an effect of causing serious harm” and should therefore be subject to state-enforced censorship, in accordance with Australia’s Online Safety Act.

The post in question, which was subject to a “removal notice” at the hands of the eSafety Commissioner in April 2024, shared a Daily Mail article headlined “Kinky secrets of UN trans expert REVEALED: Australian activist plugs bondage, bestiality, nudism, drugs, and tax-funded sex-change ops – so why is he writing health advice for the world body?” and which included pictures posted on social media by transgender activist, and WHO expert panel appointee, Teddy Cook.  

In February 2024, Canadian internet sensation and children’s safety campaigner “Billboard Chris” (Chris Elston) took to U.S. social media platform “X” to share the article, adding the comment:

“This woman (yes, she’s female) is part of a panel of 20 ‘experts’ hired by the @WHO to draft their policy on caring for ‘trans people.’”

“People who belong in psychiatric wards are writing the guidelines for people who belong in psychiatric wards.” 

The takedown order was legally challenged by Elon Musk’s platform “X”, and by Elston. ADF International and the Human Rights Law Alliance are supporting Elston’s legal case.  

In his evidence, Elston told the Tribunal that while the first sentence of the tweet was a specific comment on the Daily Mail’s story about Teddy Cook, his second sentence was intended more broadly as a political comment on the ideological bias among those in positions of power and influence who write gender policy around the world.

Speaking on the witness stand, Elston added: 

“It’s damaging to teach children they are born in the wrong body…children are beautiful just as they are. No drugs or scalpels needed.” 

Asked further about why he chose to post on this matter, Elston explained: “Because the World Health Organisation has global influence. We should have evidence-based care.” 

Freedom of political communication is protected as an implied right under the Australian Constitution. 

Robert Clarke, Director of Advocacy for ADF International, which is backing Elston’s legal defence, said: 

“The decision of Australian authorities to prevent Australian citizens from hearing and evaluating information about gender ideology is a patronizing affront to the principles of democracy.  

“The confidence of the Australian eSafety Commissioner to censor citizens of Canada on an American platform shows the truly global nature of the free speech crisis.

“Speaking up for free speech is critical at this juncture, and we’re proud to be backing Billboard Chris as he does just that.”  

Members of the public are invited to support Chris’s legal case here: https://adfinternational.org/campaign/support-billboard-chris   

Images for free use in print or online in relation to this story only

Pictured: “Billboard Chris” (Chris Elston); Chris Elston with the ADF International team supporting his case

Unpacking the EU Digital Services Act

Given the impact of digital services on the online and offline world, states (or, in this case, a supranational union with delegated powers) are increasingly seeking to regulate this domain. We live in an age where Big Tech holds unprecedented power: the annual revenues of these giants exceed the annual budgets of many states. The DSA is the EU’s first comprehensive and binding regulation of digital service providers in more than twenty years.

What is the Digital Services Act?

Although it purports to create “a safe online environment,” the DSA is among the most dangerous censorship regimes of the digital age.

The DSA is a legally binding regulatory framework that gives the European Commission authority to enforce “content moderation” on very large online platforms and search engines (those with more than 45 million users per month) that are established, or offer their services, in the EU.

Most of its provisions came into force in February 2024. Platforms that fail to comply with the regulation face massive financial penalties and even suspension. Through platforms’ compliance with the DSA, individuals can face censorship, suspension from online platforms, and criminal prosecution (under national law).

The stated objective of the DSA is “ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights enshrined in the Charter [of Fundamental Rights of the EU] are effectively protected, and innovation is facilitated”.

The Commission claims that the DSA creates “legal certainty,” “greater democratic control,” and “mitigation of systemic risks, such as manipulation or disinformation”—but, in reality, it is an authoritarian censorship regime antithetical to democracy.

Why is the DSA an extreme threat to fundamental freedoms?

The DSA requires platforms to censor “illegal content,” which it broadly defines as anything that is not in compliance with EU law or the law of any Member State (Article 3(h)). This could result in the lowest common denominator for censorship across the whole EU. Furthermore, authoritarian governments could adopt the blueprint, claiming that Western liberal states endorse it.

The DSA is deeply flawed. It is built on the idea that “bad speech” is best countered by censorship rather than robust discussion. Furthermore, the DSA gives the European Commission broad power over how platforms handle speech, which undermines the free expression essential to democratic societies.

If a censorship law such as the DSA is the “gold standard,” as the Commission has described its own creation, authoritarian governments around the world will readily adopt the model.

Allowing “illegal content” to potentially be determined by one country’s vague and overreaching laws pits the DSA against international law standards that require any restrictions on speech to be precisely defined and necessary. This is extremely problematic given the increasing number of absurd so-called “hate speech” laws potentially criminalizing peaceful speech throughout Europe.

  • Example 1: Germany’s highly controversial NetzDG Law, enacted in 2017, forces digital service providers to enforce sweeping online restrictions on certain kinds of content, linking to provisions of the criminal code and including the broad offence of “insult”. A person in Germany could see something “insulting” online that they claim is illegal under German law, file a complaint under the DSA, and trigger a take-down of the content for all countries in the EU, including countries where “insult” is not a criminal offense.

  • Example 2: The DSA forces digital service providers to block specific people or messages, even those that come from outside the EU, from being heard by Europe. A Latin American president says something that a German believes violates German law. Under the DSA, that speech could be blocked (“content moderated”) from all EU countries.

How does the DSA censor speech?

The DSA is at the heart of Europe’s censorship industrial complex, consisting of a number of interwoven regulations and codes that give an unaccountable bureaucracy broad power to censor speech. Censorship occurs through vast “content moderation” networks coupled with a powerful enforcement mechanism to force platforms to comply.

“Content Moderation”

The unelected and largely unaccountable Commission has positioned itself under the DSA to enable sweeping censorship in the name of “public safety” and “democracy”. It does this through a complicated mega-structure that allows the Commission to pull the strings of censorship, making private enterprises complicit and forcing them to comply with the threat of draconian fines.

The DSA creates a censorship industrial complex consisting of an expansive web of outsourced content flaggers, national coordinators, monitoring reporters, and other authorities, with the European Commission at its head. This is a business model dependent on finding content to censor and inconsistent with the standards of the rule of law.

The structure is intentionally unnavigable, leaving the ordinary citizen unable to determine what speech is allowed. And because platforms bear the obligation to moderate content, the Commission can hide behind the DSA and claim that it is not itself censoring speech.

The DSA applies directly to all Member States without requiring national implementation. National regulators work with existing legal frameworks, and they create new structures to apply the DSA alongside domestic laws. In the event of a conflict, the DSA overrides national laws.

Content is policed by so-called “trusted flaggers,” including NGOs and private entities, and may even include law enforcement agencies like Europol. This deputizes organizations with their own agendas to enforce censorship at scale.

These “flaggers” report content that they deem “illegal” to the platform. The platform must prioritize flagged content for removal. If the platform deems the content illegal, it must quickly remove it or disable access (by geo-blocking or hiding visibility).

Very large platforms are also obligated to proactively prevent “illegal content” by conducting regular risk assessments to identify how their services may spread it. Under Article 34, these risks include “negative effects on civic discourse and electoral processes, and public security” and “effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being”. Mitigation efforts include adapting their design, terms and conditions, algorithmic systems, advertising, and content moderation (including for “hate speech”), as well as awareness-raising measures.

Enforcement

A powerful enforcement mechanism ensures compliance. Under the threat of enormous financial penalties and suspension, digital service providers are forced to censor and potentially suspend individuals, and individuals may even be criminally prosecuted.

Penalties for Individual Users:

  • If flagged content is deemed illegal by the platform following its own review, the platform must remove it or disable access and notify the account holder.

  • If individuals persistently post “illegal content,” platforms can suspend their accounts (after issuing a warning, and only in a manner that is proportionate and for a reasonable period of time).

  • Every Member State has a designated Digital Services Coordinator to enforce compliance with the DSA. The Coordinator can seek court orders ruling on the “illegal” nature of content on platforms and can then fine, and potentially suspend, online platforms. If a user posts content that the platform suspects violates criminal law “involving a threat to the life or safety of a person or persons” (Article 18(1)), the platform is required to notify the police, triggering potential domestic prosecution.

    • This could happen under one of the many over-broad “hate speech” criminal laws in Europe. If the “hate speech” were subjectively determined to threaten the life or safety of a person or persons, it is possible that even peaceful speech posing no real threat could be prosecuted (e.g., if, in the case of Päivi Räsänen, someone argued that her Bible post on Twitter endangered those who identify as LGBT).

Penalties for Platforms

  • Platforms evaluate content under the threat of crippling fines with every incentive to censor and none to uphold free speech. They face little to no punishment for unjustly banning content and enormous penalties if they refuse to censor.

  • If a platform refuses to remove or restrict access to “illegal content” after it has been flagged—especially by a “trusted flagger” or regulatory authority—the platform may face serious repercussions.

  • The Digital Service Coordinators have broad powers to investigate platforms, issue orders, impose fines, and escalate cases to the European Commission. When dealing with very large platforms, the Commission can override the Coordinators at any time, giving it direct control over censorship enforcement. For these platforms, the Commission has the same powers as the Coordinators but lacks the requirement of “independence” to which the Coordinators are subject. (Article 50(2)).

  • The Commission or national regulators can impose fines of up to 6% of the platform’s global annual turnover for non-compliance, which can amount to billions. If non-compliance persists, platforms may face periodic penalty payments. Finally, regulators can restrict access to the platform within the EU or suspend its operations.

Enhanced Enforcement

  • The planned “European Democracy Shield” will strengthen the DSA and impose even stricter regulations on online speech. Its stated aim is to protect the EU from foreign information manipulation and interference, particularly in the digital realm, focusing on the integrity of elections and political processes. Together with the DSA, it can be weaponized to target peaceful expression, further empowering unelected bureaucrats to censor.

  • The DSA grants emergency powers that allow the European Commission to demand additional censorship measures from online platforms during times of crisis, without sufficiently precise definitions or limitations.

    • Crisis is defined as “where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it” (Article 36(2)); “Such crises could result from armed conflicts or acts of terrorism, including emerging conflicts or acts of terrorism, natural disasters such as earthquakes and hurricanes, as well as from pandemics and other serious cross-border threats to public health” (para 91).

    • The Commission may adopt a decision requiring very large platforms to take certain actions in response to the crisis: 1) assess how their services contribute to a serious threat, 2) apply measures to prevent, eliminate, or limit the threat, 3) report back to the Commission on those measures.

    • The potential extraordinary measures it identifies are: “adapting content moderation processes and increasing the resources dedicated to content moderation, adapting terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness-raising measures and promoting trusted information and adapting the design of their online interfaces”. (para 91)

    • In a worst-case scenario, the European Commission could crack down on speech at will whenever it decrees a crisis and force platforms to “mitigate risks”. This would prevent citizens from accessing information and sharing views, handing extraordinary power to bureaucrats to control narratives in times of upheaval.

Is there recourse for a censored individual or platform forced to comply with the DSA?

The DSA severely limits the power of national courts to protect citizens’ free speech rights; national courts effectively become the long arm of the Commission’s censorship. International appeal is possible but costly and onerous.

Appeal Options for Individuals

A censored individual can try to appeal directly to the platform, use a certified out-of-court dispute resolution mechanism, or appeal to the Digital Services Coordinator. While the out-of-court dispute settlement bodies offer a relatively easy appeal option (5 euros for the individual to submit), their decisions are not binding, and the platforms are only required to engage in good faith. If the platform does not, it leaves the individual user with only more expensive and lengthy judicial recourse. Faced with that reality, many are likely to just submit to censorship or preemptively self-censor.

Judicial Recourse

Individuals or the platform can technically challenge censorship in national courts, but the courts are required to comply with Commission decisions. Article 82 states that a “national court shall not take any decision which runs counter to that Commission decision. National courts shall also avoid taking decisions which could conflict with a decision contemplated by the Commission in proceedings”.

Individuals or platforms can take their cases to the Court of Justice of the European Union (CJEU), but this is a complex and costly process with strict requirements. The CJEU system takes 1-2 years for a ruling, sometimes longer, and rarely grants interim relief measures.

Is the DSA a problem only for Europe?

The DSA is a digital gag order with global consequences because it can censor you no matter where you live. Because the DSA applies to “Very Large Online Platforms” and search engines accessed within the EU but with a global presence, DSA censorship impacts the entire world.

Extraterritorial Applicability

The DSA explicitly states its extraterritorial applicability as it covers platforms used by people “that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services [the platforms] have their place of establishment”. (Article 2(1))

While the DSA states in Article 9(2)(b) that takedown orders should be “limited to what is strictly necessary to achieve its objective,” there remain grave extraterritorial concerns.

De Facto Global Censorship Standards

Platforms may be inclined to adapt their international content moderation policies to EU censorship. If platforms deem something “illegal” under EU rules, that content may be banned everywhere, even in countries with strong free speech protections.

In its letter to European Commissioner Henna Virkkunen, the U.S. House Judiciary Committee wrote: “Though nominally applicable to only EU speech, the DSA, as written, may limit or restrict Americans’ constitutionally protected speech in the United States. Companies that censor an insufficient amount of ‘misleading or deceptive’ speech—as defined by EU bureaucrats—face fines up to six percent of global revenue, which would amount to billions of dollars for many American companies. Furthermore, because many social media platforms generally maintain one set of content moderation policies that they apply globally, restrictive censorship laws like the DSA may set de facto global censorship standards.”

Europe in the Dark

Individuals outside of Europe could find themselves censored within Europe. This could happen even to a head of state or an individual with enormous international reach. In the worst case, blocking content from reaching the roughly 450 million inhabitants of the European Union has the potential to cut an entire continent out of the conversation—a draconian move with world-changing impact.

What is ADF International doing to challenge the DSA?

The DSA is irreconcilable with the human right to free speech. It must be repealed or substantially reformed to protect open discourse and fundamental freedoms in the EU and across the world. We cannot allow the DSA to become the global model for digital speech control.

ADF International is committed to challenging violations of free speech resulting from the DSA and building critical momentum to repeal or substantially reform this censorial framework. We are working to amend or strike down the parts of the DSA that undermine freedom of expression.

There is no disagreement that certain expression is illegal (e.g., child exploitation, incitement to terrorism), and every social media platform has a legal obligation to restrict such content. The DSA goes far beyond this: it has created a censorship mega-structure to ban “illegal content” without defining what “illegal content” is. Over time, this mega-structure could censor speech that any person in any EU country considers “illegal” according to whatever law is either in force now or may be passed in the future. Behind the 100+ pages of complex legislation hides a blank cheque for censorship.

What can be done to challenge the DSA at the European level?

  • Equip Member States to initiate an action for annulment before the CJEU – Articles 277 and 263 of the Treaty on the Functioning of the EU (TFEU): Grounds to invoke include the lack of competence of the Commission, an infringement of the Treaties and the EU Charter (free speech), and a misuse of powers. This could result in having the DSA or parts of it declared “inapplicable”.

  • Mobilize Member States in the Council to repeal the DSA through a political decision: Repealing legislation once adopted is very difficult, and the procedure is similar to that for adopting the legislation. The Commission could initiate the repeal, but that appears politically unlikely. Instead, Member States in the Council can build a critical mass and take action.

  • Preliminary reference procedure before the CJEU – Article 267 TFEU: In the course of national litigation, any party or the judge, ex officio, can raise a question of EU law, particularly on its interpretation. Such questions could include the conformity of the DSA (e.g., the definition of illegal content under Article 3(h) and the obligation to act against illegal content under Article 9(2)(b)) with Article 11 of the EU Charter (freedom of expression and information). The decision to submit the reference to the CJEU rests entirely with the national judge, except where the case is before a court of last instance and a question of EU law must be resolved to decide the legal issue, in which case referral is mandatory.

  • Engage in the DSA review process: According to Article 91 of the DSA, by 17 November 2025, the Commission shall evaluate and report to the European Parliament, the Council, and the European Economic and Social Committee. The scope of this first review is limited, and it will be followed by another review in 2027 and then every five years.

United States of America (50th Session)

This report highlights the urgent need for the United States of America to take decisive action to protect children by respecting the fundamental human right and responsibility of parents to make decisions concerning the care, custody, and control of their children, particularly their right to protect their children from efforts to impose gender ideology in education and healthcare.


Maldives (50th Session)

This report addresses the state of freedom of religion or belief and freedom of expression in the Maldives, highlighting existing restrictions and government policies that contravene its obligations under international human rights law.


Libya (50th Session)

This report addresses the situation of religious minorities in Libya, including the impact of laws prohibiting “offenses against religion”, such as blasphemy, or restricting the peaceful propagation of one’s religion and other forms of expression deemed to create societal harm. It also highlights the de facto criminalization of apostasy, punishable by death, and the consequences of political instability and insecurity resulting from the targeted activities of violent militias and extremist groups against vulnerable religious and other minorities.


Honduras (50th Session)

This submission reports on the implementation of school-based sex education programmes in Honduras, highlighting the country’s shortcomings in meeting its obligations under international human rights law regarding the rights of the child and parental rights.
