Unpacking the EU Digital Services Act

Man on his phone in a digital realm design

Given the impact of digital services on the online and offline world, governments—in this case, a supranational union with delegated powers—are increasingly seeking to regulate this domain. We live in an age where Big Tech holds unprecedented power: the annual revenue of these giants exceeds the annual budgets of many states. The DSA is the EU’s first comprehensive and binding regulation of digital service providers in more than twenty years.

What is the Digital Services Act?

Although it purports to create “a safe online environment,” the DSA is among the most dangerous censorship regimes of the digital age.

The DSA is a legally binding regulatory framework that gives the European Commission authority to enforce “content moderation” on very large online platforms and search engines (those with more than 45 million users per month) that are established, or offer their services, in the EU.

Most of its provisions came into force in February 2024. Platforms that fail to comply with the regulation face massive financial penalties and even suspension. Through the platform’s compliance with the DSA, individuals can suffer censorship, suspension from online platforms, and criminal prosecution (under national law).

The stated objective of the DSA is “ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights enshrined in the Charter [of Fundamental Rights of the EU] are effectively protected, and innovation is facilitated”.

The Commission claims that the DSA creates “legal certainty,” “greater democratic control,” and “mitigation of systemic risks, such as manipulation or disinformation”—but, in reality, it is an authoritarian censorship regime antithetical to democracy.

Why is the DSA an extreme threat to fundamental freedoms?

The DSA requires platforms to censor “illegal content,” which it broadly defines as anything that is not in compliance with EU law or the law of any Member State (Article 3(h)). This could result in the lowest common denominator for censorship across the whole EU. Furthermore, authoritarian governments could adopt the blueprint, claiming that Western liberal states endorse it.

The DSA is deeply flawed. It is built on the idea that “bad speech” is best countered by censorship rather than robust discussion. Furthermore, the DSA gives the European Commission broad power over how platforms handle speech, which undermines the free expression essential to democratic societies.

If a censorship law such as the DSA is the “gold standard,” as the Commission has praised its own construct, authoritarian governments of the world will readily adopt the model.

Allowing “illegal content” to potentially be determined by one country’s vague and overreaching laws pits the DSA against international law standards that require any restrictions on speech to be precisely defined and necessary. This is extremely problematic given the increasing number of absurd so-called “hate speech” laws potentially criminalizing peaceful speech throughout Europe.

  • Example 1: Germany’s highly controversial NetzDG Law, enacted in 2017, forces digital service providers to enforce sweeping online restrictions on certain kinds of content, linking to provisions of the criminal code and including the broad offence of “insult”. A person in Germany could see something “insulting” online that they claim is illegal under German law, file a complaint under the DSA, and trigger a take-down of the content for all countries in the EU, including countries where “insult” is not a criminal offense.

  • Example 2: The DSA forces digital service providers to block specific people or messages, even those that come from outside the EU, from being heard by Europe. A Latin American president says something that a German believes violates German law. Under the DSA, that speech could be blocked (“content moderated”) from all EU countries.

How does the DSA censor speech?

The DSA is at the heart of Europe’s censorship industrial complex, consisting of a number of interwoven regulations and codes that give an unaccountable bureaucracy broad power to censor speech. Censorship occurs through vast “content moderation” networks coupled with a powerful enforcement mechanism to force platforms to comply.

“Content Moderation”

The unelected and largely unaccountable Commission has positioned itself under the DSA to enable sweeping censorship in the name of “public safety” and “democracy”. It does this through a complicated mega-structure that allows the Commission to pull the strings of censorship, making private enterprises complicit and forcing them to comply with the threat of draconian fines.

The DSA creates a censorship industrial complex consisting of an expansive web of outsourced content flaggers, national coordinators, monitoring reporters, and other authorities, with the European Commission at its head. This is a business model dependent on finding content to censor and inconsistent with the standards of the rule of law.

The structure is intentionally unnavigable: an ordinary citizen cannot determine what speech is allowable. And because the obligation to moderate content rests with the platforms, the Commission can hide behind the DSA and claim that it is not itself censoring speech.

The DSA applies directly to all Member States without requiring national implementation. National regulators work with existing legal frameworks, and they create new structures to apply the DSA alongside domestic laws. In the event of a conflict, the DSA overrides national laws.

Content is policed by so-called “trusted flaggers,” including NGOs and private entities, and may even include law enforcement agencies like Europol. This deputizes organizations with their own agendas to enforce censorship at scale.

This system of “flaggers” reports content that they deem “illegal” to the platform. The platform must prioritize flagged content for removal. If the platform deems the content illegal, it must quickly remove it or disable access (by geo-blocking or hiding visibility).

Very large platforms also are obligated to proactively prevent “illegal content” by conducting regular risk assessments to identify how their services may spread “illegal content”. Under Article 34, these include “negative effects on civic discourse and electoral processes, and public security” and “effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being”. The efforts include: adapting their design, terms and conditions, algorithmic systems, advertising, content moderation, including for “hate speech,” and awareness-raising measures.

Enforcement

A powerful enforcement mechanism ensures compliance. Under the threat of enormous financial penalties and suspension, digital service providers are forced to censor and potentially suspend individuals, and individuals may even be criminally prosecuted.

Penalties for Individual Users:

  • If, after content is flagged, the platform deems it illegal after its own review, it must remove it or disable access and notify the account.

  • If individuals persistently post “illegal content,” platforms can suspend their accounts (after having issued a warning and with an obligation to be proportionate and for a reasonable period of time).

  • Every Member State has a designated Digital Services Coordinator to enforce compliance with the DSA. The Coordinator can seek court orders to rule on the “illegal” nature of content on platforms and then fine and potentially suspend online platforms. If a user posts content that the platform suspects violates criminal laws in so far as it is “involving a threat to the life or safety of a person or persons” (Article 18(1)), the platform is required to notify the police, triggering potential domestic prosecution.

    • This could happen under one of the many over-broad “hate speech” criminal laws in Europe. If the “hate speech” was subjectively determined to threaten the life or safety of a person or persons, it is possible that even peaceful speech without a real threat could be prosecuted (e.g., if, in the case of Päivi Räsänen, someone argued that her Twitter bible post endangered those who identify as LGBT).

Penalties for Platforms

  • Platforms evaluate content under the threat of crippling fines with every incentive to censor and none to uphold free speech. They face little to no punishment for unjustly banning content and enormous penalties if they refuse to censor.

  • If a platform refuses to remove or restrict access to “illegal content” after it has been flagged—especially by a “trusted flagger” or regulatory authority—the platform may face serious repercussions.

  • The Digital Service Coordinators have broad powers to investigate platforms, issue orders, impose fines, and escalate cases to the European Commission. When dealing with very large platforms, the Commission can override the Coordinators at any time, giving it direct control over censorship enforcement. For these platforms, the Commission has the same powers as the Coordinators but lacks the requirement of “independence” to which the Coordinators are subject. (Article 50(2)).

  • The Commission or national regulators can impose fines of up to 6% of the platform’s global annual turnover for non-compliance, amounting to billions. If non-compliance persists, platforms may face periodic penalty payments. Finally, regulators can restrict access to the platform within the EU or suspend its operations.

Enhanced Enforcement

  • The planned “European Democracy Shield” will strengthen the DSA and impose even stricter regulations on online speech. Its stated aim is to protect the EU from foreign information manipulation and interference, particularly in the digital realm, focusing on the integrity of elections and political processes. Together with the DSA, it can be weaponized to target peaceful expression, further empowering unelected bureaucrats to censor.

  • The DSA grants emergency powers that allow the European Commission to demand additional censorship measures from online platforms during times of crisis, without sufficiently precise definitions or limitations.

    • Crisis is defined as “where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it” (Article 36(2)); “Such crises could result from armed conflicts or acts of terrorism, including emerging conflicts or acts of terrorism, natural disasters such as earthquakes and hurricanes, as well as from pandemics and other serious cross-border threats to public health” (para 91).

    • The Commission may adopt a decision requiring very large platforms to take certain actions in response to the crisis: 1) assess how their services contribute to a serious threat, 2) apply measures to prevent, eliminate, or limit the threat, 3) report back to the Commission on those measures.

    • The potential extraordinary measures it identifies are: “adapting content moderation processes and increasing the resources dedicated to content moderation, adapting terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness-raising measures and promoting trusted information and adapting the design of their online interfaces”. (para 91)

    • In a worst-case scenario, the European Commission could crack down on speech at will whenever it decrees a crisis and force platforms to “mitigate risks”. This would prevent citizens from accessing information and sharing views, handing extraordinary power to bureaucrats to control narratives in times of upheaval.

Is there recourse for a censored individual or platform forced to comply with the DSA?

The DSA severely limits the power of national courts to protect citizens’ free speech rights, turning them into the long arm of the Commission’s censorship. International appeal is possible but costly and onerous.

Appeal Options for Individuals

A censored individual can try to appeal directly to the platform, use a certified out-of-court dispute resolution mechanism, or appeal to the Digital Services Coordinator. While the out-of-court dispute settlement bodies offer a relatively easy appeal option (5 euros for the individual to submit), their decisions are not binding, and the platforms are only required to engage in good faith. If the platform does not, it leaves the individual user with only more expensive and lengthy judicial recourse. Faced with that reality, many are likely to just submit to censorship or preemptively self-censor.

Judicial Recourse

Individuals or the platform can technically challenge censorship in national courts, but the courts are required to comply with Commission decisions. Article 82 states: a “national court shall not take any decision which runs counter to that Commission decision. National courts shall also avoid taking decisions which could conflict with a decision contemplated by the Commission in proceedings”.

Individuals or platforms can take their cases to the Court of Justice of the European Union (CJEU), but this is a complex and costly process with strict requirements. The CJEU system takes 1-2 years for a ruling, sometimes longer, and rarely grants interim relief measures.

Is the DSA a problem only for Europe?

The DSA is a digital gag order with global consequences because it can censor you no matter where you live. Because the DSA applies to “Very Large Online Platforms” and search engines accessed within the EU but with a global presence, DSA censorship impacts the entire world.

Extraterritorial Applicability

The DSA explicitly states its extraterritorial applicability as it covers platforms used by people “that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services [the platforms] have their place of establishment”. (Article 2(1))

While the DSA states in Article 9(2)(b) that takedown orders should be “limited to what is strictly necessary to achieve its objective,” there remain grave extraterritorial concerns.

De Facto Global Censorship Standards

Platforms may be inclined to adapt their international content moderation policies to EU censorship. If platforms deem something “illegal” under EU rules, that content may be banned everywhere, even in countries with strong free speech protections.

In its letter to European Commissioner Henna Virkkunen, the U.S. House Judiciary Committee wrote: “Though nominally applicable to only EU speech, the DSA, as written, may limit or restrict Americans’ constitutionally protected speech in the United States. Companies that censor an insufficient amount of ‘misleading or deceptive’ speech—as defined by EU bureaucrats—face fines up to six percent of global revenue, which would amount to billions of dollars for many American companies. Furthermore, because many social media platforms generally maintain one set of content moderation policies that they apply globally, restrictive censorship laws like the DSA may set de facto global censorship standards.”

Europe in the Dark

Individuals outside of Europe could find themselves censored within Europe. This could happen to even a head of state or individual with enormous international reach. In the worst case, blocking content from reaching the roughly 450 million inhabitants of the European Union has the potential to cut an entire continent out of the conversation—a draconian move with world-changing impact.

What is ADF International doing to challenge the DSA?

The DSA is irreconcilable with the human right to free speech. It must be repealed or substantially reformed to protect open discourse and fundamental freedoms in the EU and across the world. We cannot allow the DSA to become the global model for digital speech control.

ADF International is committed to challenging violations of free speech resulting from the DSA and building critical momentum to repeal or substantially reform this censorial framework. We are working to amend or strike down the parts of the DSA that undermine freedom of expression.

There is no disagreement that certain expression is illegal (e.g., child exploitation, incitement to terrorism), and every social media platform has a legal obligation to restrict such content. The DSA, however, goes far beyond this: it has created a censorship mega-structure to ban “illegal content” without defining what “illegal content” is. Over time, this mega-structure could censor speech that any person in any EU country considers “illegal” according to whatever law is either in force now or may be passed in the future. Behind the 100+ pages of complex legislation hides a blank cheque for censorship.

What can be done to challenge the DSA at the European level?

  • Equip Member States to initiate an action for annulment before the CJEU – Articles 277 and 263 of the Treaty on the Functioning of the EU (TFEU): Grounds to invoke include the lack of competence of the Commission, an infringement of the Treaties and the EU Charter (free speech), and a misuse of powers. This could result in having the DSA or parts of it declared “inapplicable”.

  • Mobilize Member States in the Council to repeal the DSA through a political decision: Repealing legislation once adopted is very difficult, and the procedure is similar to that for adopting the legislation. The Commission could initiate the repeal, but that appears politically unlikely. Instead, Member States in the Council can build a critical mass and take action.

  • Preliminary reference procedure before the CJEU – Article 267 TFEU: In the course of national litigation, any party or the judge, ex officio, can raise a question of EU law, particularly on its interpretation. Such questions could include the conformity of the DSA (e.g., the definition of illegal content under Article 3(h) and the obligation to act against illegal content under Article 9(2)(b)) with Article 11 of the EU Charter (freedom of expression and information). The decision to submit the reference to the CJEU rests entirely with the national judge, except for the situation when the case is at the court of the last instance, and the question of interpretation of EU law is necessary to decide the legal question at issue.

  • Engage in the DSA review process: According to Article 91 of the DSA, by 17 November 2025, the Commission shall evaluate and report to the European Parliament, the Council, and the European Economic and Social Committee. The scope of this first review is limited, and it will be followed by another review in 2027 and then every five years.

Why Christians Face Persecution in Egypt and How There’s Hope


Christians in Egypt are under threat in a country where 10% of the population is Christian. Our Global Religious Freedom Legal Counsel, Lizzie Francis Brink, tells the story.

Lizzie Francis Brink

Legal Counsel for Global Religious Freedom with ADF International


Scripture reminds us that persecution is part of being a Christian, but few realize the scale of it today. Around the world, 1 in 7 Christians face harassment, violence, or worse simply for their faith. In Africa, this number rises to 1 in 5.

Among the persecuted are Egypt’s Christians, who live in a land of ancient wonders and rich history—yet face daily discrimination, harsh restrictions, and constant pressure to hide their faith. Despite Egypt’s status as a cultural and historical giant in Africa, daily life there remains an ongoing struggle for many believers.

In March, I travelled to Egypt to meet with some of these brave Christians and their dedicated lawyers. Together with allied lawyers on the ground, ADF International is committed to ensuring that our brothers and sisters in Egypt are free to live and speak the truth.

Christians in Egypt

Egypt has a population of approximately 111.2 million people. Christians make up roughly 10% of this population, making them the largest Christian minority in the Middle East and North Africa (MENA) region.

Around 90% of Egyptians are Sunni Muslims, making Islam the dominant religion. According to Article 2 of Egypt’s Constitution, Islam is the official state religion, and the principles of Sharia Law serve as the primary source of legislation in the country.

While Article 64 of the Egyptian Constitution guarantees “absolute” freedom of religion or belief, this right is restricted in practice. Only followers of the three recognized “heavenly religions”—Islam, Christianity, and Judaism—are legally allowed to practice their faith and build houses of worship publicly.

Additionally, Article 53 of the Egyptian Constitution protects against discrimination based on religion.

But despite Egypt’s constitutional guarantees, Christians and other minority religious groups in Egypt regularly face religious freedom violations. Discriminatory practices and laws continue to restrict the ability of Christians, Shia Muslims, Ahmadis, and other non-Sunni or non-state-sanctioned Muslim groups, along with non-Muslim communities, to express and practice their beliefs freely.

The southern part of Egypt is particularly dangerous for Christians. It is more Islamically conservative than the north and heavily influenced by Islamic extremist groups. The Salafi al-Nour party, which operates legally despite constitutional bans, thrives in these underdeveloped regions, fuelling hostility toward Christians.

The Reality of Christian Persecution in Egypt

In Egypt, most persecution against Christians happens at the community level, especially in rural areas. Christian women are harassed, children are bullied in schools, employment discrimination is common, and false accusations of blasphemy often trigger violent mob attacks. These incidents force entire Christian families to flee their homes in fear.

In more extreme cases in the past, churches have been bombed, leaving dozens dead or injured. Christian women and girls have been targeted for abduction, forced conversion, and sexual violence.

While Egyptian President el-Sisi often calls for unity and support for Christians, progress on the ground remains slow. Local authorities often overlook attacks and rarely offer protection, particularly in the south. This means that challenging religious freedom injustices can also pose risks to personal safety.

Christians also face countless obstacles in building or repairing churches. Despite government promises to legalize more churches and build new ones, Christians struggle to find safe and legal places of worship. Hostile neighbours and violent mobs often stand in the way.

Converts from Islam face even greater persecution. Many are ostracized, disowned by their families, or pressured to return to Islam. State security forces monitor, intimidate, and sometimes detain converts, silencing any attempt to openly practice their Christian faith. Egyptian law makes it nearly impossible to officially change one’s religion from Islam.

Christians also face significant human rights challenges at the institutional level. Egypt’s blasphemy laws are often used to unjustly prosecute people for actions or statements deemed offensive to the dominant religion. Penalties range from hefty fines to prison sentences and, in extreme cases, death sentences.

Egypt’s International Promises and Rights Violations

Egypt has signed global treaties that promise to protect basic human rights. These include:

  • International Covenant on Civil and Political Rights (ICCPR)
  • International Covenant on Economic, Social, and Cultural Rights (ICESCR)
  • Convention Against Torture (CAT)
  • Convention on the Elimination of Discrimination Against Women (CEDAW)
  • Convention on the Rights of the Child (CRC)

However, the reality on the ground often results in Egypt falling short of these obligations, especially when it comes to Christians:

  • Violence against Christians goes unpunished (ICCPR Art. 2)
  • Christians are unfairly accused of blasphemy (ICCPR Arts. 18 & 19)
  • Children of Christian converts are forced to register as Muslim (ICCPR Art. 18, CRC Art. 14)
  • Christians face job discrimination because of their faith (ICCPR Art. 26)
  • Churches struggle to get building permits or legal status (ICCPR Arts. 21 & 26)
  • Christian women, especially in villages, face kidnapping and forced marriage to Muslim men (ICCPR Art. 23, CEDAW Art. 16, ICESCR Art. 10)

Abdulbaqi Abdo

ADF International has been supporting the legal defence of Abdulbaqi Saeed Abdo, a Yemeni-born Christian convert and victim of religious persecution in Egypt.

In 2021, Egyptian authorities arrested him for his involvement in a private Facebook group supporting Muslims who converted to Christianity. Officials falsely accused him of “joining a terrorist group with knowledge of its purposes” and “contempt of the Islamic religion.”

For over three years, Abdulbaqi was moved between several detention and terrorism centres, subjected to terrible conditions that impacted his health. He was also repeatedly denied private family visits and access to his legal team, cutting off his ability to receive basic supplies or confidential legal support.

In August last year, Abdulbaqi sent a heartbreaking letter to his family, announcing plans to begin a hunger strike due to the ongoing injustice. He said he would refuse medical care and eventually stop eating altogether.

After years of national and international advocacy, we helped secure his release by working closely with his lawyers and raising his case before the United Nations Working Group on Arbitrary Detention and other global religious freedom experts. We argued that Egypt violated his rights to religious freedom and a fair trial.

Thanks to this combined international effort, Abdulbaqi was freed in January this year and reunited with his family. He is now safe in another country, and we continue to support him as his case remains open.

Egypt’s Online Cybercrime and Blasphemy Laws

Under Egypt’s Cybercrime Law (175/2018), people have been investigated, arrested, and prosecuted simply for expressing their beliefs online.

Article 25 bans using technology to “violate family principles or values in Egyptian society.” This vague wording gives authorities sweeping power to censor online content and criminalize peaceful religious discussions or faith-based content on social media.

At the same time, Article 98(f) of Egypt’s Penal Code criminalizes “insulting the three heavenly religions”, though it is most often used against criticisms of Islam.

Despite international pressure to end violations of freedom of religion and speech, Egypt has shown no real effort to reform or repeal these laws. Instead, religious minorities continue to be targeted.

This combination of counterterrorism laws and speech restrictions was used to justify the arbitrary detention of Abdulbaqi for over three years.

Abdulbaqi’s case is just one of various examples of persecution faced by Christians in Egypt and across Africa.

My Trip to Egypt

ADF International has been actively involved in litigation and legal advocacy in Egypt for several years, defending religious freedom and supporting Christians facing persecution.

During my trip to Egypt, I met with the Christians we defend. I listened to their stories and saw firsthand how our alliance is standing alongside them as they fight for their right to live and worship freely.

I reconnected with our dedicated network of human rights lawyers—courageous defenders working in a country where laws are often misused to suppress religious expression. It was inspiring to strengthen these partnerships and establish new connections with Christian ministries supporting those who live out their faith.

I had the privilege of meeting Mr. Emad Felix, Abdulbaqi’s primary advocate. Mr. Felix supported the Abdo family from the start of Abdulbaqi’s detention in 2021, visiting the court every few months throughout the incarceration to plead for his release. His dedication exemplifies the vital role local lawyers play in defending religious freedom.

Our ability to engage in these critical cases would not be possible without such committed partners. We are grateful for the strong working relationship we have with lawyers in Egypt and remain committed to supporting them as they take on more cases to defend the fundamental Christian freedoms that are so often taken for granted in the West.

Lizzie Francis Brink in Egypt in front of a cross.

Conclusion: Religious Freedom is Severely Restricted in Egypt, but There is Hope

Egypt’s Christians live under constant pressure—from discriminatory laws, violent attacks, and systemic injustice. Despite constitutional promises and international treaties meant to protect religious freedom, the reality presents critical challenges.

Yet, in the face of such hardship, the courage and resilience of Egypt’s Christian community are a powerful testament to the enduring hope of the Gospel. During my trip, I witnessed that hope firsthand.

At ADF International, we are deeply committed to this fight. Our work in Egypt, alongside courageous local partners, is just one part of our broader mission to defend Christians wherever they are persecuted and to hold countries accountable to their human rights obligations.

Abdulbaqi’s release is a clear example that our efforts, combined with international pressure and prayer, can bring real change—and we’ve seen similar breakthroughs elsewhere in Africa and beyond.

While the road ahead is long, we remain steadfast. The Gospel continues to shine brightest where darkness tries its hardest to overcome it. Together, we will keep standing with Egypt’s Christians until the promise of true religious freedom becomes reality.

Australian tribunal to rule on whether using biologically accurate pronouns online is grounds for censorship 

  • CASE CONTINUES: Musk’s “X” and Canadian campaigner “Billboard Chris” challenge Australian “eSafety Commissioner” for censoring online post criticizing gender ideology 
  • Testifying, campaigner “Billboard Chris” tells Tribunal: “It’s damaging to teach children they are born in the wrong body” 
  • Post referred to controversial WHO “expert” appointee Teddy Cook by her biologically accurate pronouns 

MELBOURNE (2 April 2025) – Before the Administrative Review Tribunal in Melbourne this week, the Australian eSafety Commissioner argued that a post using the biologically accurate pronouns of a transgender activist was “likely … intended to have an effect of causing serious harm” and should therefore be subject to state-enforced censorship. 

The post in question, which was subject to a “removal notice” at the hands of the eSafety Commissioner in April 2024, shared a Daily Mail article headlined “Kinky secrets of UN trans expert REVEALED: Australian activist plugs bondage, bestiality, nudism, drugs, and tax-funded sex-change ops – so why is he writing health advice for the world body?” and which included pictures posted on social media by transgender activist, and WHO expert panel appointee, Teddy Cook.  

The takedown order is being legally challenged by Elon Musk’s platform “X”, and by the author of the post, Chris Elston, known as “Billboard Chris” online.  

ADF International and the Human Rights Law Alliance are supporting Elston’s legal case.  

“I want everyone to think for themselves”

In February 2024, Canadian internet sensation and children’s safety campaigner “Billboard Chris” (Chris Elston) took to U.S. social media platform “X” to share the article, adding the comment: 

“This woman (yes, she’s female) is part of a panel of 20 ‘experts’ hired by the @WHO to draft their policy on caring for ‘trans people.’” 

“People who belong in psychiatric wards are writing the guidelines for people who belong in psychiatric wards.” 

In his evidence this week, Elston told the Tribunal that while the first sentence of the post was a specific comment on the Daily Mail’s story about Teddy Cook, his second sentence was intended more broadly, as a political comment on the ideological bias among those in positions of power and influence who write gender policy around the world. 

Speaking on the witness stand, Elston added: 

“It’s damaging to teach children they are born in the wrong body…children are beautiful just as they are. No drugs or scalpels needed.” 

Asked further about why he chose to post on this matter, Elston explained: “Because the World Health Organisation has global influence. We should have evidence-based care.” 

Under cross-examination, Elston responded, “My goal is not to provoke outrage. My goal is to simply try to educate people, and encourage discussion. I want everyone to think for themselves.” 

Freedom of political communication is protected as an implied right under the Australian Constitution. 

Defining “serious harm” to justify censorship

In accordance with Australia’s Online Safety Act 2021, the eSafety Commissioner seeks to prove that Chris Elston’s post constitutes “cyber abuse material directed at an Australian adult, including that it was likely that the material was intended to have an effect of causing serious harm”. 

Counsel for the eSafety Commissioner has suggested that Elston’s post could meet this threshold.  

Dr. Jill Redden, a consultant psychiatrist called as an expert witness, testified that using biologically accurate pronouns for somebody identifying as transgender could cause “irritation” and upset, but would not likely cross the statutory threshold to constitute “serious harm”. 

When asked how long one might expect to experience serious psychological symptoms of that severity, Dr. Redden answered “several months”. Elston’s counsel pointed out that Teddy Cook had professed on an Instagram post to be “living my best life” just nine days after the X post at the centre of this case was published. 

Media professor testifies that biologically accurate pronouns are “anti-science” 

The eSafety Commissioner called Rob Cover, Professor of Digital Communication at the Royal Melbourne Institute of Technology, as an expert witness. 

Professor Cover testified that he believes it is “harmful”, “offensive”, “untruthful”, “rude” and “anti-science” to use biologically accurate pronouns when referring to a person who identifies as transgender.  

He added that using sex-based language “adds to a kind of anti-trans rhetoric which is a common kind of misinformation…online and offline”. 

Cover considers his personal view to be “informed by science” and by “the truth of the person which wishes to be identified in that way in accordance with their reality.” 

Robert Clarke, Director of Advocacy for ADF International, which is backing Elston’s legal defence, said: 

“The decision of Australian authorities to prevent Australian citizens from hearing and evaluating information about gender ideology is a patronizing affront to the principles of democracy.  

“The confidence of the Australian eSafety Commissioner to censor citizens of Canada on an American platform shows the truly global nature of the free speech crisis. 

“Speaking up for free speech is critical at this juncture, and we’re proud to be backing Billboard Chris as he does just that.”  

Members of the public are invited to support Chris’s legal case here: https://adfinternational.org/campaign/support-billboard-chris   

Images for free use in print or online in relation to this story only

Pictured: “Billboard Chris” (Chris Elston); Chris Elston with the ADF International team supporting his case