Can YouTube decide to remove videos for disinformation?

In today's digital age, where social media platforms have become essential channels for public debate, a fundamental legal question arises: where is the boundary in Belgium between the freedom of speech of users and the right of online platforms to moderate content? A ruling by the French-speaking Brussels Enterprise Court sheds important light on this complex issue and offers crucial insights for content creators and platform operators alike.

The case of Kairos v. Google Ireland Limited: background and facts

On May 7, 2024, the French-speaking Brussels Enterprise Court ruled in a legal dispute between the asbl Kairos and Google Ireland Limited, the operator of the YouTube platform. The case revolved around a fundamental question: can an online platform remove content based on its own content guidelines, even when the user invokes freedom of speech?

Kairos, a non-profit association founded in 2013, has as its statutory purpose "defending the principles and values of a decent society and combating all forms of modern alienation." The organization publishes a bi-monthly journal entitled "Kairos" with the subtitle "journal anti productiviste pour une société décente" (anti-productivist journal for a decent society) and operates several media outlets, including a website (www.kairospresse.be) and, until December 2021, a YouTube channel.

The conflict arose when YouTube removed several videos from the Kairos channel between March and December 2021 for alleged violations of the "Policy Against Misleading Medical Information on COVID-19." The deleted videos included interviews with medical professionals and debates on COVID-19 policies that the platform said contradicted health information provided by official agencies.

Specifically, the following videos were removed:

  • March 23, 2021: An interview with Dr. Y.G.
  • April 17, 2021: An interview with Dr. C.B. titled "Les morts du Covid: un mensonge d'état?" (The COVID deaths: a state lie?)
  • June 18, 2021: A video of a debate on COVID-19 and its policies.
  • October 25, 2021: A video titled "Le camp de concentration du Covid" (The concentration camp of COVID).
  • November 18, 2021: A video of a speech by R.F.K. in Milan.
  • December 13, 2021: An interview with P.J.
  • December 15, 2021: A press conference entitled "Non à l'obligation vaccinale covid" (No to the COVID vaccination obligation).

After repeated violations, YouTube decided to remove Kairos' entire channel on December 13, 2021. Kairos sent Google a formal notice of default and requested reinstatement of the channel, but Google upheld its decision. Kairos therefore decided to initiate legal proceedings against Google.

Legal grounds of the dispute

In its claim, Kairos relied on several legal grounds:

  1. Violation of freedom of expression: Kairos argued that the removal of its content constituted an unlawful interference with its right to freedom of speech as protected by Article 10 of the European Convention on Human Rights (ECHR) and Article 19 of the Belgian Constitution.
  2. Violation of the P2B Regulation: In the alternative, Kairos argued that Google had violated Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (the P2B Regulation).
  3. Unlawful contractual provisions: In the further alternative, Kairos claimed that YouTube's COVID-19 policy was unlawful in light of the unfair terms provisions of Article VI.91/3 et seq. of the Code of Economic Law.

Google defended its decision by arguing that the removals were consistent with the contractual terms governing the relationship between the parties, and that these terms stemmed from its property rights and its freedom of enterprise.

The court's analysis: key points of the judgment

The court dealt with the case thoroughly and came to a number of important conclusions of broader relevance to the realm of digital free speech:

1. The legal context: horizontal effect of fundamental rights

The court began by noting that no specific legislative framework existed for this type of conflict at the time of the dispute. The Digital Services Act (Regulation (EU) 2022/2065) was not yet applicable at the time.

Importantly, the court recognized that, as a judicial body representing the Belgian state, it has a responsibility to indirectly safeguard the exercise of fundamental rights. This implies a certain horizontal effect of fundamental rights in relationships between private parties.

The court nuanced Google's position, which relied on the European Court of Human Rights' ruling in Appleby and Others v. United Kingdom (May 6, 2003) to deny any horizontal effect of the ECHR. The court noted that this ruling predated the emergence of digital media and that a rigid interpretation of Article 10 ECHR is incompatible with the evolution of those media.

The court recognized that Google is a major player and that the effective exercise of freedom of expression through its platform could be compromised, which warranted further examination.

2. The limits of freedom of expression

The court emphasized that the right to freedom of speech, as contained in Article 10 ECHR, is not absolute. This article itself provides that the exercise of this freedom entails "duties and responsibilities" and may be subject to restrictions necessary in a democratic society for, among other things, the protection of public health and the rights of others.

In this context, the court identified two crucial aspects that can justify restrictions on freedom of expression:

  • The rights of others (in this case, Google's property rights and entrepreneurial freedom)
  • The protection of public health (in the context of the COVID-19 pandemic)

3. The contractual relationship and tacit acceptance of terms and conditions

A key discussion point was whether Kairos had accepted the COVID-19 policy, as its YouTube channel was established before the pandemic and the introduction of the specific rules around COVID-19 information.

The court rejected Kairos' argument that it had not agreed to the rules. The court stated that Kairos, as an active journalism organization, could not claim that it was unaware of the terms, as it had accepted the general terms of use when accessing the platform's services and had continued to use YouTube despite its criticism of the policy.

The court also noted that YouTube already had policies regarding dangerous or harmful content before the health crisis, and that the specific COVID-19 policies were a logical and reasonable extension in the context of a pandemic.

4. Alternative communication channels

A crucial element in the court's assessment was the existence of alternative communication channels for Kairos. The court stated:

"Freedom of expression does not mean that a user can impose his choice of means of communication, even on a major media player, when the user has other means of communication available."

The court noted that Kairos still had other channels to spread its message, including its own website, other social networks and written press. This was critical in determining that Kairos' rights under Article 10 ECHR were not materially affected.

5. Legitimacy and proportionality of restrictions

The court ruled that the restrictions imposed by YouTube were legitimate and proportional in the specific context of the pandemic and Belgian public health law:

"The legitimacy of the removal, motivated by a precautionary choice of prudence, itself prompted by official recommendations, does not appear to be in question in the context of a pandemic and the provisions of Belgian national law relating to public health."

The court concluded that the proportionality criterion had been respected as the restriction met a "compelling social need" and was reinforced by a "social obligation to follow the recommendations of official bodies."

6. P2B regulation not applicable

Regarding Kairos' subsidiary argument based on the P2B Regulation, the court found that Kairos had not shown that it was a "business user" within the meaning of the Regulation. The Regulation applies only to businesses that offer goods or services to consumers through online intermediation services for purposes related to their commercial, industrial, craft or professional activities.

The court noted that Kairos had not monetized its YouTube channel and had not shown that it met the legal criteria, particularly regarding "offering goods or services to consumers" with a commercial activity.

7. No unfair terms

Finally, the court examined whether the COVID-19 policy constituted an unfair term within the meaning of Article VI.91/3 of the Code of Economic Law. The court found that there was no manifest imbalance between the parties' rights and obligations, as:

  • Kairos had agreed to the terms and was aware of the rules
  • Kairos was not economically dependent on YouTube, as its presence on Facebook was much larger and it also used the Odysee platform
  • The adjustment of conditions in the context of the health crisis was justified

Broader implications for digital freedom of expression in Belgium

This ruling has important implications for how we deal with freedom of expression in the digital environment:

1. Legal relationship between users and platforms

The ruling confirms that the relationship between users and platforms is primarily contractual in nature. Users accept the terms of use and community guidelines, which give the platform the right to moderate and remove content if necessary.

This means users should be aware of the terms and conditions they accept when using online services. The court acknowledged that platforms can update their terms and conditions to respond to new challenges, such as a pandemic, as long as these updates are reasonable and related to the original purposes of the terms and conditions.

2. The balance between property rights and freedom of expression

The judgment recognizes the tension between the property rights of platforms (Article 1 of the First Protocol to the ECHR) and the freedom of expression of users (Article 10 ECHR). This tension must be resolved through a careful balancing of the interests involved.

It is important to note that the court views both rights as fundamental and seeks to strike a balance. In doing so, the court relied on a ruling of the Amsterdam District Court (October 13, 2020), which held that a platform's property right can be a legitimate restriction on the freedom of expression of others, based on Article 10(2) ECHR.

3. The importance of alternative channels

A crucial element in the court's assessment was the existence of alternative communication channels. This suggests that the more a platform has a monopoly on certain forms of communication, the more careful it should be with content moderation.

For users, this means that it is wise not to run all your communications through one platform, but to build a presence on several platforms to be less vulnerable to moderation decisions.

4. Public health as a legitimate constraint

The court recognized that the protection of public health, especially in a pandemic context, is a legitimate reason for imposing restrictions on freedom of expression. This is consistent with Article 10(2) ECHR, which explicitly lists "the protection of health" as a legitimate purpose for restrictions.

This means that content potentially harmful to public health may be subject to stricter moderation, especially when it goes against the recommendations of official agencies during a health crisis.

5. The role of official guidelines

The ruling highlights the importance of official guidelines in determining the boundaries of acceptable content. The court acknowledged that YouTube could rely on the recommendations of official bodies, such as the World Health Organization and national health authorities, in its moderation decisions.

This suggests that platforms have some margin to moderate content that runs counter to widely held scientific or official views, especially in matters of public health.

Practical lessons for social media users and content creators

Based on this ruling, we can draw some practical lessons for individuals and organizations operating on social media platforms:

1. Know the terms you accept

It is essential to read and understand platforms' terms of use and guidelines before using them. These form the contractual basis of your relationship with the platform and determine what you can and cannot do.

Be aware that platforms may update these terms and that continued use of the platform after an update usually constitutes tacit acceptance of the new terms.

2. Diversify your online presence

To avoid depending on one platform, it is wise to spread your presence across different channels. This reduces the risk of your voice being completely stifled if one platform removes your content.

In addition to social media, consider investing in your own channels, such as a website or newsletter, where you have more control over your content.

3. Recognize that sensitive topics require extra attention

Content on controversial or sensitive topics, such as public health, may be subject to stricter moderation. If you publish on such topics, be sure to provide careful justification and be aware of the limits that platforms can set.

4. Document your content

It may be wise to keep copies of your content so you don't lose it if a platform decides to remove it. This can also be useful if you want to challenge the removal.

5. Make use of appeal procedures

Most platforms offer appeal procedures if your content is removed. Take advantage of this if you think your content was removed unfairly. When doing so, make sure your arguments are clear and respectful.

6. Consider legal advice in serious cases

If you believe that a platform has seriously violated your right to freedom of expression, it may be advisable to seek legal advice. As this ruling shows, the legal landscape is complex and evolving, and a specialized lawyer can help you assess your position.

The future of digital freedom of expression

The legal landscape surrounding digital freedom of expression is constantly evolving. This ruling provides a snapshot in time, but there are significant developments that will affect the future of this matter:

1. The Digital Services Act (DSA)

Since the facts of this case, the Digital Services Act (Regulation (EU) 2022/2065) has come into force. This regulation sets new rules for content moderation by online platforms and strengthens users' rights. It is likely that future content moderation disputes will be assessed in light of this new framework.

2. Developing case law

The European Court of Human Rights and national courts continue to develop new case law on the application of fundamental rights in the digital environment. This case law will further refine the boundaries of digital freedom of expression.

3. New technological developments

The rise of new technologies, such as artificial intelligence and decentralized platforms, will raise new questions about the limits of freedom of expression and the responsibilities of platforms.

Conclusion

The ruling by the French-speaking Brussels Enterprise Court in the case of Kairos v. Google Ireland Limited illustrates the complex legal questions raised by the clash between users' freedom of expression and the right of online platforms to moderate content.

The court acknowledged that freedom of expression is a fundamental right, but stressed that this right is not absolute and must be balanced against other rights and interests, such as the property rights of platforms and the protection of public health.

The ruling confirms that the relationship between users and platforms is primarily contractual in nature, and that users must be aware of the terms they accept. At the same time, the ruling recognizes the special responsibility of major platforms, given their important role in public discourse.

It is essential for social media users and content creators to understand the terms of platforms, diversify their online presence and be aware of the boundaries that platforms can set, especially around sensitive topics.

The legal landscape surrounding digital free speech continues to evolve, and future rulings will no doubt shed more light on the proper balance between the various rights and interests involved.

Joris Deene

Attorney-partner at Everest Attorneys

Contact

Questions? Need advice?
Contact Attorney Joris Deene.

Phone: 09/280.20.68
E-mail: joris.deene@everest-law.be
