Should online marketplaces like Amazon follow stricter rules under the Digital Services Act?

Amazon recently challenged its designation as a ‘very large online platform’ (VLOP) before the General Court of the European Union. The key question was whether an online marketplace may be subject to the most stringent obligations of the Digital Services Act (DSA). The Court ruled on 19 November 2025 that this is justified: consumer protection and systemic risk management outweigh the economic freedom of the platform in this case.

The legal context and facts

The Digital Services Act (DSA) aims to create a safer online environment. A crucial part of this legislation is the classification of “very large online platforms” (VLOPs): platforms with more than 45 million monthly active users in the EU. Because of their huge reach, they must comply with extra strict rules to reduce societal risks.

The European Commission designated the Amazon Store platform as a VLOP. Amazon disagreed and went to the General Court of the European Union (Case T-367/23). Amazon argued, among other things, that:

  • Online marketplaces do not pose “systemic risks” in the way social media platforms do (e.g., through disinformation).
  • The obligations disproportionately infringe on their freedom to conduct a business and right to property.
  • The obligation to maintain a public advertising register violates privacy and trade secrets.
  • They are treated unequally compared to smaller marketplaces or pure online stores.

The court's decision

The General Court of the European Union rejected all of Amazon's arguments and upheld the Commission's decision in its ruling of 19 November 2025.

The Court held that the legislature (the EU) has broad discretion. The key points of the decision are:

  1. Marketplaces do pose systemic risks: The Court emphasized that risks are not limited to disinformation. The distribution of illegal goods (such as unsafe products or counterfeits) through a platform with a reach of more than 45 million users does represent a systemic risk to public health and consumer safety.
  2. Interference with the freedom to conduct a business is justified: Although the DSA rules entail costs and technical adjustments, the public interest in consumer protection outweighs these burdens.
  3. Advertising transparency: The requirement to keep a public record of ads is necessary to monitor what is shown to consumers (and particularly minors). This outweighs Amazon's privacy concerns.
  4. Equal treatment: The distinction based on user numbers (the 45 million limit) is objective and not arbitrary. Large platforms simply have a greater impact on society.

Legal analysis and interpretation

This ruling is a textbook example of balancing fundamental rights in modern digital law. The Court had to balance the freedom to conduct business (Article 16 of the Charter) with the right to consumer protection (Article 38 of the Charter).

Legally, this ruling confirms that the Union legislature has broad discretion in complex, technical regulations such as the DSA. The Court merely tests whether the measure is “manifestly inappropriate”.

The analysis on profiling is interesting. Amazon opposed the requirement to offer, for each recommendation system (the algorithm that determines what you see), at least one option that is not based on profiling. Amazon argued that this makes its algorithms less effective and hurts sales. The General Court, however, put user autonomy at the center: consumers should have the choice not to be tracked. The Court sees this as strengthening the consumer's informed decision-making, which directly reduces the platform's control over the user's choices.

In addition, the Court rejects the contention that marketplaces are fundamentally different from social media in the context of the DSA. The term “systemic risk” in Article 34 DSA is interpreted broadly: it is not just about democratic debate, but equally about the physical safety of consumers buying products.

What this specifically means

This ruling has direct implications for various actors in the digital landscape:

  • For online platforms and marketplaces: The threshold of 45 million users is a hard limit. Once it is crossed, onerous compliance obligations kick in. You must invest in transparency (an advertising register), data access for researchers and recommendation systems that offer a non-profiling option. The argument “we are just a conduit” or “we are not social media” does not hold up at this size.
  • For consumers: You will have more control. On large platforms like Amazon, you must be able to have products recommended to you without this being based on your personal browsing habits (profiling). You will also get more insight into who is paying for the ads you see.
  • For advertisers and traders: Be aware that data about your ads on VLOPs (such as Amazon) will end up in a public record. This includes the content of the ad, the time period and the target audience, but not personal data or trade secrets about the success of the campaign.

Frequently asked questions (FAQ)

What is a ‘very large online platform’ (VLOP) under the DSA?
A VLOP is an online platform that has an average of more than 45 million monthly active users in the EU. Because of this large reach, they must comply with extra strict rules to manage societal risks.

Why does Amazon need to adjust its algorithms?
The DSA requires very large platforms to offer users a choice. There must be at least one option for the recommendation system (e.g., search results) that is not based on profiling (tracking your behavior), giving the user more autonomy.

Isn't disclosing ad data a violation of privacy?
The General Court ruled that it is not. The advertising register must provide transparency about who is advertising what, in order to protect consumers. The register may not contain users' personal data and does not reveal sensitive trade secrets such as exact sales figures.

Conclusion

With this ruling, the General Court of the European Union confirms that the Digital Services Act has teeth. Big players like Amazon cannot evade responsibility by pointing to their business model as a mere marketplace. The size of the platform automatically brings with it the responsibility to actively combat systemic risks, from illegal products to non-transparent algorithms. Consumer protection here outweighs maximum commercial freedom.


Joris Deene

Attorney-partner at Everest Attorneys

Contact

Questions? Need advice?
Contact Attorney Joris Deene.

Phone: 09/280.20.68
E-mail: joris.deene@everest-law.be