1. Introduction and Background
The digital world has developed rapidly since the introduction of the e-Commerce Directive in 2000. Meanwhile, online platforms, social media and digital marketplaces have become an integral part of EU citizens' daily lives. Platforms like LinkedIn, WhatsApp and Instagram and online stores like Bol.com have become intertwined with our personal and professional lives. These online service providers greatly influence what we see, read and think. At the same time, this digitalization brings new risks, such as the spread of illegal content, scams and disinformation.
The e-Commerce Directive has long been the basis for the regulation of online services within the EU. This directive contained provisions on the liability of intermediaries and was transposed by EU member states into national legislation (in Belgium, in the Code of Economic Law (WER)). However, this led to fragmentation, with different member states drawing up their own rules for dealing with illegal online content.
In response to these challenges, the European Union adopted the Digital Services Act (DSA), a regulation on digital services, in 2022. The DSA stems from the need for:
- More clarity on rules for online intermediaries
- Better protection of users from illegal content and activities
- Regulatory harmonization within the EU to avoid fragmentation
- A balance between countering illegal content and safeguarding fundamental rights
Unlike the e-Commerce Directive, the DSA is a regulation directly applicable in all EU member states, without transposition into national (such as Belgian) law. This promotes consistent application and reduces fragmentation of rules within the internal market.
The DSA introduces a set of rules that providers of online services must abide by, and exempts them from liability for user content provided they comply with those rules. The DSA went into effect on February 17, 2024.
Below we provide a comprehensive overview of the DSA, its scope, the obligations for different providers and the monitoring and enforcement mechanism to ensure effective compliance with the regulation, including in Belgium.
Table of contents
- 1. Introduction and Background
- 2. Scope
- 3. Liability regime
- 4. Obligations for all brokering service providers
- 5. Obligations for hosting service providers
- 6. Obligations for online platform providers
- 7. Obligations for online marketplaces
- 8. Exceptions for micro and small businesses
- 9. Supervision and enforcement
- 10. How can our law firm assist you?
2. Scope
The DSA applies to brokering services offered to users (natural or legal persons) in the EU, regardless of where the provider is based. Thus, providers based outside the EU may also be covered by the regulation.
The DSA distinguishes between four categories of brokering service providers, with an asymmetric set of obligations: the larger the service and the higher the risk, the more obligations.
2.1. Mere conduit services (purely pass-through services)
Services that merely transmit information over a communications network or provide access to such a network.
Examples: Internet service providers (Proximus, Telenet), DNS services, Wi-Fi networks, Internet exchange points.
2.2 Caching services
Services in which information is temporarily stored to facilitate later transmission.
Examples: web browsers that temporarily store copies of web pages, DNS servers (which temporarily store domain lookup results), caching plugins in a CMS such as WordPress.
Such services are rarely offered independently; they are usually combined with mere conduit or hosting services.
2.3 Hosting services
Services in which information is stored at the user's request.
Examples: cloud services (Microsoft OneDrive, Dropbox), web hosting companies (Combell, Easyhost).
Within this category are online platforms:
- Social media networks (Facebook, Instagram)
- Marketplaces (Bol.com, 2dehands.be)
- App stores
- Platforms for the sharing economy
2.4 Very large online platforms (VLOPs) and very large online search engines (VLOSEs)
Platforms (subcategory of hosting services) and search engines with more than 45 million monthly active users in the EU.
Examples: Alphabet (Google), Meta (Facebook, Instagram), Amazon, Zalando.
The strictest rules apply to this category because of their systemic impact on the economy and society.
They are designated as such by the European Commission.
3. Liability regime
The premise of the DSA is that brokering service providers are not liable for illegal content that users distribute through their services, provided they meet certain conditions. After all, the European legislator wants to avoid hampering the growth of online services.
3.1. Conditions by service type
For mere conduit services:
- The provider does not initiate the transmission itself
- The provider does not select the recipient of the transmission
- The provider does not select or modify the transmitted information
For caching services:
- The provider does not modify the information
- The provider complies with the conditions for access to the information
- The provider follows industry rules on updating the information
- The provider does not interfere with the lawful use of technology to obtain data on the use of the information
- The provider acts promptly to remove or disable access to stored information once it learns that the information has been removed at the original source
For hosting services:
- The provider has no actual knowledge of the illegal activity or content
- The provider acts promptly to remove or disable access to illegal content once it becomes aware of it
3.2. Limitations of the liability exemption
The exemption from liability does not apply in the following cases:
- When the user is acting under the authority or supervision of the provider
- For online marketplaces: when the information is presented in such a way that consumers think they are buying directly from the platform rather than from a third party. This is not the case when it is clearly stated that the online sale is made by an independent seller.
- When the provider has actual knowledge of illegal content and does not act promptly
Importantly, the DSA does not impose a general obligation on brokering service providers to monitor the information they transmit or store nor to actively investigate the facts or circumstances indicating illegal activity.
4. Obligations for all brokering service providers
All brokering service providers, regardless of size or type, must comply with the following basic obligations:
4.1. Points of contact and representation
- Designate a contact point for communication with authorities (such as the European Commission, the EBDS, and (for Belgium) the BIPT)
- Provide a point of contact for service users
- For providers outside the EU: appoint a legal representative in one of the member states where services are offered
4.2. Transparency and communication
- Any restrictions on the content users may post must be clearly stated in the provider's terms and conditions
- Provide information on policies, content moderation, algorithmic decision making, procedures, etc.
- Ensure that this information is understandable to minors if they are the target audience
4.3. Annual reporting
- Publish an annual report covering, among other things, the content moderation applied over the past year (not required for micro and small businesses)
- This report should provide insight into measures taken to counter illegal content
- The report from hosting service providers and online platforms should be more comprehensive and submitted semi-annually
4.4. Responding to orders from authorities
- Promptly respond to orders from judicial or administrative authorities to act against illegal content
- Inform the relevant authority of the measures taken
- Comply with orders to provide information about specific users
- Inform users when information about them has been shared
5. Obligations for hosting service providers
In addition to the general obligations, hosting service providers must meet additional requirements:
5.1. Notice-and-action mechanism
- Implement a mechanism for users to report illegal content to the provider
- Ensure that these reports can be easily submitted and processed
5.2. Information provision in the event of limitation
- When removing content or restricting a user, the provider must provide clear and specific justification to the user
- Ensure that this justification meets the substantive requirements of the DSA
5.3. Reporting of offenses
- Inform the authorities as soon as they become aware of information indicating a serious criminal offense threatening the life or safety of persons
- Share all relevant information with appropriate authorities
6. Obligations for online platform providers
Online platforms are a subcategory of hosting services and include online marketplaces, app stores, social media platforms, sharing-economy platforms, etc. They must meet additional obligations. Micro and small businesses are subject to fewer obligations.
6.1. Internal complaint handling and out-of-court dispute resolution
- Implement an internal complaint-handling system for users affected by restrictions on their use of the service. The provider must respond to complaints "promptly".
- Allow users to submit disputes to an out-of-court dispute resolution body certified by the national regulator (BIPT in Belgium)
6.2. Trusted flaggers
- Treat reports from 'trusted flaggers' with priority
- Trusted flaggers are organizations that have demonstrated that they address illegal content in an objective manner
- Inform the DSC when trusted flaggers repeatedly submit inaccurate reports
6.3. Measures against illegal content
- Temporarily suspend, after a prior warning, users who repeatedly post illegal content (e.g. terrorism, hate speech, fraud)
- Limit users who regularly file unfounded complaints through the notice & action system
6.4. Transparency about advertising
- Providers showing advertising must make it clear that it is advertising
- Inform on whose behalf the advertising is shown and who paid for it
- Provide insight into key parameters for showing ads
- Advertising based on profiling using special categories of personal data is no longer allowed
6.5. Recommendation systems
- Explain the main parameters of recommendation systems in the general conditions
- Such systems display certain content based on "relevance"
- Provide users with options to influence or adjust these parameters (e.g., LinkedIn)
6.6. Protection of minors
- Providers must take appropriate and proportionate measures to ensure the safety of minors
- Providers may no longer show advertising to minors based on profiling using their personal data
- This impacts providers such as TikTok
6.7. Neutral user interface
- Do not design the interface in a way that misleads or manipulates users
- Ensure that users can make free and informed decisions
7. Obligations for online marketplaces
Online platforms that facilitate sales between traders and consumers must comply with additional obligations (these do not apply to micro and small businesses unless they qualify as a VLOP or VLOSE):
7.1. Verification of traders
- Collect basic information about traders operating on the platform (e.g., Vinted)
- Verify this information before allowing traders to offer products or services
- Suspend access if traders fail to provide required information within 12 months
7.2. Compliance by design
- Design the platform so that traders can easily meet their legal obligations
- Enable traders to provide pre-contractual, product safety and conformity information
7.3. Illegal products and services
- Inform consumers when the provider learns that they have purchased an illegal product or service
- If individual buyers cannot be identified: post a general notice on the platform
8. Exceptions for micro and small businesses
To reduce the burden on smaller businesses, there are exceptions for micro and small businesses:
- Microenterprises: fewer than 10 employees and an annual turnover of up to 2 million euros
- Small businesses: fewer than 50 employees and an annual turnover of up to 10 million euros
These companies are exempt from:
- Most obligations for online platforms, except reporting on the number of users upon request
- The specific obligations for online marketplaces (except if it is a VLOP or VLOSE)
- The annual reporting on content moderation
9. Supervision and enforcement
The DSA introduces a layered system of supervision and enforcement, with different responsible bodies at the European and national levels.
9.1. European Commission
The European Commission has exclusive powers to supervise very large online platforms (VLOPs) and very large online search engines (VLOSEs). The Commission has the following powers:
- Conducting its own investigations into VLOPs and VLOSEs
- Requesting information from these platforms
- Hearing persons and taking statements (with their consent)
- Conducting inspections
- Taking provisional measures
- Making commitments offered by platforms binding
- Monitoring databases and algorithms
- Making decisions on non-compliance and imposing fines (up to 6% of global annual turnover)
- Imposing periodic penalty payments (up to 5% of average daily turnover)
In extreme cases, the Commission may initiate proceedings to prohibit or restrict access to a service, but only for serious violations that constitute a criminal offense or threaten the life or safety of individuals.
9.2. Digital Services Coordinator (DSC)
Each Member State must designate a Digital Services Coordinator (DSC) to implement and enforce the DSA within that Member State. The DSC has the following duties and powers:
- Coordination at the national level and cooperation with other DSCs, the Commission and the European Board for Digital Services (EBDS)
- Investigative powers, including requesting information, conducting inspections and seeking explanations from employees
- Enforcement powers, such as accepting commitments, issuing orders to stop violations, imposing fines and periodic penalty payments
- Emergency powers, such as requiring management action or requesting access restrictions through judicial authorities
The Member State competent for supervision is the one where the provider has its principal place of business. For providers outside the EU, it is the Member State where the legal representative is established.
The list of all DSCs will be published by the European Commission.
9.3. European Board for Digital Services (EBDS)
The EBDS is an independent advisory group consisting of the DSCs of all member states. The tasks of this council are:
- Advising DSCs and the Commission on the consistent application of the DSA
- Contributing to effective cooperation among national regulators
- Providing guidance on the interpretation of emerging issues related to the DSA
- Supporting the coordination of joint investigations
9.4. Out-of-court dispute resolution
The DSA provides for certification of out-of-court dispute resolution bodies by the DSCs. These bodies offer users of online platforms an opportunity to resolve content moderation disputes out of court. They must:
- Be independent of platforms and users
- Have the necessary expertise in content moderation
- Decide on submitted complaints within 90 days
- Report annually to the DSC
9.5. Trusted flaggers
DSCs can designate organizations as trusted flaggers if they have demonstrated expertise in identifying illegal content. Reports from trusted flaggers should be treated with priority by online platforms. These may include:
- Government organizations such as Europol
- Non-governmental organizations
- Semi-governmental organizations such as hotlines for online child abuse
The list of trusted flaggers is published by the European Commission.
9.6. User rights
Users harmed by a breach of the DSA have the right to hold the provider liable and seek damages under applicable law. This provides opportunities for possible class actions.
9.7 National authorities in Belgium
In the Belgian constitutional system, the Communities are responsible for cultural matters, including audiovisual media services. This includes video platforms falling under the Audiovisual Media Services Directive. In contrast, the federal government, through its residual powers, remains responsible for all other types of brokering services, such as hosting services, search engines and online marketplaces.
The Belgian Institute for Postal Services and Telecommunications (BIPT) has been designated both as the federal regulator and as the Digital Services Coordinator (DSC) for Belgium. In the latter capacity, BIPT represents Belgium in the EBDS.
For video platforms, the communities' respective media regulators have jurisdiction:
- The Flemish Media Regulator (VRM) for the Flemish Community
- The Conseil Supérieur de l'Audiovisuel (CSA) for the French Community
- The Medienrat for the German-speaking Community
To make this complex structure workable, a cooperation agreement was concluded on May 3, 2024 between the federal government and the three Communities. This agreement:
- Confirms BIPT as Belgian coordinator for digital services
- Regulates information sharing between different authorities
- Organizes the Belgian position statement in European forums
- Creates a mechanism for centralized complaint handling
The cooperation agreement also provides for a system of concerted representation in the EBDS. In the absence of consensus among the competent authorities, Belgium's position defaults to an abstention.
The various Belgian authorities have different sanction options:
- Issue warnings
- Temporarily suspend broadcasts, programs or videos
- Impose fines of up to 6% of global annual turnover
- Impose periodic penalty payments of up to 5% of average daily global turnover
10. How can our law firm assist you?
As a specialized law firm in digital law, we are excellently positioned to guide you through the implementation of and compliance with the Digital Services Act. Our expertise covers all aspects of this complex regulation, both at the European and Belgian levels.
We can support you with:
- A thorough analysis of the applicability of the DSA to your services
- Drafting or updating general terms and conditions, privacy policies and content moderation procedures
- Setting up mandatory reporting systems for illegal content
- Advice on meeting transparency obligations around advertising and recommendation systems
- Guidance on interactions with regulators, including BIPT, CSA, VRM or Medienrat
- Defense in investigations or enforcement proceedings
- Training your employees on obligations under the DSA
