The Digital Services Act ("DSA") represents a true (r)evolution in the online regulation of intermediary services within the European Union. The importance of the DSA can hardly be overstated, as every hosting, caching or mere conduit service in the EU will be severely impacted by the new framework (see examples of intermediary services in the table below, ranging from companies providing WiFi hotspot services to guests to Very Large Online Platforms). Online advertisers will be indirectly affected too, as they will be caught by the transparency obligations imposed on online platforms.
In order to counter the fragmented legislation on intermediary services across the EU Member States, which adversely impacted national and cross-border trade in the EU, the DSA sets out a unified set of rules regulating the due diligence obligations, liability, accountability and supervision of online service providers. At the same time, by breaking down these legislative barriers, the DSA fosters commercial innovation and growth across the EU.
Most of the obligations under the DSA will become applicable on 17 February 2024, except for, among others, the obligation for online platforms and search engines to publish their average monthly active users, which applies from 17 February 2023 (see below). Notwithstanding this, companies should start preparing today and adopt a multidisciplinary approach (coordinating with legal, IT, operations, etc.).
The due diligence obligations imposed on intermediary services are layered and based on the type, size and nature of the online service. The due diligence obligations for intermediary services include, inter alia: (*) (**)
1. All intermediary services, i.e., mere conduit, caching, and hosting services (the latter include online platforms) - e.g.: mere conduit services: Wi-Fi hotspots, VPNs (connection to a company network), VoIP services; caching services: content delivery networks used by internet access providers and local networks (e.g., companies, universities); hosting services/online platforms: see below
- (Publicly) designate a single point of contact to enable communication with Member States and users of the service
- Information on restrictions in relation to the (business) use of the service (in the terms and conditions), including policies and tools for content moderation (incl. algorithmic decision-making). These restrictions may go beyond the prohibitions under applicable law, e.g., restrictions on nudity, disinformation, or extreme political views.
- Information on changes to the terms and conditions.
- Annual reports on the moderation of illegal content, including orders from Member States, complaints, and own-initiative measures. This will give Member States, consumers, and other business users significantly more visibility on, and ability to tackle, illegal content (e.g., counterfeit goods, copyright infringement).
2. Hosting services (including online platforms) - e.g., cloud computing services (IaaS, PaaS), web hosting services
- Notice and action tools for users to flag illegal content; such a notice can trigger the liability of the hosting service if it fails to act.
- Statement of reasons for imposing restrictions on (business) users for illegal content/products, restrictions on payments, or termination of the account. These reasons can be challenged by the (business) user.
- Notification to law enforcement authorities of suspicions of criminal offences involving a threat to the life or safety of a person.
3. Online platforms - e.g.: SaaS, social networks, app stores, travel, and accommodation websites
- By 17 February 2023 (and subsequently every six months), publish, in a publicly available section, information on the average monthly active users over the past six months (incl. mere visitors).
- Set up an effective internal complaint-handling system. Facilitate out-of-court dispute settlement for disputes related to the internal complaint-handling system.
- Suspension, after a warning, of access to (business) users frequently providing illegal content or (business) users frequently filing unfounded complaints.
- Additional reporting duties on disputes and suspensions.
- Prohibition on designing, organizing or operating websites and apps in a way that deceives or manipulates users (so-called dark patterns).
- Transparency on advertisements, i.e., identification of advertisements as such, of the entities responsible for/paying for them, and of the parameters used to target subjects. Prohibition on targeted advertising directed at minors or based on special categories of personal data (e.g., religion, sexual orientation). This will have an indirect impact on advertisers using the online platforms.
4. Online platforms allowing consumers to conclude online contracts - e.g., online marketplaces such as bol.com, Coolblue, eBay
- Identification, assessment and keeping a register of companies providing products or services via the online platform (traceability), i.e., know your business customer.
- Enable companies providing products or services via the online platform to comply with their pre-contractual information, compliance and product safety obligations, and assess such compliance, i.e., compliance by design.
- Inform consumers who purchased an illegal product or service.
5. Very large online platforms / online search engines (45 million or more average monthly active users in the EU) - e.g.: Facebook, Amazon, Google, Microsoft
- Provide a concise, easily accessible, and machine-readable summary of the terms and conditions.
- Terms and conditions in the official languages of all Member States in which the service is offered.
- Appoint a compliance officer and perform an annual independent external compliance audit.
- Make publicly available an advertisement register with information on content, entities responsible, parameters on the targeted subjects. This obligation will indirectly affect advertisers using very large online platforms / online search engines.
- Additional obligations to manage systemic risks regarding the:
- dissemination of illegal content;
- impact of the service on the exercise of fundamental rights;
- foreseeable negative effects on democratic processes, civic discourse, and electoral processes, as well as public security;
- foreseeable negative effects on the protection of public health and minors, serious negative consequences to a person's physical and mental well-being, or gender-based violence.
(*) Some of the due diligence obligations mentioned are not applicable to micro or small enterprises, unless they qualify as very large online platforms.
(**) The due diligence obligations for very large online platforms or search engines will apply four months after they have been notified of their designation as such, or on 17 February 2024, whichever is earlier.
FINES: Failure to comply with the due diligence obligations can lead to fines imposed by the Member States of up to 6% of the annual worldwide turnover in the preceding financial year.
The DSA furthermore contains liability exemptions for intermediary services insofar as certain conditions are met, e.g., promptly removing content once its illegality is clear. There is also no obligation for intermediary services to actively monitor the information they store or transmit, even though many online platforms perform (algorithmic) monitoring.
Finally, the DSA complements the Digital Markets Act, which regulates so-called Core Platform Services of designated Gatekeepers (e.g., online marketplaces, sharing platforms and cloud computing services of Facebook, Apple, Google, Amazon and Microsoft) and which will have a significant impact on the level playing field for (business) users of the Core Platform Services. Please find a blogpost on the Digital Markets Act here.
In summary, and as of today, companies that qualify as intermediary service providers need to prepare for the upcoming due diligence obligations, including updating their systems, policies and information processes, in order to be ready for the deadline of 17 February 2024. Additionally, qualifying companies should already have taken the necessary steps to publish on their website their average monthly active users, an obligation which entered into force on 17 February 2023.
In order to prepare your company for the DSA, you will need to think about:
- The qualification of your company as an intermediary service provider under the DSA and the verification of the accompanying obligations;
- The review of the compliance of your website, app, platform, etc.;
- Annual reporting obligations on content moderation (triggered by Member State orders, complaints or own initiatives) and on disputes/suspensions;
- Management of redress possibilities, i.e., setting up and assisting with the internal complaint-handling system, out-of-court settlement or in-court litigation;
- Drafting (internal) risk management policies for compliance and liability;
- Actions against illegal content/products or services (e.g. counterfeit goods) or against service providers of illegal content/products or services (e.g. warning, suspension, termination);
- Redress against incorrect removal of content/products/services or suspension/termination of a (business) account;
- Creating tailor-made general terms and conditions regarding the use of the service, including (algorithmic) content moderation enabling action against unwanted content which is not necessarily illegal (e.g., nudity, disinformation, extreme political views);
- The definition of illegal content/products/services, which remains a national competence but is very relevant in cross-border transactions;
- Systemic risk management and mitigation for very large online platforms and search engines (e.g., against fake news: responses to COVID-19 content or the Ukraine crisis).
Author: Pieter-Jan Aerts, Counsel