Digital Services Act (DSA)

On 17 February 2024, Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act) - the DSA Regulation for short - came fully into effect.

Since the end of August 2023, the Regulation had already applied to very large online platforms (VLOPs) and very large online search engines (VLOSEs), i.e. entities with more than 45 million monthly active users in the EU.

The main objective of the DSA Regulation is to modernise and improve the legal framework for digital services in the European Union and to better protect users in the online environment. The DSA gives recipients of the relevant services new rights that give them more control over the services they use, which helps to create a safe, predictable and trustworthy online environment. In addition, the DSA promotes transparency throughout the digital ecosystem and supports the development of SMEs.

The DSA Regulation applies to providers of online intermediary services, which include internet access providers, caching services, online marketplaces, social networks, cloud storage services, online platforms, web hosting services and online search engines. One of the main objectives of the DSA Regulation is to increase the competitiveness of European SMEs and make it easier for them to do business in the online environment. The DSA therefore imposes most of the obligations on large or very large global platforms, while small and micro enterprises are exempt from many of them.

The main obligations under the DSA are:

Contractual terms and conditions

Intermediary service providers must make their terms and conditions available, clearly stating any restrictions on the use of their service and providing information on content moderation, and must ensure that this information is clear, accessible and unambiguous for users. They must also inform users of any significant changes to the terms and conditions.

Online marketplaces

The new rules promote consumer safety on online marketplaces. Marketplaces must not allow trading by traders about whom they do not have sufficient information. Online marketplaces are also obliged to make all pre-contractual information on product safety available and must additionally inform consumers if they learn that goods or services purchased through the marketplace are illegal.

Rules for content moderation

Any restrictions applied by a provider in connection with the use of its service will have to be properly and clearly explained. Users of a hosting service must be given a justification for any restriction imposed because of a breach of the terms and conditions or the illegality of content, and must be properly advised of the remedies available to them.

Recommender systems

Online platforms must allow users, where possible, to choose their preferred parameters for the recommender system. Very large online platforms must offer users at least one option to view content that is not based on profiling.

Notification of illegal content

Users must be able to easily notify the hosting provider of illegal content appearing on its service, and the notifier must be informed of how the notification has been handled.

The DSA Regulation also stipulates that online platform providers must cooperate with so-called "trusted flaggers", certified entities tasked with flagging illegal content on online platforms. Platforms will have to deal with their notifications on a priority basis. However, these notifications do not enjoy any presumption of accuracy.

Design of online interfaces

The DSA prohibits so-called dark patterns, i.e. elements of online interfaces that mislead or manipulate service recipients or otherwise interfere with their ability to make free and informed decisions.

Protection of minors

Providers of online platforms are required by the DSA to take appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors using their services. This means that they must implement technical and organisational measures that minimise the risk of minors being exposed to inappropriate content or dangerous situations online. In addition, providers must not present advertisements based on profiling that uses minors' personal data.

Further obligations of providers

Providers will have to comply with transparency obligations, such as publishing reports on content moderation, the number of disputes with users or the number of active users. Hosting service providers will also have to report suspected criminal offences of which they become aware. In addition, online platforms will be required to cooperate with trusted flaggers, establish an internal complaint-handling system, participate in out-of-court dispute resolution initiated by users and upload all content moderation decisions to a publicly accessible Transparency Database. Violations of these obligations are punishable by fines of up to 6% of the provider's annual worldwide turnover.

Illegal content

The DSA itself does not define which content is illegal; that is determined by EU or national legislation. The illegality of content is assessed by the providers themselves once they become aware of it. Parties may refer any dispute about the nature of the content to a court or to a specialised out-of-court dispute resolution body.

Measures against illegal content may also be imposed by a public authority if it has the power to do so under a specific law. In terms of liability for illegal content, the DSA Regulation only modifies the existing rules laid down in the e-Commerce Directive. It therefore remains the case that a provider is not liable for illegal content of which it has no specific knowledge. The DSA now adds that once a user has notified the provider of illegal content, the provider can no longer claim that it did not know about it. If the provider fails to act even after being notified of the illegal content, it becomes jointly liable for that content and any affected recipient of the service can claim damages from the provider.

Systemic risks

In addition to illegal content, the DSA also distinguishes a category of so-called systemic risks, which are associated exclusively with the operation of very large platforms. These platforms must continuously analyse the systemic risks arising from the provision of their services. Systemic risks include, for example, adverse impacts on fundamental human rights, civic discourse, electoral processes or public security. The purpose of the rules is to ensure that platforms do not amplify negative societal phenomena, for example by increasing the reach of such posts, but instead mitigate them. It should also be noted that the term "disinformation" does not appear in the operative articles of the DSA. The adequacy of the platforms' measures will be assessed by the European Commission. Furthermore, the DSA gives the scientific community access to data from very large platforms in order to investigate systemic risks.

Digital services coordinator

Under the DSA Regulation, so-called "Digital Services Coordinators" are to be appointed in each EU Member State. Their task is to supervise compliance with the obligations under the DSA by online intermediary service providers established in that Member State. In the Czech Republic, the CTU will be the coordinator. The coordinator's role is also to coordinate the application of the DSA Regulation at national and European level, in cooperation between national authorities and the European Commission. The CTU's role, on the other hand, will not include assessing the legality of content or resolving disputes between providers and users. Nor does the Coordinator have the power to order the removal of illegal content or the blocking of websites.

The CTU will be able to start actively exercising its competences under the DSA only once the relevant adaptation law, which is currently still in the legislative process, has been approved and has entered into force.
