The Digital Services Act: A Tight Timeline for Compliance?


The Digital Services Act ("DSA") was published in the Official Journal of the European Union on 27 October 2022. This means that the Act, targeting numerous online services, will enter into force only 20 days later, on 16 November 2022.

As part of the European Commission's digital strategy for the EU, the DSA aims to transform the internet into a safer space for users in Europe. Its focus is on protecting the fundamental rights of users and tackling illegal content and misinformation in the context of digital services. By creating extensive obligations for providers of digital services regarding, e.g. targeted advertisements, content moderation, transparency of terms and conditions and internal complaint-handling systems, the DSA is intended to make websites and online platforms more transparent.

Who is affected?

The DSA targets a wide range of online services, from websites to internet infrastructure services and online platforms.

Pursuant to Art 2, the DSA applies to all intermediary services (a category of information society services) offered to recipients that are established or located in the EU, irrespective of where the service provider is established. An intermediary service is (i) a mere conduit service, providing access to or transmitting information in a communication network, (ii) a caching service, transmitting information in a communication network and involving the temporary storage of that information, or (iii) a hosting service, consisting of the storage of information provided by a user (or "recipient") of the service.

This means that the DSA applies not only to internet access providers, but also to cloud and web hosting services, search engines, online platforms such as social media platforms, online marketplaces, app stores and collaborative economy platforms, among others. All of these services, whether established in the EU or not, will have to comply with the DSA's rules.

SMEs will benefit from numerous provisions adapted to their respective size and capabilities, without this reducing their accountability under the DSA. Conversely, specific provisions impose additional obligations on very large online platforms ("VLOPs") and very large online search engines ("VLOSEs"), i.e. platforms or search engines with an average of at least 45 million monthly active users in the EU.

What are the key points?

The DSA explicitly states that providers of intermediary services are under no general obligation to monitor the information they transmit or store, nor to actively "seek facts or circumstances indicating illegal activity". This means that the widely discussed obligation to implement upload filters did not make it into the final text of the DSA.

Other key obligations that apply to all intermediaries include:

  • Upon receipt of an order from the relevant national court or authority to act against illegal content, intermediaries must follow standardised rules and inform the issuing court or authority without undue delay of how the order was given effect.
  • Intermediaries must designate a single point of contact to enable direct communication with authorities, the Commission and other relevant bodies, as well as a single point of contact for users of the service.
  • Transparency:
    • Terms and conditions: Information must be provided in the terms and conditions about internal policies, procedures, measures and tools used for content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of the internal complaint handling system. This information must be in clear, plain, intelligible, user-friendly and unambiguous language, and must be publicly available in an easily accessible and machine-readable format.
    • Reporting obligation: Intermediaries must publish reports on content moderation at least once a year, containing information on the type of illegal content, the time needed to delete the content, content moderation undertaken at the provider's own initiative, training and assistance measures for persons in charge of content moderation, measures that affect the availability of information provided by users, the number of complaints received through the internal complaint-handling system, any automated means used for content moderation, etc. Additional semi-annual reporting obligations apply to providers of online platforms.

Additional obligations for online platforms

  • Notice-and-action mechanisms must be in place to allow users to notify platforms of illegal content. Online platforms must implement an internal complaint-handling system that enables users who have submitted a notification or are affected by a platform's decision (e.g. to remove content or to suspend or terminate the user's account) to lodge complaints against that decision electronically and free of charge. Platforms must suspend users who frequently provide manifestly illegal content. Users must also have the option to refer disputes to a certified out-of-court dispute settlement body.
  • Notices submitted by trusted flaggers (who must have particular expertise in identifying illegal content and must be independent and objective) must be prioritised and processed without undue delay.
  • Prohibition to use "dark patterns": The interface of an online platform must not be designed, operated or organised in a way that deceives or manipulates the users of the service or impairs their ability to make a free and informed choice.
  • Information on advertisements: Users must be able to identify, clearly, concisely and in real time, that the information presented is an advertisement, the identity of the advertiser who paid for it, and the main parameters used to determine the recipient of the advertisement, as well as how to change those parameters. In addition, presenting advertisements based on profiling using special categories of personal data is prohibited.
  • Obligation to set out in plain and intelligible language in the terms and conditions the main parameters used to determine the order of information presented to the user. The user must be provided with a function to modify the parameters.

For online platforms allowing consumers to conclude distance contracts with traders, the following additional obligations apply:

  • The trader must be identified by the platform by means of an identification document or other electronic identification.

Additional obligations for VLOPs and VLOSEs

  • Reporting obligations on the number of monthly active users (to be published by 17 February 2023 and at least once every six months thereafter).

  • Risk assessment for risks stemming from the design or functioning of the service and its related (algorithmic) systems and, based on its outcome, the implementation of reasonable, proportionate and effective mitigation measures.

  • The obligation to assess compliance with the DSA at least once a year through an independent audit at its own expense.

  • Additional advertising transparency obligations.

  • Granting the Digital Services Coordinator and the Commission access to the data necessary to monitor and assess compliance with the DSA, and the obligation to explain the design, logic, functioning and testing of their algorithmic systems.

  • The obligation to implement an independent compliance function with sufficient authority and resources to ensure compliance with the DSA.

  • Additional reporting obligations.

Pursuant to Art 49 DSA, Member States must designate the authorities responsible for supervising the intermediaries and enforcing the DSA, as well as a Digital Services Coordinator responsible for coordination at a national level. Notably, the Commission itself will be the administrative body responsible for supervising and enforcing the DSA vis-à-vis VLOPs and VLOSEs to prevent a "bottleneck" effect in DSA enforcement due to inadequately staffed or funded national authorities.

The penalties for infringing the DSA are quite harsh and can amount to up to 6 % of the annual worldwide turnover achieved by the intermediary in the preceding financial year. The maximum fine is thus even higher than under the GDPR (4 % of worldwide turnover).

Timeline and to-dos

Having been published in the Official Journal of the European Union, the DSA enters into force on 16 November 2022, but most of its provisions will only apply 15 months later, from 17 February 2024.

However, certain obligations for VLOPs and VLOSEs will apply as early as 16 November 2022. These include the reporting obligation on the number of monthly active users, the Commission's competence to designate online platforms or search engines as VLOPs or VLOSEs, and its competence to adopt delegated acts laying down rules for the performance of audits and establishing the technical conditions for data access.

Considering the extensive internal changes (organisational, technical and procedural) required for intermediaries to comply with the DSA and the severe fines for non-compliance, intermediaries should waste no time in taking the steps necessary to achieve DSA compliance.

Steps to follow

  1. Analyse and classify: What type of intermediary is my organisation/company? Which provisions are applicable?

  2. Determine the status quo: Where is the organisation/company already compliant with the DSA and where is it not? Identify deviations from the DSA's requirements.

  3. Lay the foundations: Start implementing the DSA by creating appropriate technical, personnel, structural and organisational measures.

  4. Monitor: The DSA provides for various possibilities for the Commission to adopt delegated acts and for Member States to adopt national implementing measures. Monitoring EU legislation and that of the relevant Member States will therefore be essential.

  5. Check: Conduct an internal or external audit for compliance before the applicable cut-off date.

By Florian Terharen, Associate, Schoenherr