New European Regulation of Digital Services – Part 1 – Digital Services Act


The European Commission has presented proposals for two regulations that aim to harmonize the rules of the digital space in the European Union. The first regulation, called the Digital Services Act, focuses on the regulation of digital services of all kinds, and primarily protects recipients of those services from illegal content, infringement of their consumer rights, and other risks.

At the EU level, the market for online intermediary services – ranging from communication applications, data storage services, and online stores to the biggest social networks – has to date been regulated primarily by directives. The key one is the e-Commerce Directive, adopted 20 years ago, which in the online environment seems like an eternity. During this time, not only has a range of new technologies and services appeared, but also difficulties connected with them, one of them being the enormous influence and economic power of the strongest players. Some countries have responded to these new challenges with their own national legislation, which contributes to regulatory fragmentation across the EU and weakens its internal market.

The European Commission ("Commission") has been preparing an ambitious solution to the present ills of the digital world. These efforts led to the presentation of a proposal of two new regulations at the end of last year – the Digital Services Act ("DSA") and the Digital Markets Act ("DMA"). The proposals not only follow a two-pillar regulatory structure as was planned before, but their form also confirms the Commission's efforts to re-unite European legislation in the areas concerned. Once in force, the regulations will be applicable in all the EU states. At present, the proposals are open for feedback from the general public.

This part of the article deals with the DSA proposal. The other regulation, the DMA, focuses on the behavior of platforms on digital markets and essentially concerns the regulation of competition; we will examine it in more detail in part two of this article.

DSA: an Overarching e-Commerce Regulation

One of the main concerns of the DSA is the fight against illegal content, goods and services, and the related protection of users. Apart from that, the DSA distributes obligations proportionally so that they have less impact on smaller services and startups, with the greater share of responsibility imposed on the largest platforms. In general, the DSA is intended to complement the existing regulation of online intermediary services, primarily the e-Commerce Directive, the DSM Directive, and the AVMS Directive.

The DSA generally concerns providers of intermediary services (Internet access providers, domain name registrars), a category that includes hosting service providers (cloud and webhosting services), which in turn includes online platforms (online marketplaces, app stores, social networks); among these, the largest platforms – those visited by at least forty-five million users in the EU monthly – have a special role. The obligations imposed on each broader category also apply to all the categories nested within it, as the sketch below illustrates.
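The layered structure of obligations can be pictured as a type hierarchy. The following TypeScript sketch is purely illustrative – the interface and member names are our assumptions, not terms taken from the DSA – but it shows how each narrower category inherits all the obligations of the broader ones:

```typescript
// Illustrative placeholder types – not part of the DSA's text.
type Notice = { contentUrl: string; explanation: string };
type Complaint = { decisionId: string; grounds: string };
type RiskReport = { identifiedRisks: string[] };

// Obligations applying to every intermediary service provider.
interface IntermediaryService {
  pointOfContact: string;
}

// Hosting providers additionally handle notices of illegal content.
interface HostingService extends IntermediaryService {
  handleNotice(notice: Notice): void;
}

// Online platforms additionally run an internal complaint system.
interface OnlinePlatform extends HostingService {
  handleComplaint(complaint: Complaint): void;
}

// Very large platforms (at least 45 million monthly EU users) must
// also assess and mitigate systemic risks and undergo audits.
interface VeryLargePlatform extends OnlinePlatform {
  assessSystemicRisks(): RiskReport;
}
```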

The DSA currently targets all providers of intermediary services offering their services in the EU, except for services provided to persons in the EU only incidentally and on a small scale. It aims to maintain a certain standard of consumer protection and at the same time not to place EU providers at a disadvantage in relation to third-country providers.

Liability of Intermediary Service Providers for Content – Today's Principles Will Remain Unchanged

From the perspective of providers' liability, the DSA will preserve the main pillar of the current regulation – the "notice and take down" system. In simple terms, this means that providers are essentially not held liable for content shared or stored by their users as long as they are not aware of its illegality, and provided that they remove or block it (take it down) once they detect it (usually as the result of a user's notice).

The proposal also retains the prohibition on imposing a general obligation of supervision or active detection of illegal content, so providers will not have to proactively introduce content filters and the like. However, in order not to dampen their motivation to implement such systems voluntarily (whether manual or, more commonly, automatic), the DSA stipulates that such voluntary activity as such does not exclude the applicability of the above exemptions from liability (in other words, introducing and operating an automatic filter for illegal content does not by itself mean that the provider is aware of the presence of such content).

In the specific case of online marketplaces and goods or services sold online, consumers are especially protected against providers whom they erroneously perceive to be sellers, even though technically this is not always the case. In particular, the DSA stipulates that the exemptions from liability will not apply (pursuant to consumer law) in relation to consumers if the services are presented in a way that makes consumers believe they are entering into a contract with the online platform itself or with an entity under its control.

New Tools for Countering Illegal Content

The basic principles of liability and the fight against illegal content, which were described above and remain unchanged, will be extended by a range of specifications, additions and new mechanisms.

In the first place, the DSA introduces a new definition of illegal content that should be removed from platforms according to its rules: namely, any information, goods or services that conflict with EU law or the law of any EU member state, "regardless of the exact subject-matter or nature of that law". This is a very broad concept that will apparently also cover dangerous products, counterfeits, services provided by unlicensed entities, and the like. However, the definition of illegal content in the DSA proposal also explicitly includes all information that is in any way related to or concerns illegal content. In practice, this may mean that platforms will have to examine the context of the information stored on them by their users (be they consumers or traders), as well as conduct additional inquiries in this respect. The question is to what extent this requirement is reasonable and whether it may be legitimately demanded of platforms.

In order to facilitate the reporting of illegal content, platforms will have to introduce easily accessible, user-friendly mechanisms for the electronic reporting of illegal content. The reporting system must allow the notifier to explain why they consider the content illegal, to insert a link to the content, and to provide information about themselves. The provider must confirm receipt of the notice and inform the notifier of the measures taken. If the content is removed or restricted, the provider then has to inform the user who uploaded the content and advise them of the available means of redress. The decision to remove or restrict content, including further information, must be published (anonymously) in a publicly accessible database administered by the Commission. A sketch of what such a notice might contain follows below.
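As a rough illustration, the elements of a notice and of the subsequent decision could be modeled as follows. This TypeScript sketch is an assumption on our part – the DSA prescribes the information to be provided, not any particular format or field names:

```typescript
// What a notifier submits – a hypothetical shape, for illustration only.
interface IllegalContentNotice {
  explanation: string;          // why the notifier considers the content illegal
  contentUrl: string;           // link to the exact location of the content
  notifierName?: string;        // information about the notifier
  notifierEmail?: string;
}

// What the provider communicates after acting on a notice.
interface ModerationDecision {
  noticeId: string;
  action: "removed" | "restricted" | "no-action";
  statementOfReasons: string;   // sent to the user who uploaded the content
  redressOptions: string[];     // e.g. internal complaint, out-of-court body, court
}
```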

These obligations are part of the DSA's attempt to protect the user whose content has been removed, or who has been temporarily blocked by a platform for repeated violations and abuses (such blocking after a prior warning is directly required of platforms by the DSA). In particular, platforms have to introduce an easily accessible and user-friendly system for the rapid handling of complaints against decisions to remove or restrict content, or to suspend or terminate the provision of services or a user's account. In other words, platforms hosting users' content will have to ensure that their decisions are reviewable, yet the review itself will be conducted by the very same platform – the DSA merely stipulates that this review must not be carried out solely by automated means. Users may also appeal against the decision of the platform to a judicial or out-of-court dispute settlement body.

As regards content, it should also be noted that the DSA introduces a number of transparency obligations (such as the duty to publish reports on the removal of illegal content). Platforms will also have to implement the mechanism of "trusted flaggers", whose notices must be handled as a priority.

To summarize, the DSA is intended to complement and specify the existing rules for platforms regarding (illegal) content, the main novelties being the extended definition of illegal content and the option for users to defend themselves against the removal of their content and other decisions of the platform. This is especially significant in view of the current trend towards automated content moderation, under which platforms can nowadays delete a user's account completely without giving the user any substantial means of defense.

To counter illegal content, new rules for displaying advertisements on platforms will also be introduced. The DSA will extend the existing rules with new obligations to clearly mark ads, to give information about the advertiser, and to explain why a particular ad is shown to a particular user (see the sketch below). We will see how this obligation is implemented in practice and hope that it does not take up further space on websites.
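The required information could, for instance, be represented as metadata attached to each displayed ad. Again, this TypeScript sketch is only our assumption about one possible shape – the DSA prescribes the content of the disclosure, not its form:

```typescript
// Hypothetical ad-transparency metadata; field names are illustrative.
interface AdTransparencyInfo {
  isAdvertisement: true;         // clear and unambiguous marking as an ad
  advertiser: string;            // the person on whose behalf the ad is displayed
  targetingExplanation: string;  // main parameters used to select the viewer
}

// Example of what an expanded "why am I seeing this ad?" panel might
// show (all values invented for illustration).
const exampleAd: AdTransparencyInfo = {
  isAdvertisement: true,
  advertiser: "Example Retail s.r.o.",
  targetingExplanation: "Shown because you recently viewed cycling equipment.",
};
```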

Special Regime for the Largest Platforms

The largest ("very large" in the wording of the DSA) platforms, such as transnational social or video-sharing networks, will have to comply with additional obligations and rules, which will apply if the number of their monthly users in the EU reaches approximately 45 million or more (a threshold corresponding to roughly 10% of the EU's population of about 447 million, at which share it should be kept).

Most importantly, the DSA requires that such platforms themselves identify all possible risks that may arise from their services and take measures to mitigate them, whether they stem from the possibility of illegal content being shared, from negative effects on fundamental rights, or from possible manipulation of their services leading to various negative effects on society. This analysis should, among others, involve representatives of service recipients or of groups potentially affected by the platform's services. The obligation to identify and mitigate risks is also linked to the obligation of large platforms to undergo audits and to publish related information.

Another relatively subtle but significant change concerns recommender systems, i.e. the algorithms used by large platforms to customize the content shown to recipients. The main parameters of such systems, as well as the options for adjusting them, must be specified in the terms of service. From the perspective of privacy protection and many other aspects, it is important that at least one of the offered options for content-sorting must not be based on profiling and that the recipient must always be able to easily change the method of content-sorting, as sketched below.
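A minimal sketch of what such a setup might look like – the option names, structure, and compliance check are our assumptions for illustration, not anything prescribed by the DSA:

```typescript
// Possible content-sorting options a platform might disclose and offer.
type RankingOption =
  | { kind: "personalized"; usesProfiling: true }   // e.g. engagement-based feed
  | { kind: "chronological"; usesProfiling: false } // newest first
  | { kind: "most-popular"; usesProfiling: false }; // aggregate, not per-user

interface RecommenderSettings {
  options: RankingOption[];      // main parameters disclosed in the terms of service
  active: RankingOption["kind"]; // the recipient can switch this at any time
}

// At least one offered option must not be based on profiling.
function isCompliant(settings: RecommenderSettings): boolean {
  return settings.options.some((o) => !o.usesProfiling);
}
```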

Competent Authorities and Sanctions

Responsibility for supervising compliance with the DSA in individual countries will rest with Digital Services Coordinators – bodies to be appointed by the member states. At the same time, the DSA will establish a European Board for Digital Services composed of representatives of the Digital Services Coordinators, which should function as an advisory body to the Coordinators and the Commission.

The competences of the Digital Services Coordinators will include:

  • requesting information on possible violations of the DSA from providers of intermediary services;

  • conducting on-site inspections of intermediary service providers in order to examine, and take copies of, information relating to suspected infringements;

  • requesting employees or representatives of intermediary service providers to explain information concerning suspected infringements;

  • ordering the cessation of infringements and, where appropriate, the implementation of corrective measures;

  • imposing fines and periodic penalty payments;

  • imposing provisional measures;

  • in extreme cases, requesting the management of an intermediary service provider to adopt an action plan for terminating the infringement, as well as applying to a judicial body for suspension of the provision of the service.

The maximum fine for a breach of obligations under the DSA should be determined by each member state and may amount to up to 6% of the annual income or turnover of the intermediary service provider (or up to 1% if false, incomplete or misleading information is provided and not corrected, or in case of a failure to submit to an on-site inspection). Periodic penalty payments, if imposed, should not exceed 5% of the average daily turnover of the intermediary service provider. The penalty ceiling is quite high and is clearly intended to deter intermediary service providers from infringing the regulation. How the penalties will be implemented in individual member states and enforced in practice is, of course, a question to be answered in the distant future. The short illustration below shows how these ceilings scale.
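For a sense of scale, here is a quick back-of-the-envelope illustration of the ceilings for a hypothetical provider; the turnover figure is invented, and the actual caps will be set by each member state within the DSA's limits:

```typescript
// Hypothetical annual turnover of a large provider: €10 billion.
const annualTurnoverEur = 10_000_000_000;

const maxFine = 0.06 * annualTurnoverEur;          // ceiling: €600 million
const maxInfoFine = 0.01 * annualTurnoverEur;      // ceiling: €100 million for information failures
const avgDailyTurnover = annualTurnoverEur / 365;  // ≈ €27.4 million per day
const maxDailyPenalty = 0.05 * avgDailyTurnover;   // ceiling: ≈ €1.37 million per day
```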

The Beginning of a Long Journey

The revision of digital market regulation is undoubtedly a timely move and, if successful, may benefit a great number of parties – from consumers and other recipients, through traders on platforms, to smaller and even the largest national providers of digital services of all kinds. Transnational giants, obviously, will not benefit much from the novelties, yet they have to be regulated for many reasons (user protection, protection of current and future competition, etc.).

We consider the DSA to be an interesting and promising proposal that still needs to be fine-tuned in many areas. One thing that certainly warrants further discussion is the definition of illegal content, which is very broad and could impose inappropriate requirements on digital service providers. In this respect, it is also interesting that the Commission has for now discarded the concept of harmful content, meaning content that is not illegal but is still problematic, such as disinformation. It will be interesting to see whether this topic reappears during further deliberations on the DSA.

It is important to realize that the DSA imposes a number of obligations on digital service providers that in some cases currently fall within their own discretion. This concerns particularly the sphere of illegal content: platforms are often criticized for removing content or blocking users, and there are voices objecting to their excessive influence. The DSA, on the one hand, rightly calls for greater transparency in this sphere, but at the same time consolidates these obligations and competences, and even introduces some new ones (namely the blocking of users). It also introduces new means of redress and defense against the decisions of platforms, yet in practice it will mainly be the platforms themselves that decide on these remedies (complaints). Although there are options for judicial defense against the decisions of platforms, their benefits will be rather limited considering the length of litigation and the abundance of cases.

By Michal Nulicek, Partner, and David Slama, Junior Lawyer, Rowan Legal