The ‘Digital Services Act’ package

22.12.2020

Following the launch of a public consultation by the European Commission in June 2020 on its long-awaited Digital Services Act, with comments accepted until 8 September 2020, two proposals for regulations were published on 15 December 2020: the Digital Services Act (‘DSA’) and the Digital Markets Act (‘DMA’), intended to regulate digital services and digital markets respectively. As regulations, both texts would be directly applicable in the Member States; they would not need to be transposed into national law.

Goals

The regulatory package is intended to redefine the key rules of the digital economy and modernise the existing legal framework for digital services, the E-Commerce Directive. This also involves a partial amendment of the General Data Protection Regulation (GDPR). The cornerstones of the E-Commerce Directive no longer consistently and adequately reflect the technical, social, and economic reality of today’s services throughout their entire life cycle, from creation to advertising and liability. Nor can it be ruled out that providers of innovative digital business models are deterred by the multitude of differing national rules; this points to a strategic weakness of the EU digital economy, one that not only strengthens non-European offerings but also increases the dependence of EU citizens on them. A harmonised regulation is therefore to replace the patchwork of Member State rules. Voices in the European Parliament have described this as work on ‘putting the constitution of the internet on a modern basis’.

When reading the two proposals, it quickly becomes clear that they are not primarily aimed at medium-sized European companies but are intended to legislatively rein in the large (US) tech companies (especially ‘very large online platforms’) and limit their market power. Specifically, this means platforms with at least 45 million monthly active users in the EU (Article 25(1) DSA, Article 3(2) point (b) DMA). In this context, the liability and responsibilities of digital services are to be regulated more clearly and, above all, more uniformly across the EU (for example, with regard to the reporting and deletion of illegal content) in order to counter the specific risks posed by such platforms and better ensure respect for the fundamental rights of users. The legal obligations are designed to establish a modern system of cooperation in the supervision of large platforms and to guarantee effective enforcement. In addition, so-called gatekeeper platforms will be regulated more closely to address market imbalances and create a level playing field in European digital markets. The DMA in particular could thus significantly influence not only the competitive conditions of digital markets in the EU but also the business models of major players worldwide.

With regard to antitrust regulation, the legislative package now includes measures which the European Commission had already published for consultation in June. Under the heading ‘new competition tool’, the Commission had presented market investigation instruments intended to address certain structural competition problems which, in its view, could not be addressed, or at least not effectively, with the existing antitrust rules. (See also: European Commission mulls new market investigation tool) The intervention options published for consultation (e.g. unbundling) were very far-reaching because they did not require a breach of antitrust law and would apply even below the threshold of market dominance. Mention should also be made here of the German legislator’s plan to tighten national abuse control over digital companies with market power by way of the so-called GWB Digitalisation Amendment to the German Act Against Restraints of Competition (ARC) (see also: Altmaier welcomes new European rules on Big Tech). It remains to be seen to what extent the DMA, which aims at harmonisation within the Union, will influence these initiatives in Germany.

Content (overview)

The DSA is designed to ‘rein in’ the large tech providers and make the Internet ‘safer’ (‘a safer and more transparent online environment for consumers’). In particular, it addresses depictions of child sexual abuse, copyrighted material and the non-consensual sharing of private images, ‘online stalking’, terrorist content (for which a separate ‘Terrorist Propaganda Regulation’ will impose a one-hour deletion obligation), discriminatory content in general, and punishable hate comments. The term ‘illegal hate speech’ is not legally defined; interestingly, it appears only in the recitals, not in the enacting terms of the proposal. In Germany, the application and interpretation of the term would therefore be tied to the offences of the German Criminal Code. The proposals only partially address the problems of the German Network Enforcement Act (NetzDG), such as the partial outsourcing of judicial tasks to private parties and the justified fears of ‘overblocking’: at least, a right of objection against the deletion of content or the blocking of accounts is provided for; furthermore, anyone who ‘regularly’ reports content as illegal even though it is not is to be warned and then (temporarily) excluded from the reporting system. Whether this will actually prevent ‘overblocking’ effectively remains to be seen in practice.

With regard to platform liability, the proposal builds on the principle of the E-Commerce Directive, according to which platforms generally have no obligation of their own to monitor for illegal content and are liable only once they have been informed of it (‘notice and takedown’ or ‘notice and action mechanism’, Article 14 DSA). To avoid liability in an individual case, the platform must therefore show that it had no ‘actual knowledge’ of illegal content on its pages or that, upon obtaining such knowledge, it acted ‘without undue delay’ to remove the content or block access to it.

Another central aspect is advertising transparency (Article 24 DSA): for each advertisement displayed, the provider must clearly disclose (i) that it is an advertisement, (ii) who has placed it, and (iii) which parameters were decisive in this particular advertisement being shown to the user. The controversial practice of so-called microtargeting must also be disclosed (a communication or advertising strategy in which particular content is specifically addressed to particular groups of people, often in a political context).
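Purely by way of illustration, and not as anything the proposal itself prescribes, the three disclosure duties can be pictured as a small metadata record attached to each advertisement displayed; the following TypeScript sketch uses hypothetical field names of our own:

// Hypothetical sketch only: the DSA proposal prescribes no technical format,
// and all names below are illustrative assumptions, not taken from the text.
interface AdTransparencyDisclosure {
  isAdvertisement: true;         // (i) the display is marked as an advertisement
  placedBy: string;              // (ii) the person on whose behalf it is displayed
  targetingParameters: string[]; // (iii) the main parameters used to select the recipient
}

// Example of what such a disclosure might contain in practice:
const example: AdTransparencyDisclosure = {
  isAdvertisement: true,
  placedBy: "Example Advertiser GmbH",
  targetingParameters: ["age group 25-34", "interest: running", "location: Germany"],
};

How providers would actually surface such information in their user interfaces is left open by the proposal.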

One new development is the obligatory annual assessment of ‘significant systemic risks’ stemming from the use of a service or platform (see Article 26 onwards DSA). For example, platforms should describe whether their moderation and recommender systems and their handling of advertisements pose a systemic risk, what consequences the dissemination of illegal content has for freedom of information and expression and for people’s private lives, or to what extent the use of fake accounts has negative consequences for health, for minors, for civil discourse or for electoral processes. The platforms must then set out the mitigation measures taken, such as stopping advertising payments for the content concerned or giving expanded visibility to reliable sources of information.

The DMA addresses providers of social networks, search engines, cloud services, video platforms, operating systems, and advertising networks that already have, or will soon reach, a certain size and market power (so-called gatekeepers). In the interests of greater choice and freer competition, abuses of market power are now to be preventable in advance (ex-ante); so far, the Commission’s competition authorities have only been able to intervene ex-post. Particularly relevant is likely to be the rule that providers who wish to combine user data from several of their own services or from third-party providers require the user’s consent to do so.

This is reminiscent of the widely publicised Facebook decision of the German Federal Cartel Office: at the behest of the antitrust authority, Facebook had to stop combining user data, and the decision was provisionally confirmed by the German Federal Court of Justice. The background is that the social network also processed user data collected during internet use outside the Facebook platform. In the Federal Cartel Office’s view, Facebook exploited its dominant position on the German market for social networks by giving the network’s private users the choice of either accepting Facebook’s terms of service, and thus agreeing to the use of their data collected outside the platform, or not being able to use Facebook at all. (See also: German Federal Court of Justice sets Facebook limits on data use).

Another cornerstone of the DMA: Users must be allowed to uninstall pre-installed software applications (see Article 6(1) points (b) and (c) DMA): ‘To enable end user choice, gatekeepers should not prevent end users from un-installing any pre-installed software applications on its core platform service and thereby favour their own software applications’; this would also mean that alternative app stores must be permitted in each case.

Enforcement and penalties

Specific supervision with a wide range of powers is planned; new supervisory authorities in the Member States are just as conceivable as integration into existing regulatory structures. For the regulatory area of the DSA, it is envisaged that the Member States will appoint ‘Digital Services Coordinators’ (Article 38 onwards DSA), who are to monitor enforcement of the regulation and, in a joint committee, ensure that it is applied uniformly across the EU. One objection to such an enforcement structure, already familiar from the GDPR (European Data Protection Board (EDPB), Article 68 GDPR), is that the decentralised principle has not always worked well so far in the area of data protection. The Commission is also to play a stronger role in supervision; for example, it can itself impose penalties for violations. Whether these Commission competences will survive the further procedure remains to be seen; in the proposal for the GDPR, the Commission had likewise granted itself far-reaching competences, which ultimately could not be maintained in the trilogue. With regard to supervisory structures, it will be important that they grow in line with the scope of the new competences, including financially and in terms of personnel, and that there is no uncoordinated coexistence with existing supervisory authorities.

In terms of penalties, the two proposals for regulations even eclipse the GDPR, which in its Article 83 provides for fines of up to 4% of the total worldwide group turnover achieved in the previous year: In the DSA it is up to 6%, in the DMA even up to 10%. The DMA also provides for the ultima ratio of excluding companies from the European market or forcing groups to split up, i.e. to separate business areas, for example in the case of ‘repeat offenders’. Critics fear that such a massive penalty framework could also harm the development of the digital market in the EU.

Outlook

The ‘Digital Services Act’ package is a kind of ‘European Network Enforcement Act’, extended by transparency obligations, stricter requirements for advertising, a uniform approach to illegal content, and other cornerstones. However, contrary to the demands of some consumer associations and politicians, it does not include a general ban on personalised advertising or tracking, nor does it provide for strict interoperability obligations. In terms of its impact and scope, however, the package is comparable to the General Data Protection Regulation (GDPR) or the recently reformed Copyright Directive. It will be interesting to see how the supervisory authorities position themselves as the process moves forward. Uniform enforcement seems questionable, as enforcement practice to date (especially with regard to data protection) has been very inconsistent (for example, in a direct comparison between Ireland and France or Germany). It also remains to be seen whether the various platforms will be lumped together by the supervisory authorities in the ‘fight against illegal content’. Incidentally, what is illegal will continue to be determined by each Member State itself. For the strict DMA, the following caveat applies: in competition law, new prohibitions or control instruments may only intervene where there is market failure.

It also remains to be seen how the package will harmonise with the GDPR, for example with the principle of data minimisation (Article 5(1) point (c) GDPR): very large online platforms, for instance, are to be (or could be) obliged to grant comprehensive access to stored data, provided no business secrets are affected. In addition, archives are to be set up so that disinformation and illegal advertising content can be detected.

It is still too early to draw a final conclusion, and the wording of both proposals leaves considerable room for interpretation. Where do we go from here? The proposals will now make their way through the European Parliament and the Council of the European Union. As the process continues, it can also be expected that the internet giants will seek to influence the legislative process as far as they are able. In any case, the opportunities for innovative European offerings and start-ups, as well as for the protection of users and consumers, are obvious.
