Copyright reform: platform liability and upload filters

25.03.2021

Now things are getting serious: after intensive discussions, the German Federal Cabinet (Bundeskabinett) passed a Draft Act on Adjusting Copyright Law to the Requirements of the Digital Single Market on 3 February 2021. Its purpose is to adapt German law to the requirements of Directive (EU) 2019/790 of 17 April 2019 on copyright and related rights in the Digital Single Market (the “DSM Directive”) (for background information, see “Key regulatory approaches in the Federal Justice Ministry's draft on copyright reform”). A core aspect of the draft is the new Copyright Services Provider Act (Urheberrechts-Diensteanbieter-Gesetz) (the “Draft Act”), which contains rules on platform liability and the use of upload filters and is intended to implement Article 17 DSM Directive.

Implementation of the DSM Directive

Harsh criticism already arose in the lead-up to the DSM Directive, which was adopted by the European legislator in April 2019. One need only recall the EU-wide demonstrations against upload filters, whose use was feared as a consequence of the provision on platform liability (Article 13 of the draft DSM Directive). Opponents of the draft expressed concerns that, in order to reduce their liability risk, platforms would, in case of doubt, use upload filters to block more content from being uploaded than was legally required (on “overblocking” in the context of the German Network Enforcement Act, please see “Online Hate Speech und Plattformregulierung: Update zum NetzDG” (only available in German)). Considering the widespread use of quotations, parodies and memes, the draft’s opponents also warned that algorithm-based filters are prone to error.
Motivated by these concerns, Poland brought an action for annulment of Article 17 DSM Directive before the ECJ in May 2019. Poland argued that the use of upload filters is not compatible with freedom of expression and information, and that Article 17 DSM Directive violates EU law. If the ECJ declares the provision null and void, this would have far-reaching consequences, not least for the Draft Act. The proceedings are currently still pending. In any case, reports of the November 2020 hearings indicate that the ECJ is taking the case very seriously.
Copyright responsibility for user-generated content

The Draft Act tightens the responsibility of platforms (e.g. YouTube) for the content that can be accessed on them. Platforms are exempt from liability only if, on the one hand, they have made their “best possible” efforts to acquire copyright licences (Section 4(1) Draft Act) and, on the other hand, they comply with the requirements for preventing unauthorized uses (Sections 7 to 11 Draft Act). This is to be done “in accordance with high standards customary in the industry” and in a proportionate manner (Section 1(2) Draft Act). Platforms can therefore no longer invoke the liability privilege for host providers (Section 1(3) Draft Act).

Pre-check procedure and upload filters

If the user wishes to upload content onto a platform covered by the Draft Act (service providers under Sections 2 and 3 Draft Act), the platform must first perform a pre-check:

    • In doing so, it checks whether a blocking request and the information required for this (e.g. as a reference file) are available for comparison (Section 7(1) Draft Act) or whether it has already acquired a licence.

    • If there are adequate grounds to block and no licence has been procured, the platform will further check whether the upload is initially permitted for other reasons (see below).

    These pre-checks will likely be automated, as the Draft Act is intended to cover platforms that provide a “large amount” of user-generated content (Section 2(1) no. 1 Draft Act). For example, YouTube’s users upload several hundred hours of video material per minute. When such automated processes, i.e. upload filters, are used, Sections 9 to 11 Draft Act additionally apply (Section 7(2) Draft Act).
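The pre-check steps described above can be sketched as a small decision function. This is a rough illustration only: the Draft Act defines legal duties, not an API, so the function, its parameters and the data structures below are all hypothetical.

```python
def pre_check(content_id, pre_flagged, blocking_requests, licences):
    """Illustrative sketch of the pre-check under Section 7 Draft Act.

    `blocking_requests` maps content identifiers (e.g. derived from a
    rightholder's reference file) to blocking requests; `licences` is the
    set of identifiers for which the platform has acquired a licence.
    All names here are invented for illustration.
    """
    # Step 1: no blocking request with reference material available,
    # so there is nothing to compare the upload against.
    if content_id not in blocking_requests:
        return "publish"
    # Step 2: a licence has already been acquired for this work.
    if content_id in licences:
        return "publish"
    # Step 3: grounds to block exist, but the upload may still be
    # "presumably permitted" (Section 9 Draft Act), e.g. because the user
    # has pre-flagged it as legally permitted (Section 11 Draft Act).
    if pre_flagged:
        return "publish"  # stays online until a complaint is decided
    return "block"
```

A pre-flagged upload thus stays online even where a blocking request matches; rightholders must then use the complaint procedure described below.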

Presumably permitted uses

In order to prevent overblocking by upload filters, a “presumably permitted” upload shall initially remain on the platform (Section 9(1) Draft Act). This applies in particular to quotations, caricatures, parodies and so-called pastiches as well as other legally permitted uses under the German Copyright Act (Section 5 Draft Act), the existence of which is rebuttably presumed under three conditions (Section 9(2) Draft Act). Firstly, the content must contain less than half of a work copyrighted by another party; secondly, it must combine this part with other content; and thirdly, it must either constitute a marginal use or the uploader must mark the content as permitted:

    • A marginal use exists if the copyrighted work constitutes less than half of the content and it is a film or audio track of less than 15 seconds in length, a text with up to 160 characters or an image of up to 125 kilobytes in size (Section 10 Draft Act). In addition, the use must not serve any commercial purposes. Authors consider this an impermissible restriction on their intellectual property, pointing out that short uses are becoming increasingly important (e.g. on TikTok) and themselves interfere with the commercial exploitation of rights.

    • If a use cannot be considered marginal, the user can mark the upload as “legally permitted” (“pre-flagging”, Section 11(1) Draft Act) – but only if the content would otherwise be blocked after the pre-check. If content that was not blocked in the pre-check and has already been uploaded is to be blocked subsequently, the uploader can likewise mark it as “legally permitted” within 48 hours of notification (“post-flagging”, Section 11(2) Draft Act).
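The numeric thresholds for a marginal use can be illustrated with a short check. Again, this is a hedged sketch, not anything the statute prescribes: the parameter names are invented, and the separate Section 9 condition (less than half of the work, combined with other content) is assumed to be checked elsewhere.

```python
def is_marginal_use(kind, length_seconds=None, characters=None,
                    kilobytes=None, commercial=False):
    """Illustrative sketch of the "marginal use" thresholds in
    Section 10 Draft Act. All names are invented for illustration."""
    # Marginal uses must not serve any commercial purposes.
    if commercial:
        return False
    # Film or audio: less than 15 seconds in length.
    if kind in ("film", "audio"):
        return length_seconds is not None and length_seconds < 15
    # Text: up to 160 characters.
    if kind == "text":
        return characters is not None and characters <= 160
    # Image: up to 125 kilobytes in size.
    if kind == "image":
        return kilobytes is not None and kilobytes <= 125
    return False
```

On these thresholds, a 10-second non-commercial audio snippet would qualify, while the same snippet used commercially, or a 15-second clip, would not.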

Internal complaint procedure

Following this procedure, users and right holders have the right to file a complaint against the upload or the refusal to upload (Section 14 et seq. Draft Act). The platform must review the complaint and decide on the content within one week.

Particularly privileged in this regard are “trusted right holders”: they can force the immediate blocking of content against which a complaint has been lodged (“red button”). To do so, they must declare that the upload is not presumably permitted and that its continued public display would considerably impair the economic exploitation of the work (Section 14(4) Draft Act). Who counts as “trustworthy” is decided by the platforms themselves.

Outlook

The first reading of the Draft Act in the German Federal Parliament (Bundestag) is scheduled for the end of March. However, the requirements of the DSM Directive and the now increasing time pressure make it unlikely that substantial changes will be made to the draft; after all, the Draft Act is to enter into force before the summer break. Only if the ECJ were to declare Article 17 DSM Directive null and void might the provisions of the Draft Act also become invalid.

However, at present it is clear that users of large platforms will in all likelihood have to adjust to a close-knit structure of automated checks and subsequent complaint procedures in the future. The courts will have to clarify many details in case law.
