News

Higher Regional Court decisions restrict Facebook’s “virtual domiciliary right”

08.10.2018

When removing content, Facebook may not place its own community standards above the fundamental right of social network users to freedom of speech. This was decided by the Higher Regional Court of Munich by order of 24 August 2018 (18 W 1294/18). The Munich judges once again firmly rejected an unrestricted “virtual domiciliary right” of social networks, as they had already done in similar summary proceedings in July (order of 17 July 2018 - 18 W 858/18). Facebook may therefore not determine at its own discretion whether a post is to be deleted. Rather, the Higher Regional Court of Munich has now ruled “that a permissible expression of opinion may not be removed from the platform.”

Proceedings triggered by breach of community standards

The latest court case arose from a comment by a Facebook user who, in the course of a debate with another user, wrote: “Unfortunately, I can no longer compete with you argumentatively; you are unarmed and that would not be particularly fair of me”. Facebook deleted the comment on the grounds that it violated the network’s community standards as “hate speech”.

Higher Regional Court of Munich: Indirect third-party effect of fundamental rights must be taken into account

The user concerned successfully fought in court to have the content restored. The Higher Regional Court of Munich ruled that the removed content “evidently” did not constitute hate speech, since the user's post was not to be regarded as a direct attack on a person “on account of their race, ethnicity, national origin, religious affiliation, sexual orientation, gender identity or on account of disability or illness”. Rather, it was a statement covered by freedom of speech in the context of an individual personal dispute.

Moreover, the Munich judges held that the decision on the admissibility of a post was not at Facebook’s discretion. A corresponding clause in Facebook’s community standards, which suggested such a unilateral right, was therefore invalid. The clause reads: “We may remove any content you post on Facebook if we believe that it violates [...] our guidelines”. According to the court, this provision contradicts sec. 241(2) of the German Civil Code, which obliges both contracting parties to take account of the other party’s legal interests. Due to the indirect third-party effect of fundamental rights, the user’s freedom of speech had to be taken into account in this respect. In the specific case, it was of decisive importance that social networks serve the purpose of providing users with a “public marketplace for information and the exchange of opinions”. On the merits, the court’s reasoning appears to be inspired by the “public forum doctrine” developed in previous jurisprudence and legal scholarship (cf. the “Fraport” decision of the German Federal Constitutional Court of 22 February 2011 – 1 BvR 699/06).

Trend in case law?

With the two decisions of the Higher Regional Court of Munich, a trend seems to be emerging in case law to apply the fundamental right enshrined in Article 5(1) of the Basic Law of the Federal Republic of Germany when assessing the removal of Facebook content. The Regional Court of Frankfurt am Main had already ruled along the same lines in May (decision of 14 May 2018 - 2-03 O 182/18). By contrast, in similar proceedings in June (decision of 25 June 2018 - 15 W 86/18), the Higher Regional Court of Karlsruhe emphasised that fundamental rights are primarily “rights of the citizen to defend himself against state intervention”. Nevertheless, the Karlsruhe judges also assumed an indirect third-party effect of the freedom of speech; in the specific case, they simply did not object to Facebook’s assessment of the content as illegal or to the temporary blocking of the user.

Increasing regulation of social networks

The court decisions put Facebook in a difficult position, as the company is also coming under increasing pressure from the opposite direction. In Germany, the continuing criticism of the company’s lax handling of criminal content culminated in the German Network Enforcement Act (NetzDG), which obliges Facebook and other platforms to delete certain content within strict deadlines. Further attempts to regulate online platforms have recently been made, particularly at the European level: in March, the Commission initially issued a non-binding Recommendation on measures to effectively tackle illegal content online. The new Audiovisual Media Services Directive, which is expected to be adopted this autumn, will prohibit content that incites violence, hatred and terrorism. On 12 September 2018, the Commission also announced a proposal for a Regulation to remove terrorist content from the internet. Among other things, this will contain a “one-hour rule”, according to which terrorist content must be deleted within one hour.

Against the background of these rulings, it remains to be seen whether social networks such as Facebook will end up deleting too many or too few posts in the future.

These articles may also be of interest to you

EC Recommendation on measures to tackle illegal content online

ECJ: Facebook fanpage operators co-responsible for data processing
