
Eva Glawischnig-Piesczek v. Facebook Ireland Limited, Case C‑18/18, CJEU (Third Chamber), 3 October 2019

Abstract

Role of hosting platforms in countering hate speech.

Normative references

Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the internal market (Directive on electronic commerce)

Ruling

The E-Commerce Directive 2000/31/EC does not preclude the national courts of EU Member States from ordering hosting platforms – such as Facebook – to take down illegal user-generated content, as well as any content which is “identical” or “equivalent” to content which has been declared illegal. The effect of such an injunction may also be extended worldwide, within the framework of the relevant international law. (In the instant case, Facebook Ireland was sued before the Austrian courts in order to obtain the removal of a comment which the referring court had found to be harmful to the applicant’s reputation because it was defamatory. An Austrian court issued an interim order requiring Facebook to desist from publishing text containing assertions worded identically to, or having a meaning equivalent to, that of the comment in question. The Austrian Supreme Court was called on to decide whether the order made against the host provider could also be extended to statements with “identical” wording or “equivalent” content of which the provider was not aware. It therefore referred the matter to the Court of Justice for a preliminary ruling on the interpretation of Article 15(1) of Directive 2000/31/EC.)

Notes

In authorizing the removal of “equivalent” content, the Court does not specify from which users such content must originate – whether from all users or only from the user who disseminated the initial defamatory content. The Court appears to legitimize the removal of equivalent content posted by any user only where the elements constituting an equivalent expression have been clearly specified, so as to prevent host providers from making autonomous assessments of the advisability of removing the repeated content. Moreover, the Court’s decision makes no reference to appeal procedures against the measures taken by the host provider.
With regard to the removal of “equivalent” content, it has been noted that entrusting digital platforms with deleting content through automated monitoring mechanisms, without precise instructions, poses a risk of systematic censorship. In his Opinion, the Advocate General paid greater attention to these aspects.