Resistance is Futile – How Private International Law Will Undermine National Attempts to Avoid ‘Upload Filters’ when Implementing the DSM Copyright Directive

Last week, the European Parliament adopted the highly controversial proposal for a new Copyright Directive (which is part of the EU Commission’s Digital Single Market Strategy). The proposal had been criticised by academics, NGOs, and stakeholders, culminating in an online petition with more than 5 million signatures (a world record just broken by last week’s Brexit petition) and public protests with more than 150,000 participants in more than 50 European (although mainly German) cities.

In the face of this opposition, one of the strongest proponents of the reform in the European Parliament, Germany’s CDU, has pledged to aim for a national implementation that would sidestep one of its most controversial elements: the requirement for online platforms to proactively filter uploads and block unlicensed content. The leader of Poland’s ruling party PiS appears to have recently made similar remarks.

But even if such national implementations were permissible under EU law, private international law seems to make it virtually impossible to achieve their purported aim of rendering upload filters ‘unnecessary’.

Background: Article 17 of the DSM Copyright Directive

Article 17 (formerly Article 13) can safely be qualified as one of the most significant elements of an otherwise rather underwhelming reform. It aims to address the platform economy’s so-called ‘value gap’, i.e. the observation that a few technology giants such as ‘GAFA’ (Google, Apple, Facebook, Amazon) keep the vast majority of the profits that are ultimately created by right holders. To this end, it carves out an exception from Art 14(1) of the e-Commerce Directive (Directive 2000/31/EC) and makes certain ‘online content-sharing service providers’ directly liable for copyright infringements committed by their users.

Under Art 17(4) of the Directive, platforms will, however, be able to escape this liability by showing that they have

(a) made best efforts to obtain an authorisation, and

(b) made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information; and in any event

(c) acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from, their websites the notified works or other subject matter, and made best efforts to prevent their future uploads in accordance with point (b).

This mechanism has been heavily criticised for de facto requiring platform hosts to proactively filter all uploads and automatically block unlicensed content. The ability of the necessary ‘upload filters’ to distinguish with sufficient certainty between unlawful uploads and permitted forms of use of protected content (eg for the purposes of criticism or parody) is very much open to debate – and so is their potential for abuse. In any case, it does not seem far-fetched to assume that platforms will err on the side of caution when filtering content this way, with potentially detrimental effects on freedom of expression.

In light of these risks, and of the resulting opposition from stakeholders, the German CDU has put forward ideas for a national implementation that aims to make upload filters ‘unnecessary’. In essence, the party proposes to require platform hosts to conclude mandatory licence agreements that cover unauthorised uploads (presumably through lump-sum payments to copyright collectives), thus replacing the requirement under Art 17(4) of the Directive to make ‘best efforts to ensure the unavailability’ of unlicensed content.

Leaving all practical problems of the proposal aside, it is far from clear whether such a transposition would be permissible under EU law. First, it is not easily reconcilable with the wording and purpose of Art 17. Second, it would introduce a new exception to the authors’ rights of communication and making available to the public under Art 3 of the Information Society Directive (Directive 2001/29/EC) without being mentioned in the exhaustive list of exceptions in Art 5(3) of that Directive.

Private International Law and the Territorial Scope of Copyright

But even if EU law did not prevent individual member states from transposing Art 17 of the Directive in a way that required platforms to conclude mandatory licence agreements instead of filtering content, private international law seems likely to severely reduce the practical effect of any such attempt.

According to Art 8(1) Rome II, the law applicable to copyright infringements is ‘the law of the country for which protection is claimed’ (colloquially known as the lex loci protectionis). This gives copyright holders the option to invoke any national law, provided that the alleged infringement falls under its (territorial and material) scope of application. With regard to copyright infringements on the internet, national courts (as well as the CJEU – see its decision in Case C-441/13 Hejduk on Art 5(3) Brussels I) tend to consider every country in which the content can be accessed as a separate place of infringement.

Accordingly, a right holder who seeks compensation for an unlicensed upload of their content to an online platform will typically be able to invoke the national laws of every member state – most of which are unlikely to opt for a transposition that does not require upload filters. Thus, even if the German implementation allowed the upload in question by virtue of a mandatory licence agreement, the platform would still be liable under other national implementations – unless it also complied with their respective filtering requirements.

Now, considering the case law of the Court of Justice regarding other instruments of IP law (see, eg, Case C-5/11 Donner; Case C-173/11 Football Dataco), there may be room for a substantive requirement of targeting that could potentially reduce the number of applicable laws. But for the type of online platform for which Art 17 is very clearly designed (most notably, YouTube), it will rarely be possible to show that only audiences in certain member states have been targeted by content that has not been geographically restricted.

So either way, if a platform actually wanted to avail itself of the possibility not to proactively filter all uploads and instead pay for mandatory licence agreements, its only option would be to geographically restrict the availability of all content for which it has not obtained a (non-mandatory) licence to users in countries that follow the German model. It is difficult to see how this would be possible… without filtering all uploaded content.