
Centre for Intellectual Property and Information Law

 
Thursday, 29 October 2020

A new study on the EU ban on general monitoring obligations and its implications for the upcoming Digital Services Act has been published by CIPIL’s Dr Christina Angelopoulos and Professor Dr Martin Senftleben (Institute for Information Law (IViR), University of Amsterdam).

In the context of the current debate on the potential further harmonisation of the responsibilities of online platforms and information service providers in the Digital Services Act, the study seeks to clarify the correct interpretation of the scope of the prohibition of a general monitoring obligation in the E-Commerce Directive and the Directive on Copyright in the Digital Single Market. On that basis, it identifies guidelines for the potential inclusion and further development of the general monitoring ban in the Digital Services Act.

Funding for this project was secured from Copyright for Creativity (C4C). The authors have carried out the study in complete academic independence.

Executive Summary

Since the turn of the century, the liability of online intermediaries in the EU has been governed by a regulatory system which, while not harmonising intermediary liability per se, contains a set of horizontal liability exemptions for intermediaries: the ‘safe harbours’ or ‘immunities’ that shield them from liability arising in all areas of law. The introduction of the safe harbour system rested on the idea that holding platforms liable for the illegal activity of their users would be too heavy a burden. Without the safe harbours, the liability risk would thwart the evolution of intermediaries dealing with third-party content and frustrate the development of e-commerce. In the same vein, the safe harbours have been supplemented by a prohibition of general monitoring obligations. EU law provides explicitly that intermediaries may not be obliged to monitor their services in a general manner in order to detect and prevent the illegal activity of their users.

The jurisprudence of the Court of Justice of the European Union (‘CJEU’) has shed light on several aspects of this general monitoring ban, including its anchorage in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business (Articles 8, 11 and 16 of the Charter) and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive (‘ECD’) and Article 17(8) of the Directive on Copyright in the Digital Single Market (‘CDSMD’). Accordingly, Article 15(1) ECD and Article 17(8) CDSMD can be regarded as exponents of the aforementioned fundamental rights and freedoms and the accompanying principle of proportionality. With regard to the Digital Services Act (‘DSA’), this finding implies that:

  • any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. To the extent that filtering would fail to provide adequate protection to the rights guaranteed under Articles 8, 11 and 16 of the Charter, it cannot be imposed on intermediaries;
  • even if new legislation were to set forth obligations to monitor platform content generally, any filtering it permitted would nevertheless be precluded by the need to strike a ‘fair balance’ between all protected fundamental rights;
  • even if the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality.

As to the substance of the general monitoring ban, the analysis shows that a misunderstanding of the difference between monitoring specific content and monitoring for specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. The wording of the relevant provisions leaves room for this requirement of double specificity. Article 17(4)(b) CDSMD, for instance, obliges rightholders to furnish ‘the relevant and necessary information’ for ensuring the unavailability of notified works. Similarly, Article 17(4)(c) CDSMD requires a ‘sufficiently substantiated notice’ of an existing infringement. In the light of the need to safeguard fundamental rights and freedoms, a rightholder notifying only specific works, but failing to notify specific infringers, cannot be considered to have provided all ‘relevant and necessary information’ or a ‘sufficiently substantiated notice’. As a result, the notification will be incomplete and incapable of imposing a valid filtering obligation on OCSSPs. It remains an open question whether and how the requisite degree of specificity with regard to both dimensions can be attained when automated content filtering systems are employed. Regardless, other options also exist, such as infringer suspension, community moderation and warning notices. Until workable solutions have been found, a cautious approach is necessary to avoid violations of fundamental rights and freedoms, and the principle of proportionality. In particular, it is essential to prevent a snowball effect of notifications culminating in a filtering duty which de facto encompasses all kinds of protected subject matter, such as all currently exploited copyright works.

Finally, the analysis demonstrates that, even if the prohibition of general monitoring obligations were expressed in a uniform manner in the ECD, the CDSMD and the DSA, its concrete meaning and impact would still depend on the nature and scope of the legal position in respect of which a rightholder requests the imposition of duties of care, including the introduction of content moderation duties. An examination of general monitoring rules with regard to copyright, trade mark and defamation cases reveals substantial differences:

  • the preventive content moderation duties arising from Article 17(4)(b) and (c) CDSMD must be reconciled with the general monitoring ban laid down in Article 17(8) CDSMD. While the initial version of Article 17 (Article 13 in earlier drafts of the Directive) and, consequently, the debate surrounding it focused on automated content recognition and filtering technologies, the subsequent evolution of the Directive indicates that the legislator recognised the legal problems inherent in their deployment. In the light of alternative ways of abiding by the requirements of Article 17(4)(b) and (c), such as the aforementioned options of infringer suspension, community moderation and warning notices, it should be accepted that those provisions do not require automated filtering. It should likewise be accepted that rightholder notifications must be specific not only in respect of works and other protected subject matter, but also in respect of the circle of potential infringers belonging to the audience of the content platform at issue. To the extent that filtering is employed as a way of abiding by the requirements of Article 17(4)(b) and (c), it should therefore be limited to the uploads of such limited circles. In any case, a degree of specificity is required that prevents rightholders from notifying long lists of protected works without tailoring the notification to a specific sub-group of the platform audience. Otherwise, individual notifications may reach such a volume that, once the notified works are added up, a filtering duty arises which de facto amounts to a prohibited general monitoring obligation, even under laxer interpretations that seek to offer more room for filtering measures (the aforementioned risk of a snowball effect);
  • in a trade mark context, preventive content moderation duties based on notifications of specific protected subject matter, as envisaged in Article 17(4)(b) CDSMD with regard to copyright works, are most probably incompatible from the outset with freedom of competition and the principle of the free movement of goods and services in the internal market. In L’Oréal v eBay, the CJEU clarified that measures imposed on an online marketplace could not have as their object or effect a general and permanent prohibition on the selling, on that marketplace, of goods bearing a specific trade mark.[1] Otherwise, the filtering obligation would impede legitimate trading activities, such as advertising and offers for sale relating to parallel imports from other EU Member States after the exhaustion of trade mark rights. Hence, any automated content moderation system based on the notification of a specific trade mark runs the risk of encroaching upon freedom of competition and the guarantee of the free movement of goods and services in the internal market;
  • finally, defamation cases are more context-specific than copyright or trade mark cases. They differ substantially from copyright and trade mark scenarios because of the absence of holders of large right portfolios who could trigger a snowball effect by notifying long lists of protected subject matter. Filtering requests in copyright cases concern content that is fixed upon publication; in trade mark cases, the distinctive elements of the protected sign are fixed upon registration. In defamation cases, by contrast, the legitimacy of a filtering request depends on the specific – defamatory – nature of the uploaded content. Considering these substantial differences, case law in the area of defamation, such as the CJEU decision in Eva Glawischnig-Piesczek,[2] fails to provide guidance for the assessment of content moderation measures relating to trade marks or literary and artistic works.

If the EU legislator intends to include a horizontal regulation of content moderation duties in the DSA, it is thus important to take the differences between the scope of rights and the characteristics of infringement into account. Only measures that lead to appropriate results across all problem scenarios addressed in the DSA can serve as a basis for a global, horizontal rule.

 

[1] CJEU, C-324/09, L’Oréal v eBay International, ECLI:EU:C:2011:474, 12 July 2011, para. 140.

[2] CJEU, C-18/18, Eva Glawischnig-Piesczek v Facebook Ireland, ECLI:EU:C:2019:821, 3 October 2019, para. 53.
