
CIPIL Annual Spring Conference 2018: Intermediary Liability and Responsibility

Saturday, 10 March 2018

Chair: Mr Justice Richard Arnold

The creation of open and responsible digital markets is a major policy priority across the globe, linking critically to both economic dynamism and the protection of core societal values in a challenging socio-technological environment. Amongst the most vexed and controversial aspects of this is the multi-faceted issue of ‘intermediary’ liability and responsibility. How active can an information society service be whilst still falling within an intermediary shield (safe harbour)? What potential liability and ongoing responsibilities (or duties of care) should such shielded intermediaries have for potential illegalities on their service? To what extent should the answer depend on the type of intermediary (e.g. host vs conduit) and/or on the type of potential illegality? These are among the questions confronting us in this space.

In 2016 the European Commission announced that it would seek to complement the approach taken by the e-Commerce Directive (2000/31/EC), developed in the early days of the web, with a new sectoral, problem-driven approach to regulation (page 9). This led to the adoption, as a central part of the EU’s Digital Single Market (DSM) strategy, of proposals to regulate certain online platforms in two key areas: copyright, on the one hand, and child protection and hate speech, on the other. The copyright proposal advocates the application of filtering/blocking mechanisms as routine and mandatory measures. The child protection and hate speech proposal places emphasis on platforms proactively adopting a range of measures to manage content, including through their terms and conditions, age verification and reporting/flagging systems. The legislative progress of both proposals is now well advanced. Meanwhile, the soon-to-be-in-force General Data Protection Regulation (2016/679) addresses the relationship between the intermediary shields and the responsibilities of both controllers and processors of personal data to safeguard personal information – an aspect of law that has received greater attention following the ground-breaking C-131/12 Google Spain judgment of the Court of Justice on the ‘right to be forgotten’. Finally, thinking in this area has been affected by the case law of the European Court of Human Rights, notably the Grand Chamber judgment in Delfi (2015), which specifically explored the responsibilities of online news platforms for defamatory and hate speech material that interfered with an individual’s right to a private life, as well as the follow-up judgments in MTE v Hungary (2016) and Pihl v Sweden (2017).

The CIPIL Annual Spring Conference 2018 provided a unique opportunity to explore where we are in this broad and important area, as well as where we might be going in the future.  The morning session took stock of current law and debate on intermediary liability and responsibility in each of the substantive areas falling under the DSM, whilst also raising critical overarching questions. The afternoon sessions focused on specific cross-cutting themes: (i) what should be the reach of notice-based remedies in this area and, in particular, when (if at all) should these extend to filtering/blocking; (ii) should some intermediaries have proactive obligations to respond to illegality; and (iii) what new thinking might be fruitful here, especially for the UK, given the likelihood of an imminent Brexit.

Programme

10:00-11:15 – Session One: Taking stock – Where are we now?

This introductory session looked at where we currently stand with regard to intermediary liability and responsibility in the core substantive areas of the DSM. Speakers looked variously at this issue vis-à-vis copyright and other forms of intellectual property, hate speech, child protection and personal data protection.

Speakers:

Martin Senftleben (Free University Amsterdam (VU)) (Audio / Slides)

Frederik Borgesius (Free University Brussels) (Audio / Slides)

Lorna Woods (University of Essex) (Audio / Slides)

11:40-12:30 – Session Two: Passive, active, publishers, intermediaries?

Internet platforms such as Google and Facebook tend to characterise themselves as covered by intermediary shields, including those set out in the EU’s e-Commerce Directive. On the other hand, the holders of copyright and related rights, as well as spokespersons for the traditional media, have increasingly argued that such platforms exercise sufficient control over content that they should acquire publisher responsibilities. Digital rights groups and internet scholars fall on both sides of the debate. How should this debate be resolved? Is one of these claims clearly false, or is there a need for a new synthesis of these different perspectives?

Speakers:

David Erdos (CIPIL, University of Cambridge) (Audio / Slides)

Martin Husovec (Tilburg University) (Audio / Slides)

13:30-14:45 – Session Three: Notice-based remedies for illegality

This session looked horizontally at the ex post notice-based remedies that are and/or should be available in the case of alleged illegality. Many critical questions arise here. Who should be qualified to give notice – a court, an administrative authority, a claimant or anyone, including a third party? To what extent should those subject to such notice be obliged to investigate the validity of the claims? Finally, what should be the reach of any response to a bona fide claim of illegality? Should this be restricted to the takedown of the specific content, or should it also extend to the deployment of content recognition technologies for the blocking of the relevant content?

Speakers:

Przemysław Polanski (Kozminski University) (Audio / Slides)

Jaani Riordan (8 New Square) (Audio / Slides)

Hugh Tomlinson (Matrix Chambers) (Audio)

15:10-16:00 – Session Four: Proactive obligations for intermediary platforms

Whilst most debate and discussion has focused on the responsibilities of intermediaries after notice, the European Court of Human Rights in Delfi suggested that some platforms might be expected to proactively monitor and remove content that poses a particular threat to the enjoyment of human rights even before receiving such notification. Much less controversially, the European Commission’s proposal on child protection and hate speech has suggested that audiovisual platforms should be expected to adopt more limited ex ante measures, such as the implementation of appropriate terms and conditions, flagging mechanisms and age verification systems. This session explored the concept of proactive obligations and sought to delineate what role they may legitimately play in this area.

Speakers:

Mark Bunting (Communications Chambers/Oxford Internet Institute) (Audio / Transcript)

Daithi Mac Sithigh (Queen’s University Belfast) (Audio / Slides)

16:15-17:30 – Session Five: Where are we going?

This final session took a more “blue-sky” approach to this topic, both as a whole and specifically in relation to the UK. If an intermediary liability system were to be devised now from scratch, what ought it to look like? Are there alternatives to the European focus on actor liability and responsibility for illegality which should be considered? Moreover, given the likely impending Brexit, should the UK seek a different model of regulation in this area, separate from that of the DSM?

Speakers:

Christina Angelopoulos (CIPIL, University of Cambridge) (Audio / Slides / Article)

Matthias Leistner (LMU Munich) (Audio / Slides)

Nicolo Zingales (University of Sussex) (Audio / Slides)

Photographs

CIPIL Spring Conference 2018