21 for 2021: Notice-and-Takedown in Copyright Intermediary Liability

25 June 2021

This post is part of a series of evidence summaries for the 21 for 2021 project, a CREATe project within the AHRC Creative Industries Policy and Evidence Centre (PEC). The 21 for 2021 project offers a synthesis of empirical evidence catalogued on the Copyright Evidence Portal, answering 21 topical copyright questions for the 21st century.  In this post, Kristofer Erickson (Associate Professor in Media and Communication, University of Leeds) and Martin Kretschmer (Professor of Intellectual Property Law and Director of CREATe) review the empirical evidence relating to the notice-and-takedown system.


Introduction

The emergence of Internet services in the mid-1990s prompted legal consideration of the extent to which online service providers should be held liable for potentially infringing activity by their users. As more and more citizens created, uploaded and shared content, it became clear to regulators that legal intervention was required to apportion the burden of responsibility for preventing illegal copying while encouraging service innovation and the circulation of legal content. Over two decades from 1998 to 2019, the dominant paradigm for online intermediary copyright liability was a system of notification of infringement by rightholders and removal by online service operators once knowledge of infringing material was obtained (so-called “notice-and-takedown”). By providing Internet services with a safe harbour from liability for copyright infringement by their users, the notice-and-takedown mechanism shares the costs of detection and enforcement between rightholders and service providers. Following the adoption of the EU Copyright in the Digital Single Market Directive (Art. 17) in 2019, and the proposal of an EU Digital Services Act in 2020, there has been a potential paradigm shift in intermediary liability, moving from an obligation to act upon knowledge obtained to an obligation to prevent unauthorised content from appearing in the first place. In the context of these policy developments, it is important to consider the empirical evidence regarding the operation of the notice-and-takedown system after its introduction with the US Digital Millennium Copyright Act (DMCA) in 1998 and (in a similar form) the EU E-Commerce Directive in 2000.

Debates and recent evolutions

The main legal and policy debates around notice-and-takedown procedures concern 1) the burden of responsibility for locating and removing infringing content and 2) the possible effects of over-enforcement on other rights, such as freedom of expression. The main stakeholders in these debates have been users, platform operators and content rightholders. Advocates for online users have been concerned by the relatively weak position of individual uploaders, the lack of effective counter-notification mechanisms, the potential chilling effects of copyright takedown, and the possibility that new forms of creativity will be suppressed, particularly by automated forms of takedown.

Rightholder groups have been concerned with the cost of identifying infringing content and notifying platform operators, as well as the possibility that such notifications are ineffective, or that they fail to result in “stay-down” of content. Rightholders have complained about the lack of licensing revenue (“value gap”), and the costs of locating and reporting instances of infringement.

Platform operators are similarly concerned about the cost of processing large volumes of takedown requests, the quality and accuracy of those requests, and the potential of over-enforcement to limit the legitimate activity of their users. Very large platform operators such as YouTube have developed automated screening and matching technologies to confront the large volume of potentially infringing uploads.

While the notice-and-takedown regime has its origins in US law, it has become a de facto global standard. According to Google’s Transparency Report (visited 9 June 2021), more than 5 billion requests to delist URLs on copyright grounds have been received since reporting started. They are processed under DMCA formalities, regardless of whether the country in which the request was filed prescribes these formalities or has any safe harbour laws at all.

For example, Google says:

“It is our policy to respond to clear and specific notices of alleged copyright infringement. The form of notice we specify in our web form is consistent with the Digital Millennium Copyright Act (DMCA) and provides a simple and efficient mechanism for copyright owners from countries around the world.”

“To initiate the process to remove content from Search results, a copyright owner who believes a URL points to infringing content sends us a take-down notice for that allegedly infringing material. When we receive a valid take-down notice, our teams carefully review it for completeness and check for other problems. If the notice is complete and we find no other issues, we remove the URL from Search results.”

When the safe harbour (no liability until formal notification) of section 512 DMCA was introduced, the aim was to support innovation by Internet services. The emergence of UGC business models, relying for example on social media photo-sharing, suggests that it was effective in that aim. A large body of empirical scholarship has examined the actual effects of notice-and-takedown mechanisms for platform operators, rightholders, and users.

Drawing on the research catalogued on the Copyright Evidence Wiki, this short blog addresses two questions:

  • What does the evidence say about the effects of the operation of a notice-and-takedown regime over two decades?
  • What trends (and gaps in our knowledge) are revealed by the way that empirical research has approached the question of online intermediary liability?

Existing evidence and research agendas

Review technique

There were 35 studies catalogued on the Copyright Evidence Wiki (as of 9 June 2021) referencing the term “Takedown”, of which 14 signal takedown in the title of the publication. Using an orthodox literature review strategy, we inductively grouped the empirical findings of these studies into five different policy issues. The remainder of the blog explains the empirical findings relating to each of these five issues which emerged from our review of the literature.

Volume of takedown notices

The growing volume of takedown notices has emerged as a research and policy issue because it relates to the costs that participants must bear. If platforms receive large numbers of takedown requests, they have to expend resources to investigate each instance of potential infringement. In their 2006 study of notice-and-takedown, Urban and Quilter found that Google had received only 734 notices between March 2002 (when the Chilling Effects database started collecting Google reports) and August 2005, the cut-off date of their study. Since 2009, there has been a massive increase in the volume of ‘robo-notices’, mainly sent by large rightholders or their agents, requiring online services to automate receipt and response in order to avoid incurring liability.

[Figure: volume of takedown notices over time. Source: Seng (2014)]

A general observation from this research is that the notice-and-takedown process can, at the very least, be considered successful in terms of uptake and use by rightholders and online service providers, with the safe harbour it provides to internet intermediaries viewed as important for commercial innovation. Platforms, for their part, have invested significantly in algorithmic and other detection mechanisms to lessen the burden of high volumes of takedown requests.

Accuracy of notices

Some early studies of the notice-and-takedown regime were concerned with the potential for errors in the process to lead to over-removal of legitimate expression. This may be exacerbated by the presence of third-party firms in the business of detecting and issuing takedown notices on behalf of rightholders. Indeed, this appears to have led to concentration: research has found that a small number of issuers are responsible for a disproportionate share of takedown requests. Bar-Ziv and Elkin-Koren (2018) found that 65% of their sample of notices were sent by a single entity. The dramatic increase in the volume of takedown requests initially amplified the problem of accuracy. However, it appears that accuracy has improved since automated systems were introduced, such as Google’s preferential ‘Trusted Copyright Removal Program’, which provides structured web forms.

So-called ‘robo-notices’, which may be generated in large numbers by third-party enforcement agencies not closely tied to rightholders, can introduce and amplify errors affecting significant quantities of works (see Karaganis and Urban, 2015).

Over-enforcement and abuse

Over-enforcement occurs when non-infringing material is removed, e.g. because content is erroneously identified (a false positive), or because the sender or receiver of a notice has not sufficiently considered copyright exceptions. Ahlert et al. (2004) created simulated web pages containing non-infringing content, and then sent notices to service providers asking for it to be removed. In nearly all cases, the material was removed. This suggests a potential for abuse by unscrupulous actors, who could use copyright claims to censor non-infringing content without scrutiny by online service operators.

Erickson & Kretschmer (2018) analysed the overall takedown rate of a sample of 1,839 music video parodies uploaded in 2012. As transformative works, many of these videos should have benefitted from specific exceptions to copyright. Over a four-year period, 40.8% of the videos were found to have been removed, with 32.9% of takedowns attributable to copyright notices. Jacques et al. (2018) attributed 32.1% of these removals to algorithmic takedown and 6.4% to manual takedown. This evidence points to worrying implications of automation for both sending and processing notices.

Due process and transparency

Due process refers to the procedural accuracy of notice-and-takedown measures implemented by rightholders and online service providers, and the ability of users to be informed about the status of their uploaded content. An experimental study by Fiala and Husovec (2018) examined the economic incentives driving over-removal of content and under-use of the counter-notification mechanism by users. In the baseline condition, providers tended to over-enforce and creators tended not to dispute decisions. An alternative dispute resolution (ADR) treatment resulted in fewer mistakes by providers and a more profitable outcome for creators overall.

Perel and Elkin-Koren (2017) advocate “black box tinkering” as a method to uncover the hidden functionality of algorithms and hold them accountable. Internet companies have not readily shared information with researchers about how they handle takedown requests, likely because they are wary of increased scrutiny from regulators, or because they view filtering technology as a source of competitive advantage.

Additional opportunities for new empirical research may arise from increased reporting requirements (e.g. Germany’s NetzDG 2017; implementation of Art. 17 of the EU CDSM Directive). Transparency in algorithmic governance has become a major concern of academic research across a range of fields, including intellectual property law. Researchers in neighbouring fields are developing methods to detect and characterise the potential for bias in algorithmic decision-making, across a range of platforms including creative content-sharing sites.

Balancing obligations and costs

A growing body of research has considered the economic welfare effects of copyright removal policies for society as a whole, as well as the specific costs borne by different stakeholders. Heald (2015) proposes that notice-and-takedown regimes, in tandem with automatic detection systems, may create an efficient market for previously unavailable works. Heald found high rates of availability of older in-copyright works (77% on YouTube) and of public domain songs from 1919–1926 (75% on YouTube). These rates are high compared with other media such as books, where only 27% of New York Times bestsellers from the same period remain available. This suggests that rightholders may choose not to enforce copyright after a work is uploaded, even if they would not invest in making the work available on their own.

Changes to intermediary liability may introduce questions of competition and market dominance. Urban, Karaganis and Schofield (2017) found that algorithmic “DMCA plus” handling techniques might be a source of competitive advantage for large incumbent platforms, limiting entry by smaller competitors.  If greater burdens are placed on emerging Internet platforms to detect, remove and keep down infringing content, this may favour larger incumbent businesses.

Future directions for research

Overall, the evidence base shows that rightholders have made effective use of the notice-and-takedown system to enforce their copyrights over the 20-year period since the regime was introduced, with use accelerating dramatically following the adoption of automated systems around 2012.

Limitations

Many studies have used data from the semi-public Chilling Effects / Lumen database. This is an important resource, but it covers only a limited number of large American Internet companies. Alternatively, researchers working on their own can attempt to reverse engineer (‘tinker’ with) takedown systems in order to deduce algorithmic priorities, but this is a slow and resource-intensive process. More transparent sources of data are needed. Reporting obligations need to be built into newly introduced regimes regulating platforms so that such data become available.

Future research

Past behaviour in response to the balancing of obligations and risks under the notice-and-takedown liability regime suggests what to look out for when researching the coming policy changes. Is there potential for over-enforcement? Are there counter-notice provisions, and are they used effectively by UGC creators? Is market entry by smaller firms affected? What is happening to the availability of materials that are no longer commercially available (digitally ‘out-of-print’)? Finally, how might changing the liability and incentive structure of notice-and-takedown alter the market for the provision and consumption of cultural content more widely?

Editorial note: This entry was drafted by Kristofer Erickson and Martin Kretschmer, drawing on their contribution to the Oxford Handbook of Online Intermediary Liability (ed. G. Frosio, 2020).