Artificial (Un)Intelligence: Automated Tools and Intermediary Liability

By Shivani Kabra

The exponential growth of user generated content on online platforms has produced a proliferation of digital piracy and unlawful content. These instances raise the question of attaching secondary liability to the platforms hosting such materials. Predominantly, online platforms act as intermediaries facilitating user interaction and user generated content. Recent legislative developments in India concerning intermediary liability require intermediaries to take proactive steps towards regulating the digital space. Intermediaries are now obligated to employ automated technology-based tools to enforce third party rights by detecting illegal/unlawful content. As a result, the tools facilitate the creation of a parallel legal system that imposes specific socio-fiscal costs on society at large. A deeper examination of these costs is required to determine the efficacy of the legislative developments enabling their enactment.

I. Introduction

Technological developments that impact societal trends necessitate corresponding changes in law.[1] The advent and growth of social media platforms such as TikTok, YouTube, Facebook, and Instagram have dramatically increased user interactions on the internet.[2] The increase in user interactions has simultaneously created a wealth of user generated content (“UGC”) freely available for public consumption.[3] Perhaps the most accurate reflection of today’s digital culture is TikTok’s user base.[4] Reportedly, TikTok was downloaded more than 750 million times in 2019,[5] and was the most downloaded application of 2020[6] – encompassing millions of users creating publicly available UGC.

The primary concern with UGC on social media platforms is that of attributing liability for potential instances of infringement and illegality.[7] Often, UGC is created from, based upon, or influenced by pre-existing content that may already be protected under intellectual property laws.[8] To illustrate this point, one only needs to assess the genesis of the disputes concerning the Harlem Shake memes.[9] In 2012, Harry Rodriguez (DJ Baauer) had originally composed a track labelled ‘Harlem Shake.’[10] Thereafter, multiple third-party remixes of the Harlem Shake song were uploaded on YouTube in 2013.[11] Despite the 2012 Harlem Shake song borrowing elements from pre-existing works (specifically the 1980s dance style known as the “Harlem Shake”), the subsequent remixed content was disputed as infringing and accordingly monetized by the rights-holder.[12] This situation, however, becomes murkier when the question of liability arises for YouTube.[13] Since the infringing remixed UGC was uploaded on the platform, should the burden of liability for claims of digital piracy be shared indirectly with the platform?

This paper aims to contribute to this debate by assessing the extant frameworks concerning intermediary liability within Indian and comparative jurisprudence. In doing so, the paper attempts to analyse the efficacy of applicable laws while specifically focusing on the merits of using technological tools to combat digital piracy. At the outset, it is necessary to understand the nature of the two broad categories of liability – primary and secondary liability.[14] Primary liability entails that individuals be responsible and liable for their own actions.[15] In contrast, secondary liability refers to instances of persons being responsible and liable for another’s actions.[16] Contextually, YouTube’s liability for infringing Harlem Shake remixes falls within the ambit of secondary liability, such that YouTube is liable for the actions (as well as the UGC) of its users.[17] Accordingly, the scope of this paper is limited to understanding intermediaries’ liability for copyright infringement on account of UGC uploaded on their platforms.

II. Intermediary Liability for Copyright Infringement in India

An understanding of intermediary liability requires one to comprehend the scope and contours of the term “intermediary.”[18] Section 2(w) of the Information Technology Act, 2000 (“IT Act”) defines an intermediary as “any person who on behalf of another person, stores or transmits electronic records or provides services with respect to that record.”[19] Considering the said definition, it is reasonable to assume, prima facie, that TikTok, YouTube, Facebook, and other such social media platforms involved in the uploading and sharing of UGC will be considered intermediaries under Indian law.[20]

For a better understanding of the definitional ambit under Section 2(w), reliance is placed on two cases.[21] In Myspace Inc v. Super Cassettes Industries[22] (“Myspace Inc Case”), Myspace was considered to be an intermediary under Section 2(w) because it was a neutral platform that did not add, modify, or contribute any information of its own to the UGC on its platform.[23] Likewise, in another case, the platform Unacademy was excluded by the courts from the definitional ambit of ‘intermediary’ on the grounds that – (i) UGC was created using Unacademy’s software, (ii) UGC was published only after Unacademy’s approval, and (iii) Unacademy had the authority to reject, edit, modify, or change UGC.[24] Accordingly, the extensive influence and control of Unacademy over UGC made it a non-neutral platform.[25] Therefore, the determinative factor for identifying a platform as an intermediary under Indian law is the extent of editorial control exercised by the platform over its UGC.[26]

Further, the standard and extent of secondary liability for intermediaries is encompassed within Section 79 of the IT Act.[27] Section 79 provides a safe harbour clause for intermediaries thereby exempting them from (secondary) liability arising out of third party information or actions (namely, liability arising from infringing UGC), subject to compliance with certain due diligence obligations.[28]

However, the safe harbour exception is limited by the application of Sections 51 and 52 of the Copyright Act, 1957. As per Section 52(1)(c), intermediaries are not “responsible for secondary liability unless they are aware or have reasonable grounds for believing that they are storing an infringing copy [i.e. UGC].”[29] Similarly, Section 51(a)(ii) precludes secondary liability “if the person had no knowledge or reason to believe that a work was an infringement.”[30] Therefore, it can be concluded that for an intermediary to incur secondary liability, it must necessarily have some knowledge, awareness, or reasonable belief that the content being uploaded or shared on its platform amounts to digital piracy/infringement or unlawful content.[31] This has been reaffirmed by the decision in the Myspace Inc Case, wherein an intermediary was held liable for content on its platform only when (i) it had actual or specific knowledge of the infringing content and (ii) did not take the necessary steps to remove such infringing content.[32]

While the pre-requisite of ‘knowledge’ has been qualified under law as ‘reasonable and specific awareness,’ the determination of the ‘necessary steps for removing digital piracy’ requires further discussion.[33] This topic is especially relevant in light of recent legislative developments in India.[34] Prior to 2021, the Information Technology (Intermediaries Guidelines) Rules, 2011 excluded intermediaries from secondary liability “if they did not knowingly host, publish or transmit infringing information.”[35] This exception was applicable only in the absence of any editorial control exercised by intermediaries over UGC.[36] The introduction of the draft Information Technology (Intermediaries Guideline) Amendment Rules, 2018[37] (“2018 Draft Rules”) was the first step in attempting to quantify the extent of secondary liability imposed upon intermediaries.[38] One of the most prominent obligations under the 2018 Draft Rules required intermediaries to “deploy technology based automated tools for proactively identifying and removing unlawful information or content”[39] [including infringing UGC] on their platforms.[40]

Interestingly, the recent notification of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021[41] (“2021 Rules”) provides further guidance on the necessary steps required of intermediaries in combating illegal UGC.[42] Similar to the 2018 Draft Rules, the 2021 Rules obligate intermediaries to observe certain standards of due diligence – in the absence of which they will not be able to claim the benefit of the safe harbour exception.[43] For instance, intermediaries are required to (i) inform users of their policies and regulations,[44] and (ii) report cyber security incidents to the Indian Computer Emergency Response Team.[45] Notably though, the 2021 Rules differ from the 2018 Draft Rules insofar as technology based automated tools are only required of significant social media intermediaries (as defined in the 2021 Rules) for detecting certain kinds of unlawful UGC (as defined in the 2021 Rules), not including infringing UGC.[46]

However, regardless of the provisions of the 2021 Rules, it is interesting to note that intermediaries in India have already commenced using technology based automated tools for detecting infringing UGC in order to offset their liability pursuant to the IT Act and the Copyright Act, 1957.[47] Thus, while further developments are awaited on this subject, it becomes important to understand the functioning and implications of these tools for Indian jurisprudence and industry, in order to better mould Indian policy.[48] So far as the economic costs are concerned, building and enacting such tools necessitates complete business overhauls and immense investment.[49] For example, Google has expended upwards of USD 100 million in building and implementing its automated enforcement tool, ‘Content ID.’[50] Yet the more pressing concern relates to the potential social costs of these tools – for while fiscal costs may vary across companies, the social costs will remain largely uniform due to the common components[51] that these tools share.

III. Technology-Based Automated Enforcement Tools

To date, automated enforcement tools in mainstream prominence have been largely governed by the laws of the United States.[52] The overarching law regarding intermediary liability in the United States has been laid down in the Digital Millennium Copyright Act, 1998 (“DMCA”).[53] Section 512(c) of the DMCA attaches secondary liability for infringing UGC against intermediaries “if they are aware of the presence of the infringing material or upon obtaining such knowledge, do not act expeditiously to remove or disable such infringing content.”[54] Notably, while the DMCA requires intermediaries to take steps towards removing infringing UGC upon obtaining knowledge of the same, it does not require the employment of automated enforcement tools.[55] Nevertheless, despite the clear absence of any obligation to this effect, certain platforms, such as YouTube and Facebook, have been employing automated enforcement tools for the effective identification of illegal (and infringing) UGC.[56]

1. Technology-Based Automated Tools Employed by YouTube and Facebook

The primary aim of Content ID is to identify infringing UGC uploaded on YouTube, in order to disclaim YouTube’s secondary liability for such infringing content.[57] Content ID has been built with the objective of aiding content owners in detecting infringing copies of their work on YouTube.[58] To use the tool, content owners submit their copyright protected works to YouTube.[59] Thereafter, Content ID creates ‘fingerprints’ of the submitted works in its database.[60] These fingerprints are scanned against works uploaded on YouTube to determine whether there is a match, copy, or imitation of the original work submitted to the YouTube database.[61] If such a match is detected, the copyright owner has the option to block, monitor, or monetize[62] the infringing UGC.[63] Subsequently, the alleged infringer too has the option of disputing the claim of infringement within the limits of Content ID.[64]
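The submit-fingerprint-scan workflow described above can be sketched in a few lines of code. The sketch below is purely illustrative and is not YouTube’s actual implementation: real systems fingerprint perceptual audio and video features, whereas here a “work” is simply a sequence of sample values, and all names (`fingerprint`, `ReferenceDatabase`, the 50% threshold) are hypothetical.

```python
import hashlib

def fingerprint(samples, chunk=4):
    """Hash every overlapping window of `chunk` samples.

    Illustrative stand-in for perceptual fingerprinting: a partial
    copy of a work still shares many of these window hashes.
    """
    return {
        hashlib.sha1(bytes(samples[i:i + chunk])).hexdigest()
        for i in range(len(samples) - chunk + 1)
    }

class ReferenceDatabase:
    """Rights holders submit works; every upload is scanned for matches."""

    def __init__(self):
        self.works = {}  # work_id -> set of fingerprint hashes

    def submit(self, work_id, samples):
        self.works[work_id] = fingerprint(samples)

    def scan(self, samples, threshold=0.5):
        """Return IDs of reference works sharing enough fingerprints."""
        upload = fingerprint(samples)
        return [
            work_id for work_id, ref in self.works.items()
            if len(upload & ref) / len(ref) >= threshold
        ]

db = ReferenceDatabase()
db.submit("harlem_shake_2012", list(range(40)))  # reference work
matches = db.scan(list(range(10, 50)))           # upload copying part of it
print(matches)  # the overlapping upload is flagged
```

Note that the match is detected automatically, before any human review: this is the property the paper returns to when discussing the tools’ inability to weigh fair use or de minimis copying.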

Similar to YouTube, Facebook uses an automated technology based tool known as ‘Rights Manager.’[65] Rights Manager works by detecting matching audio and video content based on rules and conditions (“Match Rules”) set by the rights holder.[66] The rights holder (subscriber of the tool/copyright owner) uploads reference files and indicates whether they own the rights to the video, audio, or both.[67] Thereafter, Rights Manager finds matches and applies Match Rules such as blocking, monitoring, or monetizing of the infringing content.[68] Similar to Content ID, Rights Manager also allows the alleged infringer to dispute claims as per the procedure established within Rights Manager.[69]
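The Match Rules mechanism described above can be sketched as a simple rule-application function. This is a hedged illustration of the idea only: the names `MatchRule` and `apply_match_rule`, the rights flags, and the action strings are hypothetical and do not reflect Facebook’s actual API.

```python
from dataclasses import dataclass

@dataclass
class MatchRule:
    """A rights holder's configuration: what they own and what to do."""
    owns_video: bool
    owns_audio: bool
    action: str  # one of "block", "monitor", "monetize"

def apply_match_rule(match, rule):
    """Apply the rights holder's configured action to a detected match.

    The action is applied automatically, without human review of the
    match itself - the rule was set in advance by the rights holder.
    """
    if match["video"] and not rule.owns_video:
        return "no_action"  # holder asserted no rights over video
    if match["audio"] and not rule.owns_audio:
        return "no_action"  # holder asserted no rights over audio
    return rule.action

rule = MatchRule(owns_video=True, owns_audio=True, action="monetize")
print(apply_match_rule({"video": True, "audio": True}, rule))  # monetize
```

The design choice worth noticing is that the legal consequence (block, monitor, or monetize) is fixed before any specific dispute arises, which is precisely the discretion the paper criticizes in Part III(2).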

An analysis of both tools highlights the following key common features: (i) they act as an ‘upload filter’ for UGC uploaded on their respective platforms,[70] (ii) they involve automatic filtering that precludes manual or human review of ‘detected matches’ at the first instance,[71] (iii) they allow the copyright holder/rights holder to claim ad earnings from the allegedly infringing UGC through the option of monetization,[72] (iv) they have an internal dispute resolution system for claims made via the tools,[73] and (v) they are universally applicable across territorial borders.[74] As a result, the tools create a distinct legal environment with specific social costs that are discussed in the next section of this paper.[75]

2. Assessing the Social Cost of Automated Tools: Conduits of Control Mechanism

Considering the key features of the tools conceptualized by YouTube and Facebook (and, by default, Instagram),[76] it is easy to recognize the ability of automated tools to function as an independent control mechanism for copyright infringement on their respective platforms. To this end, the paper analyses automated tools from a threefold socio-legal perspective: (i) the limited analytical capacity of algorithms, (ii) the universality of the tools, and (iii) the birth of a parallel dispute resolution system.[77]

First, the automated tools do not consider legal exceptions to copyright infringement such as the fair use doctrine in the USA or any of the exceptions under Section 52 of the Copyright Act.[78] Instead, the rights holders are required to determine the applicability of such exceptions prior to filing a claim for infringement.[79] The primary rationale behind this absence of consideration is the diminished ability of automated content identification systems to distinguish fair use or statutorily exempted use from actual infringement.[80] In legal frameworks, interpretations of such exceptions to infringement are largely subjective in nature and left to the discretion of the judiciary on a case-to-case basis.[81] The contextual and dynamic understanding of these exceptions thus renders them outside the purview of a machine’s or an algorithm’s ability of discernment.[82]

Second, the universal application of automated tools across territorial borders fails to consider lex loci copyright laws and interpretations, i.e., the laws of the country in which the transaction is performed.[83] Disregarding lex loci laws in favour of the interpretations posited by the automated tools occurs on two grounds: (i) the standard of copyright protection and (ii) the standard of infringement.[84]

Copyright is a statutory right bound within the contours of each country’s legislation.[85] Accordingly, the standard of copyright protection provided for a certain work varies across countries.[86] For illustration, United States law requires all forms of protectible work to be fixed in a stable and permanent medium,[87] while such a condition of fixation is only expressly present within Indian law for “dramatic works.”[88] In contrast, automated tools operate under the presumption that all uploaded content, such as videos or sound recordings, is protected under copyright law.[89] This attribution of protection by the automated tools occurs in the absence of any verification of the lex loci understanding of subject matter categories of works or subject matter eligibility/qualifications.[90] For illustration, the definitional understanding of ‘dramatic works’ under US law requires such works to convey ‘a story or theme through a series of dramatic situations,’[91] whereas no such limitation is imposed on the understanding of ‘dramatic works’ under Indian law.[92] Further, Indian law specifically disavows copyright protection for ad libitum works,[93] while the automated enforcement systems presume protection for such works on their platforms.[94]

In addition, the standard of infringement under varied laws is ordinarily understood to be that of substantial similarity,[95] subject only to certain exceptions such as the ‘de minimis’ rule.[96] The de minimis rule qualitatively and quantitatively assesses the size and extent of copying and excludes insubstantial copying from the ambit of ‘infringement.’[97] Contrarily, automated tools are not equipped to conduct a qualitative or quantitative assessment of the infringing content against the entire uploaded content as per the varying requirements of lex loci laws.[98] By disregarding whether a work is capable of protection under the subject matter categories and qualifications of lex loci, or whether an instance of copying satisfies the lex loci understanding of infringement, automated tools put forward a harmonized and alternative understanding of copyright.[99]
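The gap between what the de minimis rule asks and what upload filters do can be made concrete with a toy quantitative test. The metric and the 5% threshold below are hypothetical, invented for illustration; no statute or case fixes such a number, and the real inquiry is qualitative as well as quantitative.

```python
def is_substantial_copy(copied_seconds, total_seconds, threshold=0.05):
    """Toy quantitative prong: is the copied portion more than trivial?

    A real de minimis analysis is a judicial, case-by-case assessment;
    this single-ratio test only illustrates the kind of proportionality
    question that automated matching skips entirely.
    """
    return copied_seconds / total_seconds > threshold

# A 3-second incidental snippet in a 10-minute video falls below a 5%
# threshold and might be de minimis under lex loci...
print(is_substantial_copy(3, 600))   # False
# ...yet a fingerprint match on those same 3 seconds would still be
# flagged, because the tool matches content, not legal standards.
print(is_substantial_copy(60, 600))  # True
```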

Third, the tools create an internal and distinctly separate system of dispute resolution while providing rights holders with a definitive set of remedies in the form of blocking, monitoring, or monetizing the infringing UGC.[100] The selection of remedies for instances of detected infringement is left to the complete discretion of the rights holders.[101] This discretion in turn facilitates differential treatment of similar instances of infringement.[102] Further, the list of remedies and the steps for dispute resolution act as a comprehensive alternative to the traditional legal system under lex loci.[103] Beyond the predetermined methods, rights holders and alleged infringers are not allowed to seek other forms of resolution through the automated tools.[104]
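The closed dispute-resolution loop described above can be sketched as a small state machine. The states, the remedy set, and the class name `Claim` below mirror the paper’s description, not any platform’s actual workflow, and are purely illustrative.

```python
REMEDIES = {"block", "monitor", "monetize"}  # the complete, closed menu

class Claim:
    """A claim that lives entirely inside the tool's own procedure."""

    def __init__(self, rights_holder, remedy):
        if remedy not in REMEDIES:
            # No injunctions, damages, or other lex loci remedies exist here.
            raise ValueError("no remedy outside the predefined set")
        self.rights_holder = rights_holder
        self.remedy = remedy
        self.state = "active"

    def dispute(self):
        """The uploader may only contest the claim within the tool itself."""
        self.state = "disputed"

    def resolve(self, upheld):
        # The platform, not a court, decides the outcome of the dispute.
        self.state = "upheld" if upheld else "released"

claim = Claim("rights_holder_a", "monetize")
claim.dispute()
claim.resolve(upheld=False)
print(claim.state)  # released
```

The point of the sketch is structural: every transition (claim, remedy, dispute, resolution) happens inside the tool, which is what the paper means by a parallel system displacing lex loci procedures.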

On the basis of the above, it is recognized that automated tools not only reflect the intermediary’s perception of a controlled setting for identifying and resolving infringement disputes but also promote the conception of a copyright system parallel to that of lex loci.[105] Customarily, the key components of a legal system provide for its defined scope and purpose, instances of contravention, and resolution mechanisms.[106] These key components are satisfied under automated enforcement systems through their independent and separate procedures for submitting infringement claims and appeals, along with the standardized understanding of their subject matter and scope, i.e., ‘protected works’ and ‘infringement.’[107]

The resultant effect ensures that intermediaries not only act as law makers but also as adjudicators of disputes.[108] Accordingly, the introduction of automated tools within the Indian legal framework facilitates the possibility of a parallel legal system that overlooks Indian law in favour of institutionalized interpretations and understandings put forward by corporations (i.e., intermediaries).[109] Interestingly, the Indian courts in Shreya Singhal v. Union of India had previously limited the rights of intermediaries to remove UGC in the absence of a court order or notification.[110] In furtherance of this, courts have also held that the IT rules should not be deemed to “vest in intermediary(ies) suo motu powers to detect and refuse hosting of infringing contents.”[111] The rationale for the same is grounded in preventing the implicit sanctioning of intermediaries as assessors and adjudicators of alleged contraventions.[112]

IV. Conclusion

The advent of technology based automated tools employed by intermediaries creates a paradigm shift in identifying unlawful UGC.[113] Traditionally, the impetus for resolving disputes has vested with the judiciary and the legislature.[114] However, with the inception of automated tools, the impetus has now shifted to allow adjudicatory-esque powers to the intermediaries.[115]

The current developments in India accurately reflect the shift in paradigm from the existing traditional understanding of dispute resolution towards the newer, evolving approach of automated enforcement tools.[116] In pursuance of this, the paper has dwelt upon the socio-legal consequences of implementing such tools.[117] The 2021 Rules create a unique fork in the road in respect of the evolution of secondary liability for intermediaries within Indian jurisprudence.[118] With multiple intermediaries across varied industries engaging technology-based automated tools, the creation of a parallel legal system has become more than a tentative possibility,[119] limiting user ability, creativity, and interaction within the parallel, predefined, and predetermined processes of enforcement tools.[120] It is now imperative for Indian and foreign jurisprudences alike to take into consideration the socio-legal consequences associated with automated tools (as highlighted in this paper) prior to policy enactments sanctioning such tools.[121]

[1] David Friedman, Does Technology Require New Law, 25 Harv. J. L. & Pub. Pol’y 71 (2001 – 2002).

[2] Dr. Jochen M. Schaefer, IP Infringement Online: The Dark Side of Digital, WIPO Magazine, (Sept. 27, 2021, 4:08 PM),

[3] Id.

[4] See infra note 5 and note 6.

[5] TikTok’s Silly Clips Raise Some Serious Questions, The Economist (Sept. 27, 2021, 4:08 PM),

[6] TikTok is the Most Downloaded App of 2020 in the World, Times of India (Sept. 27, 2021, 4:08 PM),

[7] Liability of Online Platforms Study for Panel for the Future of Science and Technology, at p. 1 (Oct. 3, 2021),

[8] See European Commission, 2013 Public Consultation on the Review of the EU Copyright Rules, 2014, (last visited Sept. 27, 2021) (explaining derivatives on intermediary platforms).

[9] Michael Soha & Zachary J. McDowell, Monetizing a Meme: YouTube, Content ID, and the Harlem Shake, Social Media & Soc’y, 3 – 4, 9 (Jan. 12, 2016),  

[10] Zachary McDowell & Mike Soha, Monetizing a Meme: A Case Study on the Harlem Shake, Culture Digitally (Oct. 24, 2014),

[11] Id.   

[12] Soha & McDowell, supra note 9, at 5, 9.

[13] See also Andrea, Francesca & Nicoleta-Angela supra note 7 at p.1 (explaining the debate regarding intermediary’s responsibility for infringing UGC).

[14] Graeme B. Dinwoodie, A Comparative Analysis of the Secondary Liability of Online Service Providers, Oxford Legal Studies Research Paper No. 47/2017, 1 (2017),

[15] Id. at p. 1.

[16] Id.

[17] Compare Soha & McDowell, supra note 9, at 7 (explaining liability of YouTube for hosting infringing UGC) with infra note 19 (defining ‘intermediaries’).

[18] See infra note 28 (Section 79 absolves intermediaries from secondary liability in certain circumstances. To understand the scope and applicability of Section 79, it is important to understand its subject matter (i.e. intermediaries)).

[19] The Information Technology Act, 2000, § 2(w) (India).

[20] See supra note 17.

[21] See infra note 22 and note 24.

[22] Myspace Inc v. Super Cassettes Industries 236 (2017) DLT 478 (India).

[23] Id. at ¶43.

[24] Fermat Education v. Sorting Hat Technology,  O.A. No. 502 of 2018, ¶33, ¶36 (India).

[25] Id. at ¶45.

[26] See generally Myspace Inc v. Super Cassettes Industries 236 (2017) DLT 478 (India) (defining intermediaries as a neutral platform that does not exercise any control over UGC); Fermat Education v. Sorting Hat Technology,  O.A. No. 502 of 2018 (India) (listing down factors for interpreting definitional scope of ‘intermediaries’). See also The Information Technology (Intermediaries Guidelines) Rules, 2011, Rule 3(3), G.S.R. 314(E), Rules of Parliament, 2011 (India) ( requiring intermediaries to have specific intent / knowledge of infringing content on their platform)

[27] The Information Technology Act, 2000, § 79 (India).

[28] Id.

[29] The Copyright Act, 1957, § 52(1)(c) (India).

[30] The Copyright Act, 1957, § 51(a)(ii) (India).

[31] Intermediary Liability 2.0: A Shifting Paradigm, Software Freedom Law Center, (Mar. 2019),

[32] Myspace Inc v. Super Cassettes Industries 236 (2017) DLT 478, ¶38 (India).

[33] Id.

[34] Compare infra note 35 (requiring intermediaries to have specific intent / knowledge of infringing content on their platform), with infra note 39 (mandating intermediaries to employ automated technology tools as part of their due diligence obligations), and infra note 43 (mandating intermediaries to employ automated technology tools).

[35] The Information Technology (Intermediaries Guidelines) Rules, 2011, Rule 3 (3), G.S.R. 314(E), Rules of Parliament, 2011 (India).

[36] Myspace Inc v. Super Cassettes Industries 236 (2017) DLT478 (India); Fermat Education v. Sorting Hat Technology,  O.A. No. 502 of 2018 (India).

[37] Information Technology [Intermediaries Guidelines (Amendment) Rules (2018),

[38] Id. at Rule 3.

[39] Id. at Rule 3(9).

[40] Raghav Mendiratta & Joan Barata, Proposed Rules to Amend the Information Technology (Intermediaries Guidelines) Rules, 2011, World Intermediary Liability Map (Sept. 29, 2021, 01:10 PM),

[41] The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, G.S.R. 139(E), Rules of Parliament, 2021 (India) .

[42] Id.

[43] IT (Intermediary Guidelines Code), 2021, Rule 7; The Information Technology Act, 2000, § 79 (India).

[44] IT (Intermediary Guidelines Code), 2021, Rule 3(1)(a) – (b); IT Intermediaries Guidelines Rules, 2018, Rule 3(1) – (2).

[45] IT (Intermediary Guidelines Code), 2021, Rule 3(l); IT Intermediaries Guidelines Rules, 2018, Rule 3 (10).

[46] IT (Intermediary Guidelines Code), 2021, Rule 3(4); IT Intermediaries Guidelines Rules, 2018, Rule 3(9) (As per 2021 Rules this obligation is solely for significant social media intermediaries (a sub-classification in the definition of ‘intermediaries’). In contrast, the 2018 Draft Rules do not create such sub-classifications and require all intermediaries to abide by the rules).

[47] See infra notes 57 and 74.

[48] See infra note 109.

[49] See infra note 50.

[50] How Google fights Piracy, Google (Sept. 29, 2021, 10:22 AM),

[51] See Part III (1) of this paper.

[52] See Dispute a Content ID Claim, YouTube Help (Oct. 1, 2021, 8:30 AM),;  Requirements for Counter Notifications, YouTube Help (Oct. 1, 2021, 8:30 AM), (by submitting a counter notification, individuals submit themselves to the jurisdiction of US Federal Courts).

[53] Digital Millennium Copyright Act, 1998, 17 US Code, Act of Parliament, 1998 (United States).

[54] Digital Millennium Copyright Act, 1998, § 512(c), 17 US Code, Act of Parliament, 1998 (United States).

[55] Id.

[56] How Content Id Works, YouTube Help (Oct. 1, 2021, 8:30 AM),; Rights Manager, Rights Manager (Oct. 1, 2021, 8:30 AM),

[57] Katharine Trendacosta, Unfiltered: How YouTube’s Content ID Discourages Fair Use and Dictates What We See Online, Electronic Frontier Foundation (Oct. 1, 2021, 8:30 AM),

[58] How Content ID works, YouTube Help (Oct. 1, 2021, 8:30 AM),

[59] YouTube Operations Guide: Using Content ID, YouTube Help (Oct. 1, 2021, 8:30 AM),

[60] How Content ID Works, YouTube Help (Oct. 1, 2021, 8:30 AM),

[61] Jonathan Bailey, YouTube Beta testing Content ID for everyone, Plagiarism Today (May 2, 2018)

[62] Monetizing of infringing videos allows rights holders to commercially exploit the infringing videos and receive the revenues accruing thereof. See How Content ID Works, YouTube Help,

[63] YouTube Help, supra note 56.

[64] How to avoid and resolve improper claims, YouTube Help (Oct. 1, 2021, 8:30 AM),

[65] Rights Manager, Rights Manager (Oct. 1, 2021, 8:30 AM),

[66] About Rights Manager, Facebook (Oct. 1, 2021, 8:30 AM),

[67] Reference files in Rights Manager, Facebook (Oct. 1, 2021, 8:30 AM),

[68] Actions for matching content in Rights Manager, Facebook (Oct. 1, 2021, 8:30 AM),

[69] Resolve usage dispute in Rights Manager,  Facebook (Oct. 1, 2021, 8:30 AM),

[70] YouTube Help, supra note 56 and note 60; Facebook for Media: Rights Manager, Facebook (Oct. 1, 2021, 8:30 AM),

[71] Overview of Copyright Management Tools, YouTube Help (Oct. 1, 2021, 8:30 AM),; Facebook supra note 66.

[72] YouTube Help supra note 60; Facebook supra note 69.

[73] YouTube Help supra note 64; Facebook supra note 69.

[74] YouTube Help supra note 60; Facebook supra notes 65 – 66 (stating that there are no express limitations on jurisdictional applicability of the tools i.e., both the tools are applicable in countries that have access to YouTube and Facebook).

[75] Henning Grosse-Ruse Khan, Automated Copyright Enforcement Online: From Blocking to Monetization of User-Generated Content, University of Cambridge Faculty of Law Research Paper, No. 8/2020, Part III (Oct. 1, 2021, 8:30 AM),

[76] Protect your Instagram content with Rights Manager, Facebook (Oct. 1, 2021, 8:30 AM),

[77] See infra note 82 and note 100; supra note 74.

[78] The Copyright Act, 1957, § 52, No. 14, Acts of Parliament, 1957(India).

[79] See Frequently Asked Questions About Fair Use, YouTube Help (Oct. 1, 2021, 8:30 AM),

[80] Id.

[81]Copyright & Fair Use, Stanford Libraries (Oct. 2, 10:05 AM),

[82] Lenz v. Universal Music Corp., 815 F.3d 1145, 1154 (9th Cir. 2016) (United States); See Matthew Sag, Internet Safe Harbors and the Transformation of Copyright Law, 93(2) Notre Dame Law Review 518 (2017), at 533.

[83] Henning supra note 75 at fn. 32 and 33.

[84] See Henning supra note 75 at fn. 32 and 33; Supra note 81.   

[85] The Copyright Act, 1957, § 16, No. 14, Acts of Parliament, 1957 (India); The Chancellor, Masters & Scholars of the University of Oxford & Ors. v. Rameshwari Photocopy Services, 233 (2016) DLT 279, ¶26, ¶28 ¶80 (India).

[86] Compare infra note 87 (generally elaborating upon the standard of ‘fixation’ required under international and foreign jurisprudences), with infra note 88 (defining ‘dramatic works’ under Indian copyright laws).

[87] See Module 3: The Scope of Copyright Law, Berkman Klein Center for Internet & Society, section 4.3, (Oct. 2, 10:05 AM),

[88] The Copyright Act, 1957, § 2(h), No. 14, Acts of Parliament, 1957(India).

[89] See Henning supra note 75 at fn. 32.

[90] See also supra note 84 (elaborating on the disregard towards lex loci laws by Content ID [i.e., automated tools]).

[91] Seltzer v. Sunbrock (1938) 22 F.Supp. 621, at 629; Paul Goldstein, Goldstein on Copyright, 189 (3rd ed., 2008).

[92] The Copyright Act, 1957, § 2(h), No. 14, Acts of Parliament, 1957(India).

[93] Institute for Inner Studies v. Charlotte Anderson MIPR 2014 (1) 129.

[94] See also supra note 84 (the automated enforcement systems do not distinguish between protectible works).

[95] See Robert C Osterberg & Eric C Osterberg, Substantial Similarity in Copyright Law (Practicing Law Institute, 2003).

[96] India TV Independent News Service Pvt. Ltd. and Ors v. Yashraj Films Pvt. Ltd, 2013 (53) PTC 586 (Del), ¶25 (India); Daniel J. Gervais, Improper Appropriation, 23(2) Lewis & Clark Law Review, 599, 600 (2019).

[97] See India TV Independent News Service Pvt. Ltd. and Ors v. Yashraj Films Pvt. Ltd, 2013 (53) PTC 586 (Del) (India); Super Cassettes Industries Ltd. v. Shreya Broadcasting Pvt. Ltd CS (OS) 1372/2009 (Del) (India); Saregama India Ltd. v. Viacom 18 Motion Pictures CS(COMM) 492/2019 (Del) (India).

[98] YouTube Help, supra note 71 and note 79; Supra note 82.

[99] Compare supra note 91 (interpreting ‘dramatic works’ under US copyright laws), supra note 92 (defining ‘dramatic works’ under Indian copyright laws), supra note 93 (interpreting scope of protection for ad libitum works under Indian copyright laws), and supra note 94 (elaborating upon the burden of proof under Content ID), with supra note 98 (elaborating upon the nature of assessment undertaken by Content ID).

[100] Supra note 73.

[101] Id.

[102] See also How Content ID Works: Common Questions About Content ID, YouTube Help (Oct. 1, 2021, 8:30 AM), (providing for a pre-determined list of remedies that are available to rights-holders).

[103] See infra notes 106 – 107.

[104] See, Facebook, supra note 69; YouTube Help, supra note 64.  

[105] Michael Soha & Zachary J McDowell, supra note 9.

[106] See generally Division De La Recherche Research Division,  Article 7: The “Quality of Law” Requirements and the Principle of (Non-)Retrospectiveness of the Criminal Law under Article 7 of the Convention, European Court of Human Rights p. 6, Oct. 1, 2021, 8:30 AM), (providing for important provisions under penal statutes).

[107] See supra notes 74, 89, 98, and 99.

[108] Id.

[109] Henning, supra note 75.

[110] Shreya Singhal v. Union of India AIR 2015 SC 1523, ¶119 (India); See Manila Principles on Intermediary Liability,, Rule 2.

[111] Kent Ro Systems Ltd & Anr. vs Amit Kotak, 240(2017) DLT3, ¶35 (India).

[112] Supra notes 110 – 111.

[113] Compare supra note 35 (requiring Indian intermediaries to have specific intent / knowledge of infringing content on their platform), supra note 36 (elaborating upon the definitional scope of ‘intermediary’ in India), with supra note 43 (providing for pro-active due diligence obligations from an Indian intermediary), and supra note 108 (elaborating upon the public interest functions of an intermediary employing automated tools).

[114] See also supra notes 106 – 107 (elaborating upon the pre-determined remedies under automated tools).

[115] Id.

[116] See also Part II of this paper (elaborating upon the judicial and legislative attitude towards intermediary’s liability in India).

[117] See Part III (2) of this paper.

[118] Supra notes 45, 46, and 47.

[119] See also Part III (2) of this paper (elaborating on the independent control environment created by automated tools such as Content ID and Rights Manager).

[120] Benjamin Boroughf, The Next Great YouTube: Improving Content ID to Foster Creativity, Cooperation, and Fair Compensation, Albany Law Journal of Science and Technology, 8 – 10 (2014); Henning supra note 75.

[121]  See also Part III (2) of this paper (elaborating on the independent control environment created by automated tools such as Content ID and Rights Manager).