Chapter 11: DMCA Safe Harbors
The creation and widespread use of the Internet have created a huge potential for copyright infringement liability for Internet service providers such as Google, as illustrated by the Perfect 10 decisions excerpted in the indirect infringement section of this book. The following case, Netcom, was decided in the early days of the Internet by a district court judge sitting in San Jose, California (the heart of Silicon Valley) who had a reputation for expertise in technology and intellectual property. Netcom is still widely cited and considered one of the seminal decisions of copyright law as applied in the context of the Internet. In particular, Judge Whyte’s decision provides an influential discussion of the policy concerns that arise when the traditional doctrines of copyright law are applied to activities that occur on the Internet.
__________
Some things to consider when reading Netcom:
- The fundamental issue raised by this case is the extent to which a party that provides a platform for online filesharing can be held liable for copyright infringement that occurs as a consequence of the actions of users of the platform. Netcom is one of the earliest decisions to address this issue. The policy concerns raised by Judge Whyte are clearly relevant to more contemporary filesharing platforms such as YouTube and BitTorrent.
- The court’s discussion of MAI and its holding regarding digital, transient copying of a copyrighted work.
- The court’s discussion of both direct and indirect theories of liability.
- The court’s discussion of copyright policy, particularly in the context of the Internet, which at that time was a newly emerging technology.
- It is often said that copyright infringement is a matter of strict liability, but note that in this case, the court finds a requirement of volition or causation, at least when the alleged infringer has not actively taken steps to infringe, but has instead provided a platform for the infringing activities of others.
- Netcom’s insight regarding volitional conduct, particularly in the context of “cyberspace,” has been endorsed and adopted by appellate courts. See, e.g., CoStar Grp., Inc. v. LoopNet, Inc., 373 F.3d 544, 551 (4th Cir. 2004) (“Netcom made a particularly rational interpretation of § 106 when it concluded that a person had to engage in volitional conduct—specifically, the act constituting infringement—to become a direct infringer. [S]uch a construction of the Act is especially important when it is applied to cyberspace.”); BWP Media USA Inc. v. Polyvore, Inc., 922 F.3d 42, 48 (2d Cir. 2019) (“Ten years ago in Cablevision, we adopted the volitional conduct requirement in this circuit as a prerequisite to establishing copyright infringement liability for service providers, holding that volitional conduct is an important element of direct liability. 536 F.3d at 131.”).
Religious Tech. Center v. Netcom On–Line Comm.
907 F.Supp. 1361 (N.D.Cal.1995)
WHYTE, District Judge.
This case concerns an issue of first impression regarding intellectual property rights in cyberspace. Specifically, this order addresses whether the operator of a computer bulletin board service (“BBS”), and the large Internet access provider that allows that BBS to reach the Internet, should be liable for copyright infringement committed by a subscriber of the BBS.
Plaintiffs Religious Technology Center (“RTC”) and Bridge Publications, Inc. (“BPI”) hold copyrights in the unpublished and published works of L. Ron Hubbard, the late founder of the Church of Scientology (“the Church”). Defendant Dennis Erlich (“Erlich”) is a former minister of Scientology turned vocal critic of the Church, whose pulpit is now the Usenet newsgroup alt.religion.scientology (“a.r.s.”), an on-line forum for discussion and criticism of Scientology. Plaintiffs maintain that Erlich infringed their copyrights when he posted portions of their works on a.r.s. Erlich gained his access to the Internet through defendant Thomas Klemesrud’s (“Klemesrud’s”) BBS “support.com.” Klemesrud is the operator of the BBS, which is run out of his home and has approximately 500 paying users. Klemesrud’s BBS is not directly linked to the Internet, but gains its connection through the facilities of defendant Netcom On–Line Communications, Inc. (“Netcom”), one of the largest providers of Internet access in the United States.
After failing to convince Erlich to stop his postings, plaintiffs contacted defendants Klemesrud and Netcom. Klemesrud responded to plaintiffs’ demands that Erlich be kept off his system by asking plaintiffs to prove that they owned the copyrights to the works posted by Erlich. However, plaintiffs refused Klemesrud’s request as unreasonable. Netcom similarly refused plaintiffs’ request that Erlich not be allowed to gain access to the Internet through its system. Netcom contended that it would be impossible to prescreen Erlich’s postings and that to kick Erlich off the Internet meant kicking off the hundreds of users of Klemesrud’s BBS. Consequently, plaintiffs named Klemesrud and Netcom in their suit against Erlich, although only on the copyright infringement claims.
For the reasons set forth below, the court grants in part and denies in part Netcom’s motion for summary judgment and Klemesrud’s motion for judgment on the pleadings and denies plaintiffs’ motion for a preliminary injunction.
I. NETCOM’S MOTION FOR SUMMARY JUDGMENT OF NONINFRINGEMENT
B. Copyright Infringement
Plaintiffs argue that, although Netcom was not itself the source of any of the infringing materials on its system, it nonetheless should be liable for infringement, either directly, contributorily, or vicariously.
1. Direct Infringement
Infringement consists of the unauthorized exercise of one of the exclusive rights of the copyright holder delineated in section 106. 17 U.S.C. § 501. Direct infringement does not require intent or any particular state of mind, although willfulness is relevant to the award of statutory damages. 17 U.S.C. § 504(c).
Many of the facts pertaining to this motion are undisputed. The court will address the relevant facts to determine whether a theory of direct infringement can be supported based on Netcom’s alleged reproduction of plaintiffs’ works. The court will look at one controlling Ninth Circuit decision addressing copying in the context of computers and two district court opinions addressing the liability of BBS operators for the infringing activities of subscribers. The court will additionally examine whether Netcom is liable for infringing plaintiffs’ exclusive rights to publicly distribute and display their works.
a. Undisputed Facts
The parties do not dispute the basic processes that occur when Erlich posts his allegedly infringing messages to a.r.s. Erlich connects to Klemesrud’s BBS using a telephone and a modem. Erlich then transmits his messages to Klemesrud’s computer, where they are automatically briefly stored. According to a prearranged pattern established by Netcom’s software, Erlich’s initial act of posting a message to the Usenet results in the automatic copying of Erlich’s message from Klemesrud’s computer onto Netcom’s computer and onto other computers on the Usenet. In order to ease transmission and for the convenience of Usenet users, Usenet servers maintain postings from newsgroups for a short period of time—eleven days for Netcom’s system and three days for Klemesrud’s system. Once on Netcom’s computers, messages are available to Netcom’s customers and Usenet neighbors, who may then download the messages to their own computers.
Unlike some other large on-line service providers, such as CompuServe, America Online, and Prodigy, Netcom does not create or control the content of the information available to its subscribers. It also does not monitor messages as they are posted. It has, however, suspended the accounts of subscribers who violated its terms and conditions, such as where they had commercial software in their posted files. Netcom admits that, although not currently configured to do this, it may be possible to reprogram its system to screen postings containing particular words or coming from particular individuals. Netcom, however, took no action after it was told by plaintiffs that Erlich had posted messages through Netcom’s system that violated plaintiffs’ copyrights, instead claiming that it could not shut out Erlich without shutting out all of the users of Klemesrud’s BBS.
b. Creation of Fixed Copies
In the present case, there is no question … that “copies” were created, as Erlich’s act of sending a message to a.r.s. caused reproductions of portions of plaintiffs’ works on both Klemesrud’s and Netcom’s storage devices.
c. Is Netcom Directly Liable for Making the Copies?
The mere fact that Netcom’s system incidentally makes temporary copies of plaintiffs’ works does not mean Netcom has caused the copying. The court believes that Netcom’s act of designing or implementing a system that automatically and uniformly creates temporary copies of all data sent through it is not unlike that of the owner of a copying machine who lets the public make copies with it. Although some of the people using the machine may directly infringe copyrights, courts analyze the machine owner’s liability under the rubric of contributory infringement, not direct infringement. Plaintiffs’ theory would create many separate acts of infringement and, carried to its natural extreme, would lead to unreasonable liability. It is not difficult to conclude that Erlich infringes by copying a protected work onto his computer and by posting a message to a newsgroup. However, plaintiffs’ theory further implicates a Usenet server that carries Erlich’s message to other servers regardless of whether that server acts without any human intervention beyond the initial setting up of the system. It would also result in liability for every single Usenet server in the worldwide link of computers transmitting Erlich’s message to every other computer. These parties, who are liable under plaintiffs’ theory, do no more than operate or implement a system that is essential if Usenet messages are to be widely distributed. There is no need to construe the Act to make all of these parties infringers. Although copyright is a strict liability statute, there should still be some element of volition or causation which is lacking where a defendant’s system is merely used to create a copy by a third party.
2. Contributory Infringement
Netcom is not free from liability just because it did not directly infringe plaintiffs’ works; it may still be liable as a contributory infringer where the defendant, “with knowledge of the infringing activity, induces, causes or materially contributes to the infringing conduct of another.”
a. Knowledge of Infringing Activity
The evidence reveals a question of fact as to whether Netcom knew or should have known that Erlich had infringed plaintiffs’ copyrights following receipt of plaintiffs’ letter. Because Netcom was arguably participating in Erlich’s public distribution of plaintiffs’ works, there is a genuine issue as to whether Netcom knew of any infringement by Erlich before it was too late to do anything about it. If plaintiffs can prove the knowledge element, Netcom will be liable for contributory infringement since its failure to simply cancel Erlich’s infringing message and thereby stop an infringing copy from being distributed worldwide constitutes substantial participation in Erlich’s public distribution of the message.
b. Substantial Participation
Where a defendant has knowledge of the primary infringer’s infringing activities, it will be liable if it “induces, causes or materially contributes to the infringing conduct of” the primary infringer. Such participation must be substantial.
Providing a service that allows for the automatic distribution of all Usenet postings, infringing and noninfringing, goes well beyond renting a premises to an infringer. It is more akin to the radio stations that were found liable for rebroadcasting an infringing broadcast. Netcom allows Erlich’s infringing messages to remain on its system and be further distributed to other Usenet servers worldwide. It does not completely relinquish control over how its system is used, unlike a landlord. Thus, it is fair, assuming Netcom is able to take simple measures to prevent further damage to plaintiffs’ copyrighted works, to hold Netcom liable for contributory infringement where Netcom has knowledge of Erlich’s infringing postings yet continues to aid in the accomplishment of Erlich’s purpose of publicly distributing the postings. Accordingly, plaintiffs do raise a genuine issue of material fact as to their theory of contributory infringement as to the postings made after Netcom was on notice of plaintiffs’ infringement claim.
3. Vicarious Liability
Even if plaintiffs cannot prove that Netcom is contributorily liable for its participation in the infringing activity, it may still seek to prove vicarious infringement based on Netcom’s relationship to Erlich. A defendant is liable for vicarious liability for the actions of a primary infringer where the defendant (1) has the right and ability to control the infringer’s acts and (2) receives a direct financial benefit from the infringement. Unlike contributory infringement, knowledge is not an element of vicarious liability.
a. Right and Ability To Control
The first element of vicarious liability will be met if plaintiffs can show that Netcom has the right and ability to supervise the conduct of its subscribers. Netcom argues that it does not have the right to control its users’ postings before they occur. Plaintiffs dispute this and argue that Netcom’s terms and conditions, to which its subscribers must agree, specify that Netcom reserves the right to take remedial action against subscribers.
Netcom argues that it could not possibly screen messages before they are posted given the speed and volume of the data that goes through its system. Netcom further argues that it has never exercised control over the content of its users’ postings. Plaintiffs’ expert opines otherwise, stating that with an easy software modification Netcom could identify postings that contain particular words or come from particular individuals. Plaintiffs further dispute Netcom’s claim that it could not limit Erlich’s access to Usenet without kicking off all 500 subscribers of Klemesrud’s BBS. As evidence that Netcom has in fact exercised its ability to police its users’ conduct, plaintiffs cite evidence that Netcom has acted to suspend subscribers’ accounts on over one thousand occasions. Further evidence shows that Netcom can delete specific postings. Whether such sanctions occurred before or after the abusive conduct is not material to whether Netcom can exercise control. The court thus finds that plaintiffs have raised a genuine issue of fact as to whether Netcom has the right and ability to exercise control over the activities of its subscribers, and of Erlich in particular.
b. Direct Financial Benefit
Plaintiffs must further prove that Netcom receives a direct financial benefit from the infringing activities of its users. For example, a landlord who has the right and ability to supervise the tenant’s activities is vicariously liable for the infringements of the tenant where the rental amount is proportional to the proceeds of the tenant’s sales. However, where a defendant rents space or services for a fixed rental fee that does not depend on the nature of the activity of the lessee, courts usually find no vicarious liability because there is no direct financial benefit from the infringement.
Plaintiffs cannot provide any evidence of a direct financial benefit received by Netcom from Erlich’s infringing postings. Netcom receives a fixed fee. There is no evidence that infringement by Erlich, or any other user of Netcom’s services, in any way enhances the value of Netcom’s services to subscribers or attracts new subscribers.
Because plaintiffs have failed to raise a question of fact on this vital element, their claim of vicarious liability fails.
__________
Check Your Understanding – Netcom
Question 1. Why does the Netcom court believe that Netcom’s act of designing or implementing a system that automatically and uniformly creates temporary copies of all data sent through it is not unlike that of the owner of a copying machine who lets the public make copies with it?
Question 2. True or false: According to Netcom, although copyright is a strict liability statute, there should still be some element of volition or causation which is lacking where a defendant’s system is merely used to create a copy by a third party.
The Digital Millennium Copyright Act of 1998 (DMCA)
Netcom illustrates the potential for copyright liability that arises when a party provides an online platform for users to post content. Today, YouTube is one of the most prominent platforms of this type. Similar to the situation in Netcom, YouTube provides an online platform which permits its users to post content that can be viewed by other users. Inevitably, some users will post copyrighted materials without the authorization of copyright owners, thereby infringing one or more of the copyright owner’s rights, e.g., the reproduction, derivative work, public performance, public display, and/or distribution rights. Can (and should) YouTube be held liable for copyright infringement? To what extent does YouTube have a duty to identify and take down or block the posting of infringing content?
Consider the potential liability for indirect infringement of a company like YouTube, under the theories of both contributory and vicarious infringement. Recall that a party can be held liable for contributory infringement if it (1) has knowledge of the infringement of another and (2) materially contributes to that infringement. There is no question that massive copyright infringement has occurred on YouTube, and continues to this day. One could argue that YouTube has sufficient knowledge of the infringement, and that its platform materially contributes to the infringement, which would potentially expose YouTube to huge liability for copyright infringement. Similarly, one could argue that YouTube’s hosting of its platform constitutes vicarious infringement, if YouTube has the right and ability to control the use of its platform, and the company profits from infringement that occurs.
Congress enacted the Digital Millennium Copyright Act of 1998 (DMCA) to address a number of concerns arising out of the digitization of copyrighted content, and in this regard one of the most important things the DMCA did was to establish several “safe harbors” from copyright infringement liability for Internet service providers (“ISPs”)1 like YouTube. To qualify for the safe harbors, ISPs must comply with certain requirements, which include a requirement that the ISP provide a process by which copyright owners can notify the ISP when infringing content appears on its site, and that, upon receiving such notice, the ISP expeditiously “take down” the material.
An ISP that qualifies for the safe harbor cannot be held liable for infringement that occurs as the result of the actions of its users. But note that an ISP that fails to qualify for the DMCA safe harbors is not necessarily liable for copyright infringement; a copyright owner plaintiff would still have to establish liability under some theory of direct or indirect infringement.
The following cases explore some contours of the DMCA safe harbor provisions.
Some things to consider when reading Capitol Recs.:
- The background and summary of DMCA safe harbor provisions of § 512(c), which are available for qualifying Internet service providers for “infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider.”
- The court’s discussion of the policy rationales underlying the DMCA safe harbors.
- The meaning of “red flag knowledge,” why it is relevant, and why the plaintiffs fail to prove it. Note the court’s discussion of Viacom Int’l, Inc. v. YouTube, Inc., 676 F.3d 19 (2d Cir. 2012) and its holding with respect to the difference between actual and red flag knowledge.
- The court’s discussion of burdens of proof in the context of DMCA safe harbors.
- The relationship between the “willful blindness doctrine” and § 512(m). What did Viacom have to say about this?
Capitol Recs., LLC v. Vimeo, LLC
826 F.3d 78 (2d Cir. 2016)
Leval, Circuit Judge:
This is an interlocutory appeal on certified questions from rulings of the United States District Court for the Southern District of New York interpreting the Digital Millennium Copyright Act of 1998 (“DMCA”). The DMCA establishes a safe harbor in § 512(c), which gives qualifying Internet service providers protection from liability for copyright infringement when their users upload infringing material on the service provider’s site and the service provider is unaware of the infringement. 17 U.S.C. § 512(c). Defendant Vimeo, LLC is an Internet service provider, which operates a website on which members can post videos of their own creation, which are then accessible to the public at large. Plaintiffs are record companies and music publishing companies, which own copyrights in sound recordings of musical performances. Their complaint alleges that Vimeo is liable to Plaintiffs for copyright infringement by reason of 199 videos posted on the Vimeo website, which contained allegedly infringing musical recordings for which Plaintiffs owned the rights.
The district court ruled on motions for partial summary judgment addressed to whether Vimeo was entitled to the DMCA’s safe harbor protections. As for videos that allegedly infringed pre-1972 sound recordings, the court ruled in Plaintiffs’ favor on the theory that § 512(c)’s safe harbor absolves a service provider only from copyright liability based on the federal copyright statute, which does not apply to pre-1972 sound recordings, which are protected only by state copyright laws.2 With respect to post-1972 sound recordings (which all agree are protected by the DMCA’s safe harbor when its conditions are met), the district court granted summary judgment to Vimeo as to 153 videos, mostly on the basis that Plaintiffs lacked evidence that Vimeo’s employees had viewed them. The court rejected Plaintiffs’ arguments that knowledge should be imputed to Vimeo by reason of its alleged general policy of willful blindness to infringement of sound recordings. And as for the remaining challenged videos that incorporated post-1972 sound recordings, the court denied summary judgment to either side, concluding that there was a question of material fact whether Vimeo possessed so-called “red flag” knowledge of circumstances that made infringement apparent, which would make Vimeo ineligible for the protection of the safe harbor under the terms of § 512(c). This interlocutory appeal focuses on three issues: (i) whether the safe harbor of § 512(c) applies to pre-1972 sound recordings; (ii) whether evidence of some viewing by Vimeo employees of videos that played all or virtually all of “recognizable” copyrighted songs was sufficient to satisfy the standard of red flag knowledge, which would make Vimeo ineligible for the DMCA safe harbor; and (iii) whether Plaintiffs have shown that Vimeo had a general policy of willful blindness to infringement of sound recordings, which would justify imputing to Vimeo knowledge of the specific infringements.
BACKGROUND
I. The DMCA
The DMCA was enacted in 1998 to implement the World Intellectual Property Organization Copyright Treaty and to update domestic copyright for the digital age. According to its legislative history, Title II, the Online Copyright Infringement Liability Limitation Act was designed to clarify the liability faced by service providers who transmit potentially infringing material over their networks and in the process to ensure that the efficiency of the Internet will continue to improve and that the variety and quality of services on the Internet will expand. The Senate Report expressed the view that without clarification of their liability, service providers may hesitate to make the necessary investment in the expansion of the speed and capacity of the Internet. To that end, the DMCA established four safe harbors, codified at 17 U.S.C. § 512, which protect qualifying Internet service providers from liability for certain claims of copyright infringement. This case focuses on the safe harbor provided by § 512(c), which is supplemented by protections provided in § 512(m).
These portions of the statute undertake, through complex provisions, to establish a compromise, which, on the one hand, augments the protections available to copyright owners, and, on the other, insulates service providers from liability for infringements of which they are unaware, contained in material posted to their sites by users, so as to make it commercially feasible for them to provide valuable Internet services to the public.
The Act augments the rights of copyright owners by establishing a notice-and-takedown regime. The notice-and-takedown regime requires a service provider, to preserve its eligibility for the safe harbor, to “expeditiously … remove … material that is claimed to be infringing,” or disable access to it, whenever the service provider (1) receives a notice of infringing material on the service provider’s site or (2) otherwise becomes aware of the infringement or of circumstances making the infringement apparent. § 512(c)(1)(C), (A)(iii). The provisions favoring Internet service providers, first, immunize those that qualify for the statute’s benefits from liability for copyright infringements posted by users on the providers’ websites if the service providers are unaware of the infringements, and, second, expressly relieve them of any obligation to monitor the postings of users to detect infringements as a condition of qualifying for the safe harbor. Service providers, however, forfeit entitlement to the safe harbor if they fail to expeditiously remove the infringing material upon receipt of notification of the infringement or upon otherwise becoming aware of it.
The terms summarized above are set forth in the following statutory provisions:
(c)(1) In general.—A service provider shall not be liable for monetary relief, or [with certain exceptions] for injunctive or other equitable relief, for infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider, if the service provider—
(A)
(i) does not have actual knowledge that the material or an activity using the material on the system or network is infringing;
(ii) in the absence of such actual knowledge, is not aware of facts or circumstances from which infringing activity is apparent; or
(iii) upon obtaining such knowledge or awareness, acts expeditiously to remove, or disable access to, the material;
(B) does not receive a financial benefit directly attributable to the infringing activity, in a case in which the service provider has the right and ability to control such activity; and
(C) upon notification of claimed infringement as described in paragraph (3), responds expeditiously to remove, or disable access to, the material that is claimed to be infringing or to be the subject of infringing activity.
§ 512(c)(1)(A)-(C).
(m) Protection of privacy.—Nothing in this section shall be construed to condition the applicability of [the safe harbor of § 512(c)] on—
(1) a service provider monitoring its service or affirmatively seeking facts indicating infringing activity, [with exceptions not relevant to this inquiry].
§ 512(m)(1).
II. Vimeo’s Website
Vimeo has had great success as a site for the storage and exhibition of videos. Its Website hosts a wide array of home videos, documentaries, animation, and independent films. Founded in 2005, as of 2012 it hosted more than 31 million videos and had 12.3 million registered users in 49 countries. Approximately 43,000 new videos are uploaded to Vimeo each day. Users post videos onto the website without the intervention or active involvement of Vimeo staff, and Vimeo staff do not watch or prescreen videos before they are made available on the website. When a video is uploaded, it is automatically converted to Vimeo’s format and stored on Vimeo’s servers. Users can view the videos stored on Vimeo servers through a “streaming” process by visiting the website, and in many instances can download them.
All Vimeo users must accept its Terms of Service. These require, inter alia, that users upload (1) only videos that they have created or participated in creating, and (2) only videos for which they possess all necessary rights and that do not infringe on any third party rights. Vimeo’s “Community Guidelines” also provide content restrictions and information about its copyright policy. Every time a user uploads a video, the Website displays three rules: (1) “I will upload videos I created myself,” (2) “I will not upload videos intended for commercial use,” and (3) “I understand that certain types of content are not permitted on Vimeo.” Nonetheless, users have the technical ability to upload videos that do not comply with the rules.
Vimeo employs a “Community Team” of 16 employees to curate content. These employees identify some videos with a “like” sign, occasionally prepare commentary on a video, offer technical assistance to users, participate in forum discussions, and at times inspect videos suspected of violating Vimeo’s policies. So far as we are aware, the record does not indicate that the videos as to which the district court denied summary judgment were inspected by the Community Team for the purpose of detecting infringement.
Vimeo uses multiple computer programs (“Mod Tools”) that assist its Community Team in locating and removing videos that may contain content that violates the Terms of Service. When videos and/or users are identified by one of these tools, Vimeo staff review them individually. Vimeo also enables users to “flag” videos that they believe violate the Terms of Service. Community Moderators evaluate the flagged content and decide whether or not to remove it. The flagging interface also explains how to submit a DMCA claim.
Between October 2008 and November 2010, Vimeo deleted at least 4,000 videos in response to takedown notices by copyright owners. On the three identified occasions in which Plaintiffs had sent Vimeo takedown notices, the district court found that Vimeo had responded “expeditious[ly].” Plaintiffs did not send takedown notices regarding the videos involved in this suit.
While it appears that Vimeo followed a practice of screening the visual content of posted videos for infringement of films, it did not screen the audio portions for infringement of sound recordings. Plaintiffs contend that this fact, together with statements made by Vimeo employees (found in emails), show indifference and willful blindness to infringement of recorded music, and that Vimeo has furthermore actively encouraged users to post infringing videos. Plaintiffs’ evidence of such statements by Vimeo employees included the following:
• Dalas Verdugo, a “Community Director” at Vimeo, responded to a user’s question that he “see[s] all the time at vime[o] videos, (for example Lip-dub) music being used that is copyrig[ht]ed, is there any problem with this?” by telling the user “[w]e allow it, however, if the copyright holder sent us a legal takedown notice, we would have to comply.”
• Blake Whitman, a member of Vimeo’s Community Team, responded to a question regarding Vimeo’s “policy with copyrighted music used as audio for original video content” by telling the user, “[d]on’t ask, don’t tell ;).”
• On another occasion, Whitman responded to a user who asked about using a Radiohead song in a posted video by writing, “We can’t officially tell you that using copyright music is okay. But ….”
• Andrea Allen, a member of Vimeo’s Community Team, received a message from a user providing a link to a video and stating, “I have noticed several people using copyrighted music on Vimeo. What do you do about this?” Allen forwarded the e-mail internally with the comment “[i]gnoring, but sharing.”
• In a response to an email asking whether a user would have copyright “issues” with adding the copyrighted song “Don’t Worry, Be Happy” by Bobby McFerrin as the “soundtrack” to a home video, Allen responded: “The Official answer I must give you is: While we cannot opine specifically on the situation you are referring to, adding a third party’s copyrighted content to a video generally (but not always) constitutes copyright infringement under applicable laws … Off the record answer … Go ahead and post it ….”
• In an e-mail sent to Whitman and Verdugo (and also to all@vimeo.com), Andrew Pile, the Vice President of Product and Development at Vimeo, wrote: “Who wants to start the felons group, where we just film shitty covers of these [Plaintiff EMI] songs and write ‘FUCK EMI’ at the end?”
DISCUSSION
I. Pre-1972 Recordings
The first question we consider is whether the district court erred in granting partial summary judgment to Plaintiffs, rejecting the availability of the DMCA’s safe harbor for infringement of sound recordings fixed prior to February 15, 1972.
[Editor’s note: The court concluded that the safe harbor established by § 512(c) protects a qualifying service provider from liability for infringement of copyright under state law, and therefore vacated the district court’s grant of summary judgment to Plaintiffs as to the availability of the DMCA safe harbor to Vimeo in relation to liability for infringement of pre-1972 sound recordings.]
II. Red Flag Knowledge of Infringement
The second certified question is “Whether, under Viacom Int’l, Inc. v. YouTube, Inc., a service provider’s viewing of a user-generated video containing all or virtually all of a recognizable, copyrighted song may establish ‘facts and circumstances’ giving rise to ‘red flag’ knowledge of infringement” within the meaning of § 512(c)(1)(A)(ii). We consider this question in relation to the district court’s denial of Vimeo’s motion for summary judgment on a number of videos that conform to the facts specified in the district court’s question. The district court’s formulation of the question in connection with its ruling suggests that the court based its denial on the presence of the facts specified in the question. We conclude that Plaintiffs’ establishment of those facts is insufficient to prove red flag knowledge. We therefore vacate the court’s order denying Vimeo summary judgment as to red flag knowledge with respect to those videos.
Our court explained in Viacom that, in order to be disqualified from the benefits of the safe harbor by reason of red flag knowledge under § 512(c)(1)(A)(ii), the service provider must have actually known facts that would make the specific infringement claimed objectively obvious to a reasonable person.
The difference between actual and red flag knowledge is … not between specific and generalized knowledge, but instead between a subjective and an objective standard. In other words, the actual knowledge provision turns on whether the provider actually or ‘subjectively’ knew of specific infringement, while the red flag provision turns on whether the provider was subjectively aware of facts that would have made the specific infringement ‘objectively’ obvious to a reasonable person.
Viacom, 676 F.3d at 31.
The hypothetical “reasonable person” to whom infringement must be obvious is an ordinary person—not endowed with specialized knowledge or expertise concerning music or the laws of copyright. Furthermore, as noted above, § 512(m) makes clear that the service provider’s personnel are under no duty to “affirmatively seek[ ]” indications of infringement. The mere fact that an employee of the service provider has viewed a video posted by a user (absent specific information regarding how much of the video the employee saw or the reason for which it was viewed), and that the video contains all or nearly all of a copyrighted song that is “recognizable,” would be insufficient for many reasons to make infringement obvious to an ordinary reasonable person, who is not an expert in music or the law of copyright. Because the district court’s denial of Vimeo’s motion for summary judgment and concomitant certification of this question suggest that the district court believed that the evidence described in the question, without more, could render the service provider ineligible for the safe harbor, and relied on this proposition to deny summary judgment in every instance in which there was evidence that an employee of Vimeo had seen at least a portion of a video that contained substantially all of a “recognizable” copyrighted song, we vacate the district court’s ruling on this question and remand for reconsideration in light of our further discussion of the standard for red flag knowledge.
A significant aspect of our ruling relates to the burdens of proof on the question of the defendant’s entitlement to the safe harbor—particularly with respect to the issue of red flag knowledge. The issue is potentially confusing because of the large numbers of factual questions that can arise in connection with a claim of the safe harbor. A service provider’s entitlement to the safe harbor is properly seen as an affirmative defense, and therefore must be raised by the defendant. The defendant undoubtedly bears the burden of raising entitlement to the safe harbor and of demonstrating that it has the status of service provider, as defined, and has taken the steps necessary for eligibility. On the other hand, on the question whether the service provider should be disqualified based on the copyright owner’s accusations of misconduct—i.e., by reason of the service provider’s failure to act as the statute requires after receiving the copyright owner’s notification or otherwise acquiring actual or red flag knowledge—the burden of proof more appropriately shifts to the plaintiff. The service provider cannot reasonably be expected to prove broad negatives, providing affidavits of every person who was in its employ during the time the video was on its site, attesting that they did not know of the infringement and did not know of the innumerable facts that might make infringement obvious. And to read the statute as requiring a trial whenever the plaintiff contests the credibility of such attestations would largely destroy the benefit of the safe harbor Congress intended to create.
The Nimmer copyright treatise, noting Congress’s failure to prescribe a roadmap, and observing that “courts [must] muddle through,” furnishes valuable guidance on the shifting allocation of burdens of proof as to a service provider’s entitlement to the protection of the safe harbor. See MELVILLE B. NIMMER & DAVID NIMMER, NIMMER ON COPYRIGHT § 12B.04[A][1][d], n.145 (2015). According to Nimmer, the service provider initially establishes entitlement to the safe harbor by showing that it meets the statutory definition of an eligible service provider (on whose website the allegedly infringing material was placed by a user), and that it has taken the general precautionary steps against infringement that are specified in the statute. The service provider could nonetheless be denied the safe harbor if the plaintiff-rightsholder showed that the service provider had actual knowledge, or red flag knowledge, of the infringement. The burden of proof with respect to actual or red flag knowledge would be on the plaintiff.
Acknowledging that the burden lies on the defendant to establish the affirmative defense, Nimmer explains,
It would seem that defendant may do so [establish entitlement to the safe harbor] by demonstrating that it qualifies as a service provider under the statutory definition, which has established a repeat infringer policy and follows the requisite technical measures. In terms of mental state, the burden would then appear to shift back to plaintiff. To disqualify defendant from the safe harbor, the copyright claimant must show defendant’s actual knowledge or a ‘red flag’ waving in its face. But defendant can still qualify for the safe harbor if, after gaining the requisite mental state, it acted expeditiously to disable access to the infringing content. As to that last matter [expeditious take-down], the burden would seem to rest on defendant.
We agree with Nimmer’s proposed allocation of shifting burdens of proof. Proper allocation of the burden of proof will necessarily have an important bearing on determining entitlements to summary judgment. Following Nimmer’s cogent analysis, it appears that a defendant would, in the first instance, show entitlement to the safe harbor defense by demonstrating its status as a service provider that stores users’ material on its system, that the allegedly infringing matter was placed on its system by a user, and that it has performed precautionary, protective tasks required by § 512 as conditions of eligibility, including that it adopted and reasonably implemented a policy designed to exclude users who repeatedly infringe, that it designated an agent for receipt of notices of infringement, and that it accommodates standard technical measures used by copyright owners to detect infringements.
On the issue of disqualifying knowledge, however, the burden falls on the copyright owner to demonstrate that the service provider acquired knowledge of the infringement, or of facts and circumstances from which infringing activity was obvious, and failed to promptly take down the infringing matter, thus forfeiting its right to the safe harbor. The plaintiff is, of course, entitled to take discovery of the service provider to enable it to make this showing.
A copyright owner’s mere showing that a video posted by a user on the service provider’s site includes substantially all of a recording of recognizable copyrighted music, and that an employee of the service provider saw at least some part of the user’s material, is insufficient to sustain the copyright owner’s burden of proving that the service provider had either actual or red flag knowledge of the infringement. That is so for many reasons.
First, the employee’s viewing might have been brief. The fact that an employee viewed enough of a video to post a brief comment, add it to a channel (such as kitten videos) or hit the “like” button, would not show that she had ascertained that its audio track contains all or virtually all of a piece of music.
Second, the insufficiency of some viewing by a service provider’s employee to prove the viewer’s awareness that a video contains all or virtually all of a song is all the more true in contemplation of the many different business purposes for which the employee might have viewed the video. The purpose of the viewing might include application of technical elements of computer expertise, classification by subject matter, sampling to detect inappropriate obscenity or bigotry, and innumerable other objectives having nothing to do with recognition of infringing music in the soundtrack. Furthermore, the fact that music is “recognizable” (which, in its dictionary definition of “capable of being recognized” would seem to apply to all music that is original and thus distinguishable from other music), or even famous (which is perhaps what the district court meant by “recognizable”), is insufficient to demonstrate that the music was in fact recognized by a hypothetical ordinary individual who has no specialized knowledge of the field of music. Some ordinary people know little or nothing of music. Lovers of one style or category of music may have no familiarity with other categories. For example, 60-year-olds, 40-year-olds, and 20-year-olds, even those who are music lovers, may know and love entirely different bodies of music, so that music intimately familiar to some may be entirely unfamiliar to others.
Furthermore, employees of service providers cannot be assumed to have expertise in the laws of copyright. Even assuming awareness that a user posting contains copyrighted music, the service provider’s employee cannot be expected to know how to distinguish, for example, between infringements and parodies that may qualify as fair use. Nor can every employee of a service provider be automatically expected to know how likely or unlikely it may be that the user who posted the material had authorization to use the copyrighted music. Even an employee who was a copyright expert cannot be expected to know when use of a copyrighted song has been licensed. Additionally, the service provider is under no legal obligation to have its employees investigate to determine the answers to these questions.
It is of course entirely possible that an employee of the service provider who viewed a video did have expertise or knowledge with respect to the market for music and the laws of copyright. The employee may well have known that the work was infringing, or known facts that made this obvious. The copyright owner is entitled to discovery in order to obtain the specific evidence it needs to sustain its burden of showing that the service provider did in fact know of the infringement or of facts that made infringement obvious. But the mere fact that a video contains all or substantially all of a piece of recognizable, or even famous, copyrighted music and was to some extent viewed (or even viewed in its entirety) by some employee of a service provider would be insufficient (without more) to sustain the copyright owner’s burden of showing red flag knowledge.
In sum, a showing by plaintiffs of no more than that some employee of Vimeo had some contact with a user-posted video that played all, or nearly all, of a recognizable song is not sufficient to satisfy plaintiffs’ burden of proof that Vimeo forfeited the safe harbor by reason of red flag knowledge with respect to that video. As it appears that the district court employed that inappropriate standard as the basis for its denial of Vimeo’s motion for summary judgment on numerous videos conforming to that description, we vacate those rulings and remand for further consideration. Vimeo is entitled to summary judgment on those videos as to the red flag knowledge issue, unless plaintiffs can point to evidence sufficient to carry their burden of proving that Vimeo personnel either knew the video was infringing or knew facts making that conclusion obvious to an ordinary person who had no specialized knowledge of music or the laws of copyright.
III. Willful Blindness
Our final issue on this appeal involves Plaintiffs’ contention that the district court, in rejecting their claim of willful blindness, misapplied our teachings in Viacom, which recognized that “the willful blindness doctrine may be applied, in appropriate circumstances, to demonstrate knowledge or awareness of specific instances of infringement under the DMCA.” We disagree with Plaintiffs’ argument and see no reason to disturb the district court’s ruling.
Plaintiffs essentially make three arguments. First, based on evidence that Vimeo monitored videos for infringement of visual content but not for infringement of audio content, they argue that they have demonstrated willful blindness to infringement of music, which justifies liability under Viacom. Their second argument is that Vimeo’s awareness of facts suggesting a likelihood of infringement gave rise to a duty to investigate further, and that Vimeo’s failure to do so showed willful blindness that justifies liability. Finally, they argue that, having encouraged users to post infringing matter, Vimeo could not then close its eyes to the resulting infringements without liability.
The first two arguments are easily disposed of. As we made clear in Viacom, § 512(m) relieves the service provider of obligation to monitor for infringements posted by users on its website. We see no reason why Vimeo’s voluntary undertaking to monitor videos for infringement of visual material should deprive it of the statutory privilege not to monitor for infringement of music. Plaintiffs’ argument is refuted by § 512(m).
Their second argument, that awareness of facts suggesting a likelihood of infringement gave rise to a duty to investigate further, does not fare better. Section 512(c) specifies the consequences of a service provider’s knowledge of facts that might show infringement. If the service provider knows of the infringement, or learns of facts and circumstances that make infringement obvious, it must act expeditiously to take down the infringing matter, or lose the protection of the safe harbor. But we can see no reason to construe the statute as vitiating the protection of § 512(m) and requiring investigation merely because the service provider learns facts raising a suspicion of infringement (as opposed to facts making infringement obvious). Protecting service providers from the expense of monitoring was an important part of the compromise embodied in the safe harbor. Congress’s objective was to serve the public interest by encouraging Internet service providers to make expensive investments in the expansion of the speed and capacity of the Internet by relieving them of burdensome expenses and liabilities to copyright owners, while granting to the latter compensating protections in the service providers’ takedown obligations. If service providers were compelled constantly to take stock of all information their employees may have acquired that might suggest the presence of infringements in user postings, and to undertake monitoring investigations whenever some level of suspicion was surpassed, these obligations would largely undo the value of § 512(m). We see no merit in this argument.
Plaintiffs’ third argument may fare better in theory, but is not supported by the facts of this case, at least as we understand them. In Viacom, we made clear that actual and red flag knowledge under the DMCA ordinarily must relate to “specific infringing material,” and that, because willful blindness is a proxy for knowledge, it too must relate to specific infringements. Plaintiffs argue, however, that Vimeo, in order to expand its business, actively encouraged users to post videos containing infringing material. They argue that, notwithstanding the formulation in Viacom, a service provider cannot adopt a general policy of urging or encouraging users to post infringing material and then escape liability by hiding behind a disingenuous claim of ignorance of the users’ infringements.
We need not decide whether Plaintiffs’ proposed gloss on Viacom is correct as a matter of law. Assuming that it is, Plaintiffs still cannot rely on such a theory in this instance. The evidence cited to us by Plaintiffs, consisting of a handful of sporadic instances (amongst the millions of posted videos) in which Vimeo employees inappropriately encouraged users to post videos that infringed music cannot support a finding of the sort of generalized encouragement of infringement supposed by their legal theory. It therefore cannot suffice to justify stripping Vimeo completely of the protection of § 512(m). Moreover, because that evidence was not shown to relate to any of the videos at issue in this suit, it is insufficient to justify a finding of red flag knowledge, under the principle of Viacom, as to those specific videos. Thus, notwithstanding a few unrelated instances in which its employees improperly encouraged specific infringements, Vimeo can still assert the protection of § 512(m) for the present suit, and claim the benefit of the safe harbor, in the absence of a showing by Plaintiffs of facts sufficient to demonstrate that Vimeo, having actual or red flag knowledge of infringement in the videos that are the subject of Plaintiffs’ suit, failed to promptly take them down.
__________
Check Your Understanding – Capitol Recs.
Question 1. Under what circumstances does the §512(c) safe harbor apply?
Question 2. True or false: If the court had found that Vimeo did not qualify for the §512(c) safe harbor, Vimeo would have been liable for copyright infringement.
Question 3. According to the Vimeo court, what is required in order to establish “red flag knowledge” under §512(c)(1)(A)(ii)?
Some things to consider when reading EMI Christian Music:
- This is another decision pertaining to the § 512(c) safe harbor for “infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider.”
- The court’s treatment of the requirement that internet service providers “implement” a “repeat infringer” policy.
- The discussion of “red flag knowledge” and “willful blindness.”
EMI Christian Music Grp., Inc. v. MP3tunes, LLC
844 F.3d 79 (2d Cir. 2016)
LOHIER, Circuit Judge:
In this appeal we principally consider the requirement of the Digital Millennium Copyright Act (“DMCA”) safe harbor that an internet service provider “adopt[ ] and reasonably implement[ ]” a policy to terminate “repeat infringers.” 17 U.S.C. § 512. Plaintiffs-appellees-cross-appellants are all record companies and music publishers. They filed this copyright infringement lawsuit against MP3tunes, LLC and its founder and Chief Executive Officer Michael Robertson, alleging that two internet music services created by MP3tunes infringed their copyrights in thousands of sound recordings and musical compositions. The two services are MP3tunes.com, which primarily operated as a locker service for storing digital music, and sideload.com, which allowed users to search for free music on the internet.
On summary judgment, the United States District Court for the Southern District of New York granted partial summary judgment to the defendants, holding that MP3tunes had a reasonably implemented repeat infringer policy under § 512. A jury ultimately returned a verdict in favor of the plaintiffs, but the District Court partially overturned the verdict.
For reasons we explain below: (1) we vacate the District Court’s grant of partial summary judgment to the defendants based on its conclusion that MP3tunes qualified for safe harbor protection under the DMCA because the District Court applied too narrow a definition of “repeat infringer”; (2) we reverse the District Court’s grant of judgment as a matter of law to the defendants on claims that MP3tunes permitted infringement of plaintiffs’ copyrights in pre-2007 MP3s and Beatles songs because there was sufficient evidence to allow a reasonable jury to conclude that MP3tunes had red-flag knowledge of, or was willfully blind to, infringing activity involving those categories of protected material; (3) we remand for further proceedings related to claims arising out of the District Court’s grant of partial summary judgment; and (4) we affirm the judgment in all other respects.
BACKGROUND
I. MP3tunes.com and Sideload.com
Robertson founded MP3tunes in 2005. It is undisputed that, by then, Robertson was familiar with both the online music industry and copyright litigation, having previously run the music site MP3.com, against which a copyright infringement judgment was entered in 2000. See UMG Recordings, Inc. v. MP3.com, Inc., 92 F.Supp.2d 349, 350 (S.D.N.Y. 2000). While recruiting for MP3tunes several years later, Robertson emphasized, “[T]his will be VERY big news. Major labels selling MP3s. MP3.com guy back to rejuvenate MP3 business. Largest copyright infringer of all time back at it again. Lots of juicy press angles.”
MP3tunes.com was MP3tunes’s first project. Initially, customers could visit MP3tunes.com and purchase MP3 versions of music created by musicians who were not associated with major record labels. In 2005 MP3tunes.com added a “locker storage” service, which charged users a fee to store music on the MP3tunes server. A user who uploaded songs to her “locker” (through LockerSync, a free plugin on the site) could play the music through other internet-enabled devices.
MP3tunes owned and operated a second website, sideload.com, that allowed users to search for free music on the internet. Sideload.com offered a free plug-in to enable users to “sideload” (or, to use Robertson’s definition of “sideload,” enabled users to “download[ ]” directly to their MP3tunes lockers) free songs that they found on the internet. Songs sideloaded into users’ lockers were then added to sideload.com’s index of searchable songs. This meant that the more songs users sideloaded from the internet, the more free music became available for sideload.com users to stream, download, or sideload into their own lockers. MP3tunes’s executives, including Robertson, used their own accounts with MP3tunes to store sideloaded songs.
Users of MP3tunes.com could store a certain amount of music through the service for free and could purchase additional storage space for tiered fees, while storage associated with sideloaded songs did not count against the free storage limit. At Robertson’s direction, MP3tunes strove to expand sideload.com’s catalog by encouraging users to upload songs to the sideload.com index. For example, members of MP3tunes’s staff were encouraged to upload songs from their own accounts, even when those songs came from websites that appeared to contain infringing material. Robertson directed MP3tunes employee Sharmaine Lindahl to provide MP3tunes employees a list of sites featuring free MP3s “for sideloading purposes.” Lindahl observed that one of the sites on the list “look[ed] to be mainly pirated music.” MP3tunes also encouraged repeated sideloading among MP3tunes.com users by creating a “Sideload Hall of Fame” consisting of the “top 25 Sideloaders with accounts at MP3tunes.com.”
DISCUSSION
I. The Plaintiffs’ Appeal
The plaintiffs challenge three rulings made by the District Court: first, that MP3tunes reasonably implemented a repeat infringer policy and was eligible for DMCA safe harbor protection for pre- and post-1972 songs; second, that the jury’s finding of red-flag knowledge or willful blindness with respect to certain categories of songs was wrong as a matter of law; and third, that the plaintiffs were entitled to only one award of statutory damages for songs where the copyright to the musical composition and the copyright to the sound recording were owned by different holders.
We address each of these challenges.
A. “Reasonably Implemented” Repeat Infringer Policy
The DMCA shields a service provider from liability for the infringing acts of its users if the provider satisfies certain conditions. One of those conditions requires that a service provider “adopt[ ] and reasonably implement[ ] … a policy that provides for the termination in appropriate circumstances of subscribers and account holders of the service provider’s system or network who are repeat infringers.” 17 U.S.C. § 512(i)(1)(A). The plaintiffs argue that MP3tunes never reasonably implemented a repeat-infringer policy because it failed to track users who repeatedly created links to infringing content in the sideload.com index or who copied files from those links, which appeared on multiple takedown notices sent to MP3tunes.
In addressing this argument, we answer two questions: first, whether certain MP3tunes users qualified as “repeat infringers”; and second, if so, whether MP3tunes reasonably implemented a policy directed at them.
We begin with the first question. The District Court held that “[t]he purpose of subsection 512(i) is to deny protection to websites that tolerate users who flagrantly disrespect copyrights.” For the purposes of § 512(i)(1)(A), it defined a “repeat infringer” as a user who posts or uploads infringing content “to the internet for the world to experience or copy” knowing that the conduct infringes another’s copyright. In contrast, the District Court believed, a user who downloads or copies “songs from third-party sites for their personal entertainment” could not be a “repeat infringer.” The District Court thus concluded that only users who upload infringing content are “blatant infringers that internet service providers are obligated to ban from their websites.”
We reject this definition of a “repeat infringer,” which finds no support in the text, structure, or legislative history of the DMCA.
Starting with the text, we note that the DMCA does not itself define “repeat infringers.” But where a statute does not define a term, we give the term its ordinary meaning. In this context, we take “repeat” to mean “a person who does something … again or repeatedly,” Oxford English Dictionary (3d ed. 2009), while an “infringer” is “[s]omeone who interferes with one of the exclusive rights of a … copyright,” Infringer, Black’s Law Dictionary (10th ed. 2014). Copyright infringement is a strict liability offense in the sense that a plaintiff is not required to prove unlawful intent or culpability, see Cartoon Network LP, LLLP v. CSC Holdings, Inc., 536 F.3d 121, 130 (2d Cir. 2008); Shapiro, Bernstein & Co. v. H. L. Green Co., 316 F.2d 304, 308 (2d Cir. 1963), and a user does not have to share copyrighted works in order to infringe a copyright. In the context of this case, all it took to be a “repeat infringer” was to repeatedly sideload copyrighted material for personal use.
We turn next to the structure and context of the DMCA, interpreting the term “repeat infringer” against the backdrop of the statute as a whole. It is important to recall that the DMCA imposes certain requirements on service providers in exchange for limitations on liability. It would make little sense to link that limitation on liability to the knowledge of users. Indeed, the DMCA explicitly relieves service providers from having to affirmatively monitor their users for infringement—something that would likely be required should MP3tunes have to ascertain its users’ knowledge. See 17 U.S.C. § 512(m)(1) (“Nothing in this section shall be construed to condition the applicability of [the DMCA safe harbors] on a service provider monitoring its service or affirmatively seeking facts indicating infringing activity.”).
The legislative history also confirms our view that the District Court’s definition of “repeat infringer” as limited to willful infringement is too narrow. The Senate and House reports accompanying the DMCA recognize a difference between inadvertent and willful infringement. But both reports also assert that a “repeat infringer” requirement is meant to deter those “who repeatedly or flagrantly abuse their access to the Internet through disrespect for the intellectual property rights of others.” In other words, the legislative history of the DMCA indicates that a “repeat infringer” does not need to know of the infringing nature of its online activities.
Finally, none of our sister circuits has adopted the District Court’s definition of “repeat infringer” to include only those who willfully infringe copyrights. To the contrary, the Seventh Circuit has suggested that the term covers users of file-sharing services who are “ignorant or more commonly disdainful of copyright.” See In re Aimster Copyright Litig., 334 F.3d 643, 645 (7th Cir. 2003).
Our view of what Congress meant by the term “repeat infringer” leads us to conclude that the District Court improperly granted summary judgment. Prior to trial, there was clearly enough disputed evidence relating to MP3tunes’s policy regarding infringers to conclude that summary judgment was inappropriate. To show that it reasonably implemented such a policy, MP3tunes proffered evidence at the summary judgment stage that it terminated 153 users who shared locker passwords. In response, though, the plaintiffs demonstrated that MP3tunes did not even try to connect known infringing activity of which it became aware through takedown notices to users who repeatedly sideloaded files and created links to that infringing content in the sideload.com index.
Furthermore, the plaintiffs presented evidence that MP3tunes executives were encouraged to and did personally sideload songs from blatantly infringing websites. The same executives made the songs available to sideload.com users. There was also evidence that MP3tunes was capable of cataloging the sideloads of each MP3tunes user. A jury could reasonably infer from that evidence that MP3tunes actually knew of specific repeat infringers and failed to take action.
A reasonable jury alternatively could have determined that MP3tunes consciously avoided knowing about specific repeat infringers using its services, even though the infringement was rampant and obvious. In Viacom, we held that “the willful blindness doctrine may be applied, in appropriate circumstances, to demonstrate knowledge or awareness of specific instances of infringement under the DMCA.” Thus, at trial the plaintiffs could prevail by demonstrating that MP3tunes’s failure to track users who created links to infringing content identified on takedown notices or who copied files from those links evidenced its willful blindness to the repeat infringing activity of its users.
Our conclusion that the District Court improperly granted partial summary judgment to MP3tunes on the basis of its policy regarding “repeat infringers” is not inconsistent with the DMCA’s provision declaring that safe harbor protection cannot be conditioned on “a service provider monitoring its service or affirmatively seeking facts indicating infringing activity.” (17 U.S.C. § 512(m)(1)). Based on the available evidence, a reasonable jury could have concluded that it was reasonable for MP3tunes to track users who repeatedly created links to infringing content in the sideload.com index or who copied files from those links. See Aimster, 334 F.3d at 655 (“The common element of [DMCA] safe harbors is that the service provider must do what it can reasonably be asked to do to prevent the use of its service by ‘repeat infringers.’ ”). After all, MP3tunes had already tracked and removed 153 users “who allowed others to access their lockers and copy music files without authorization”; by comparison, requiring MP3tunes to extend that policy to users who sideloaded infringing content may not be an unreasonably burdensome request. Furthermore, doing so would not require MP3tunes to “monitor” or “affirmatively seek facts” about infringing activity in a manner inconsistent with § 512(m)(1) because it already had adequate information at its disposal in the form of takedown notices provided by EMI as to which links were allegedly infringing. MP3tunes would simply have had to make use of information already within its possession and connect that information to known users. While the defendants could yet make the case at trial that it was unreasonable under the circumstances to ask MP3tunes to identify users who repeatedly infringed plaintiffs’ copyrights by sideloading music files, no evidence available at the summary judgment stage compelled that conclusion as a matter of law.
For these reasons, we vacate the District Court’s grant of summary judgment and remand for further proceedings.
B. Red-Flag Knowledge and Willful Blindness
We turn next to the District Court’s determination that the jury’s finding of red-flag knowledge or willful blindness with respect to certain categories of songs was wrong as a matter of law. We start with the proposition that even if a service provider has a reasonably implemented repeat infringer policy, it relinquishes the DMCA’s safe harbor if it, first, has “actual knowledge that the material or an activity using the material on the system or network is infringing” or “in the absence of such actual knowledge, is [ ] aware of facts or circumstances from which infringing activity is apparent,” and second, “upon obtaining such knowledge or awareness, [does not] act[ ] expeditiously to remove, or disable access to, the material.” 17 U.S.C. § 512(c)(1)(A); see also 17 U.S.C. § 512(d)(1). At trial, the plaintiffs contended that MP3tunes was “aware of facts or circumstances from which infringing activity was apparent”—or in other words had “red-flag knowledge” or willful blindness with respect to several categories of songs. The jury found that MP3tunes had knowledge as to four categories of files: (1) those stored on domains identified in takedown notices as having ten or more infringing files; (2) sideloads of MP3s before January 2007; (3) certain sideloads by MP3tunes executives; and (4) works by the Beatles. The District Court upheld the jury’s finding of red-flag knowledge with respect to certain songs and subsets of songs but granted the defendants judgment as a matter of law as to two categories of songs that are the subject of our review: MP3s from major labels issued before 2007, and all songs by the Beatles.
We have already explained that the DMCA does not impose “an amorphous obligation to take commercially reasonable steps in response to a generalized awareness of infringement.” Viacom, 676 F.3d at 31. Accordingly, “[o]n the issue of disqualifying knowledge … the burden falls on the copyright owner to demonstrate that the service provider acquired knowledge of the infringement, or of facts and circumstances from which infringing activity was obvious, and failed to promptly take down the infringing matter, thus forfeiting its right to the safe harbor.” Capitol Records, LLC v. Vimeo, LLC, 826 F.3d 78, 95 (2d Cir. 2016). In other words, a copyright owner must point to a defendant’s “actual knowledge or awareness of facts or circumstances that indicate specific and identifiable instances of infringement.”
With this principle in mind, we conclude that the trial evidence in this case, viewed in the light most favorable to the plaintiffs, showed that MP3tunes and Robertson knew that major music labels generally had not even authorized their music to be distributed in the format most widely available on sideload.com, let alone authorized it to be shared on the internet. In particular, Robertson apparently knew that major record labels had not offered songs in MP3 format until 2007. In January 2007, in connection with MP3tunes’s MP3 sale model, for example, Robertson admitted that “popular acts have never before sold tracks in MP3 formats.” With respect to MP3s sideloaded before 2007, therefore, the jury reasonably could have concluded that MP3tunes and Robertson were aware of “facts and circumstances that make infringement obvious.”
What prompted the District Court to conclude otherwise? In granting judgment as a matter of law to the defendants on this issue, the District Court explained that barring MP3tunes from the DMCA safe harbor “would require Defendants to actively conduct routine searches and eliminate material likely to be infringing.” It therefore understandably concluded that imposing such a duty clashed with the DMCA’s “express disavowal of a duty to affirmatively monitor.” Under the circumstances of this case, we respectfully disagree with the District Court’s assessment, primarily for two reasons.
First, the jury was clearly instructed, and we presume it understood, that MP3tunes had no continuing, affirmative duty to monitor its servers for infringement. The jury could comply with that instruction and still find that MP3tunes was required to disable access to pre-2007 songs by “act[ing] expeditiously to remove, or disable access to” the pre-2007 songs “upon obtaining such knowledge or awareness.” 17 U.S.C. § 512(c)(1)(A)(iii). There was evidence at trial that MP3tunes could disable access. Indeed, an expert testified that searching through libraries of MP3 songs was a common function of MP3tunes’s business. The jury was therefore permitted to conclude that a time-limited, targeted duty—even if encompassing a large number of songs—does not give rise to an “amorphous” duty to monitor in contravention of the DMCA. Viacom, 676 F.3d at 31; see also id. at 34 (suggesting that a reasonable jury could find red-flag knowledge with respect to groups of clips). The same is true of the Beatles songs. The jury heard evidence that Robertson knew there had been no legal online distribution of Beatles tracks before 2010, other than one track used within a video game. Robertson further admitted that he authored a 2009 e-mail that showed he was aware of the plaintiffs’ position that “[the] Beatles have never authorized their songs to be available digitally.” And MP3tunes was made aware through user emails that Beatles songs such as “Strawberry Fields Forever” were on sideload.com’s index. The jury could have reasonably concluded that MP3tunes had red-flag knowledge of, or was willfully blind to, the infringing nature of the Beatles tracks on its servers and failed to “act[ ] expeditiously” to remove them.
Second, the jury could reasonably have found that MP3tunes conceived of and was designed to facilitate infringement based in part on evidence presented at trial that MP3tunes “actively encourag[ed] infringement” and that Robertson and MP3tunes executives “personally used [sideload.com] to download infringing material.” Although such evidence might not alone support a separate finding of red-flag knowledge or willful blindness as to users, the jury could certainly rely on it in deciding whether MP3tunes was entitled to the DMCA safe harbor, see Fung, 710 F.3d at 1040 (holding that “aspects of the inducing behavior that give rise to liability are relevant to the operation of some of the DMCA safe harbors and can, in some circumstances, preclude their application”). Indeed, the jury could reasonably have understood Robertson to have admitted on cross-examination that sideload.com “was premised on the notion that everything that was on the internet that was not locked down could be sideloaded into the site.” And in editing sideload.com’s Frequently Asked Questions (“FAQs”), Robertson emphasized that the site should tell users that its music is “legal to download” because “[s]ideload.com does not store any music, but rather links to files publicly available [in] other places on the net.”
For these reasons, we reverse the District Court’s ruling vacating the jury verdict with respect to red-flag knowledge and willful blindness for pre-2007 MP3s and Beatles songs.
__________
Check Your Understanding – EMI Christian Music
Question 1. True or false: The DMCA explicitly requires service providers to affirmatively monitor their users for infringement.
Question 2. What does the DMCA require Internet service providers to do with respect to “repeat infringers” in order to remain eligible for the safe harbor?
Some things to consider when reading Ventura Content:
- This is another decision pertaining to the § 512(c) safe harbor for “infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider.”
- The steps the defendant took to remain eligible for the safe harbor, and how close he came to losing it. What is the lesson for ISP operators?
- The court’s discussion and interpretation of the phrase “by reason of storage at the direction of the user.”
- The court’s application of the requirement that an ISP implement a policy “reasonably” for terminating repeat infringers to this particular defendant, which was essentially a one-man operation.
- The court’s comparison of the facts of this case with UMG, Mavrix, and Fung.
Ventura Content, Ltd. v. Motherless, Inc.
885 F.3d 597 (9th Cir. 2018)
KLEINFELD, Senior Circuit Judge:
We address the safe harbor provision in the Digital Millennium Copyright Act and conclude that the defendants are entitled to safe harbor.
FACTS
This case was decided on summary judgment. Joshua Lange, the named defendant, owns, operates, and is the sole employee of his internet site, Motherless.com. The site contains over 12.6 million mostly pornographic pictures and video clips. The content generally has been uploaded by the site’s users, and the uploaders may or may not have created the material. Motherless stores the content on servers that Lange owns. In 2011, the website had nearly 750,000 active users and about 611,000 visits daily.
The Terms of Use posted on the site provide a “partial list of content that is illegal or prohibited,” such as child pornography, bestiality, and copyright-infringing material. The Terms prohibit posting copyrighted material without the prior written consent of the copyright owner, and they invite takedown notices for infringing material. The website gives directions for emailing takedown notices. Motherless also uses a software program that provides copyright owners with a link and password so that they can directly delete infringing material themselves, without having to send a takedown notice to Lange.
Lange explained at his deposition that he and an independent contractor review all the pictures and videos before they are displayed on the site. Lange uses software that generates a thumbnail of each picture, and five thumbnails of each video clip at the 20%, 40%, 60%, 80%, and 100% time points in the clip (e.g., for a two-minute clip, at 24, 48, 72, 96, and 120 seconds into the clip). Lange or his contractor looks at each thumbnail for “obvious signs of child pornography, copyright notices, watermarks, and any other information that would indicate that the [material] contains illegal content or violates” the Terms of Use. Lange spends three to six hours a day, seven days a week, looking at the uploads, and he estimates that he reviews between 30,000 and 40,000 images per day. He looks at about 80 thumbnails per minute to keep up with the volume of uploads. He deletes any violating material that he or his contractor spots. Whenever he finds child pornography, he contacts the National Center for Missing and Exploited Children so that criminal action can be instigated against the uploader.
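[Eds.: The sampling arithmetic the opinion describes, five thumbnails at evenly spaced fractions of a clip's running time, can be sketched in a few lines of Python. This is an illustration only; the opinion does not disclose the actual software Motherless used.]

```python
def thumbnail_times(duration_seconds: float, samples: int = 5) -> list[float]:
    """Return sample times at 20%, 40%, 60%, 80%, and 100% of a clip,
    matching the review process described in the opinion."""
    return [duration_seconds * i / samples for i in range(1, samples + 1)]

# The opinion's example: a two-minute (120-second) clip.
print(thumbnail_times(120))  # [24.0, 48.0, 72.0, 96.0, 120.0]
```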
Lange personally examines all copyright infringement notices, whether DMCA-compliant or not, and deletes any infringing content that he can find. He locates infringing content using the URL, that is, the web address that appears at the top of the screen when an image or clip is on the screen. The complainant identifies the material by the URL and Lange deletes it as quickly as he can, ordinarily within a day or two. He also sends an email to the user who uploaded the video or picture, notifying him that the uploaded material has been deleted. Motherless uses software to prevent users from re-uploading previously deleted material. Since 2008, Motherless has received over 3,500 takedown notices. Lange has deleted over 4.5 million pictures and videos for violating Motherless’s Terms of Use and estimates that 4% to 6% of the deleted files were for copyright infringement.
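[Eds.: The opinion says only that Motherless "uses software to prevent users from re-uploading previously deleted material," without saying how. A common technique for this, sketched below as a hypothetical, is a blocklist of content fingerprints: hash each deleted file and reject any future upload with a matching hash.]

```python
import hashlib

# Hypothetical sketch of re-upload blocking via a content-hash blocklist.
# The opinion does not specify Motherless's actual mechanism.
deleted_hashes: set[str] = set()

def fingerprint(file_bytes: bytes) -> str:
    """Fingerprint a file by its SHA-256 digest."""
    return hashlib.sha256(file_bytes).hexdigest()

def record_deletion(file_bytes: bytes) -> None:
    """Remember a deleted file so identical re-uploads can be refused."""
    deleted_hashes.add(fingerprint(file_bytes))

def upload_allowed(file_bytes: bytes) -> bool:
    """Reject exact copies of previously deleted files."""
    return fingerprint(file_bytes) not in deleted_hashes
```

Note that an exact-hash blocklist catches only byte-identical re-uploads; a re-encoded or trimmed copy would pass, which is one reason takedown notices remain central to the statutory scheme.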
Motherless does not have a written policy instructing its employees on when to expel repeat infringers; there are no employees to instruct. Lange personally terminates repeat infringers; the independent contractor does not terminate repeat infringers. Termination is a matter of Lange’s judgment. He considers the following factors in deciding whether to terminate a repeat infringer: (1) the volume of complaints; (2) the amount of linked content in the complaints; (3) the timespan between notices; (4) the length of time the alleged infringer’s account had been active; (5) the amount of total content the account has; (6) whether the user is maliciously and intentionally uploading infringing content or uploading content without knowing the source; and (7) whether the takedown notices were DMCA-compliant. Between 2008 and 2011, Lange terminated over 33,000 user accounts for violating the website’s Terms of Use. Lange estimated that he terminated about 4% to 6% of these users for possible copyright infringement, which would be between 1,320 and 1,980 users.
Ventura Content, the plaintiff, creates and distributes pornographic movies. Ventura found 33 clips on Motherless from movies it had created and had not licensed to Motherless. The infringing clips were anywhere from 20 seconds to 46 minutes long, mostly 15 minutes or longer. It is undisputed that the clips infringed on Ventura’s copyright.
All the infringing clips were segments of Ventura movies, not merely pictures, and not the full movie. None of the clips contained anything to indicate that Ventura owned the copyright. A few had watermarks naming other websites, which appear to be other pornography aggregators, but there were no Ventura watermarks, credits, or other pieces of information suggesting in any way that Ventura owned the copyright. These clips were visited 31,400 times during the 20 months they were posted on Motherless. During this time, Motherless received about 600,000 visits per day, so the views of the Ventura clips were a minuscule proportion of the total views on Motherless.
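[Eds.: A back-of-the-envelope calculation from the figures stated in the opinion confirms the court's "minuscule proportion" characterization. The 30-day month is our assumption; the opinion gives only "20 months" and "about 600,000 visits per day."]

```python
# Rough scale of the Ventura clips' views relative to total site traffic,
# using the figures stated in the opinion.
ventura_views = 31_400        # views of the 33 infringing clips
daily_visits = 600_000        # approximate daily visits to Motherless
days_posted = 20 * 30         # 20 months, assuming 30-day months
total_visits = daily_visits * days_posted

ventura_share = ventura_views / total_visits
print(f"Ventura clips' share of site traffic: {ventura_share:.4%}")  # well under 0.01%
```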
Eight users uploaded the 33 infringing clips. Lange terminated two of these users by 2012 (after this litigation began), one for repeat copyright infringement. There is no evidence to show that whoever uploaded the Ventura material got any credits or other compensation for these uploads. Lange does not remember reviewing any of these videos. Ventura did not send DMCA notices or any other sort of takedown notice for the infringing material. Nor did Ventura remove the material itself, as Motherless’s software link enabled it to do. Ventura’s first notice of infringement to Motherless was this lawsuit.
After Lange was served with the complaint in this case, he asked Ventura to send him the URLs for the infringing clips so that he could delete them. Ventura did not respond the first time Lange asked for the URLs, so Lange asked again. Ventura answered his follow-up request. On the day that Ventura gave Lange the URLs, Lange deleted the infringing clips.
Ventura sued Motherless and Lange for copyright infringement under federal law and for unfair business practices under California law. Ventura sought damages and an injunction, but the injunction claim became moot when Lange deleted all the infringing clips. The district court granted summary judgment in favor of Motherless and Lange on the federal copyright claim.
ANALYSIS
I. Safe Harbor
The Digital Millennium Copyright Act places the burden of policing copyright infringement on the copyright owner, not on the person or firm storing and hosting the material. It is undisputed that Ventura owned the copyrights to the 33 clips that were stored and displayed by Motherless.
The safe harbor clause at issue in this case, 17 U.S.C. § 512(c), provides as follows:
(1) IN GENERAL.—A service provider shall not be liable for monetary relief … for infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider, if the service provider—
(A) (i) does not have actual knowledge that the material or an activity using the material on the system or network is infringing;
(ii) in the absence of such actual knowledge, is not aware of facts or circumstances from which infringing activity is apparent; or
(iii) upon obtaining such knowledge or awareness, acts expeditiously to remove, or disable access to, the material;
(B) does not receive a financial benefit directly attributable to the infringing activity, in a case in which the service provider has the right and ability to control such activity; and
(C) upon notification of claimed infringement as described in paragraph (3), responds expeditiously to remove, or disable access to, the material that is claimed to be infringing or to be the subject of infringing activity.
Thus for a service provider to get safe harbor protection despite its infringement, it must not know of the infringement, and the infringement cannot be apparent. It must also take down or prevent access to the infringing material as soon as it learns about it or receives a DMCA notice. And it must not directly benefit financially from the infringement in situations when it can control the activity.
There is an additional condition on safe harbor eligibility: the service provider must have a policy to terminate users who repeatedly infringe on copyrights, and it must implement that policy reasonably. The statute setting this condition, 17 U.S.C. § 512(i), reads as follows:
CONDITIONS FOR ELIGIBILITY.—
(1) ACCOMMODATION OF TECHNOLOGY.—The limitations on liability established by this section shall apply to a service provider only if the service provider—
(A) has adopted and reasonably implemented, and informs subscribers and account holders of the service provider’s system or network of, a policy that provides for the termination in appropriate circumstances of subscribers and account holders of the service provider’s system or network who are repeat infringers; and
(B) accommodates and does not interfere with standard technical measures.
The overall scheme is plain enough at a superficial level. A service provider must delete or disable access to known or apparent infringing material, as well as material for which he receives a statutorily compliant takedown notice. He must also terminate repeat infringers when appropriate. The copyright owner, not the service provider, has the burden of policing infringement. But the service provider, to maintain its shield, must respond expeditiously and effectively to the policing. If these conditions are met, the service provider will not be financially liable for infringing material on his website. The details, of course, get complicated, and we must address those complications.
A. “By reason of the storage at the direction of a user”
Section 512(c) says that, subject to additional conditions discussed below, a service provider will not be liable “for infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider.”
Ventura argues that Lange is not entirely passive because he screens out child pornography, bestiality, and copyright infringement that he spots. The argument is that by screening out this material, Motherless effectively directs what is posted instead of enabling posting “at the direction of a user.” But Ventura cites no authority for the unlikely proposition that screening out illegal material eliminates the safe harbor shield. Indeed, section 512(m) says that the law should not be construed to eliminate the safe harbor because a service provider monitors for infringement or disables access to material where the conduct depicted is prohibited by law. We find it counterintuitive, to put it mildly, to imagine that Congress intended to deprive a website of the safe harbor because it screened out child pornography and bestiality rather than displaying it. Instead, we read section 512(m) to say that Congress expressly provided that such screening does not deprive a website of safe harbor protection.
Ventura also argues that because Motherless groups together the tagged videos and pictures so that users can find what they want, it is Motherless, rather than the user, who directs the “storage.” But Lange testified, and Ventura does not dispute, that his editorial principle is as announced on the site: “anything legal stays.” Ventura merely argues that this case can be distinguished from opinions which applied the safe harbor to sites that screen and alter content.
Our controlling case is UMG Recordings, Inc. v. Shelter Capital Partners, LLC. There, we addressed whether a website that enabled sharing music videos, some of which turned out to be infringing, was entitled to safe harbor. The videos in UMG were not just stored, as one might store family photographs on a “cloud” service such as iCloud, Dropbox, or Google Drive. Users uploaded material and watched and listened to videos and songs. Some of the music was infringing. We held in UMG that the phrase “by reason of the storage at the direction of a user” covers more than “mere electronic storage lockers.” It allows service providers to perform access-facilitating processes such as breaking up the files for faster viewing and converting them to a Flash format.
As in UMG, Motherless’s users, not the website, decide what to upload and what file names and tags to use. Our holding in UMG disposes of the argument that altering the file format to make it accessible before posting, and enabling users to apply search tags to uploads, takes the posting of the content out of the “at the direction of a user” definition. It also disposes of the argument that being anything more than an electronic storage locker, such as by facilitating user access to files that other users posted, deprives the website of safe harbor protection.
Ventura argues that by using software to highlight the “Most Popular” material, and by giving credits to users who post the most popular material, Motherless is posting at its own direction rather than hosting material posted at the direction of the user. This argument is inconsistent with our holding in UMG. It is also inconsistent with the meaning of the words “at the direction of the user.” The users post what they post, popular or not. Motherless does not screen out material for relatively low popularity, and of course most postings do not fall within the “Most Popular” category. Yet there they are, up on the site, because the users put them there.
We recently addressed the phrase “by reason of storage at the direction of the user” in Mavrix Photographs, LLC v. LiveJournal, Inc. The website in Mavrix was not entitled to summary judgment on the safe harbor issue because there was a genuine issue of fact as to whether the storage of material on the site was at the direction of the site or at the direction of its users. The Mavrix website used moderators to review user submissions for substance. It published only those submissions that, in the moderators’ judgment, were “relevant to new and exciting celebrity news.” We remanded because genuine issues of material fact remained as to “whether the moderators were LiveJournal’s agents.” We restated in Mavrix what we had held in UMG: “Infringing material is stored at the direction of the user if the service provider played no role in making that infringing material accessible on its site or if the service provider carried out activities that were ‘narrowly directed’ towards enhancing the accessibility of the posts.” And we further noted that section 512(m) of the statute expressly provided that deleting unlawful material did not deprive the site of safe harbor protection.
The case before us falls within UMG, not Mavrix. The moderators in Mavrix directed posting only if they thought the user-submitted material was “new and exciting celebrity news.” Lange and his contractor do not review whether the pornography submitted by users is “new and exciting” or meets any other discretionary standards. The Motherless rule is “anything legal stays.” Lange does not exercise judgment in what to host. His editing is limited to the kind protected by section 512(m), screening out illegal material.
Although UMG compels our holding, we also note that our sister circuits agree with the critical point that “storage at the direction of a user” affords safe harbor protection to sites where users can look at other users’ uploads, not just to what UMG called “electronic storage lockers.” The Second Circuit ruled in Viacom International, Inc. v. YouTube, Inc. that YouTube was entitled to safe harbor—even though it converted user-submitted videos into a standard display format and used an algorithm to suggest related videos—because “to exclude these automated functions from the safe harbor would eviscerate the protection afforded to service providers by § 512(c).” Likewise, the Fourth Circuit held in CoStar Group, Inc. v. LoopNet, Inc. that a real estate listing website that allowed subscribers to post listings was not liable for copyright infringement even though an employee cursorily reviewed the photographs for infringing material. The CoStar majority analogized the service provider to an owner of a traditional copy machine “who has stationed a guard by the door to turn away customers who are attempting to duplicate clearly copyrighted works.” And the Tenth Circuit held in BWP Media USA, Inc. v. Clarity Digital Group, LLC that a news site that relied on user-generated content was entitled to safe harbor even though it instructed users on topics to write about and suggested that users include pictures or slide shows with their articles. Citing to UMG, the Tenth Circuit explained that “if the infringing content has merely gone through a screening or automated process, the [service provider] will generally benefit from the safe harbor’s protection.”
Because the users, not Motherless, decided what to post—except for Lange’s exclusion of illegal material and his original upload when he created the website—the material, including Ventura’s, was “posted at the direction of users.”
B. Knowledge and Expeditious Takedown
Though the statutory scheme places the burden of policing infringement on the copyright owner, the scheme does not allow a website owner to avoid responsibility for knowingly selling pirated material by deleting a particular posting only when he gets caught. Instead, the statute excludes blatant pirates from the safe harbor by requiring that a service provider:
(i) does not have actual knowledge that the material or an activity using the material on the system or network is infringing;
(ii) in the absence of such actual knowledge, is not aware of facts or circumstances from which infringing activity is apparent; or
(iii) upon obtaining such knowledge or awareness, acts expeditiously to remove, or disable access to, the material.
If the website provider actually knows that the material for which relief is sought is infringing, or if the infringement is “apparent,” he remains liable if he does not expeditiously remove the material upon gaining knowledge.
i. Actual Knowledge
Ventura and its expert argue that Lange must have had actual knowledge that the Ventura clips infringed on its copyright because they appeared to be professionally produced and because a few had watermarks. That argument is unavailing.
Ventura argues that Motherless had to know the clips were infringing because, it claims, the high quality of the videos showed professional production. But the conclusion does not follow from the premise. Professionally created work often is posted online to publicize and attract business for the creator. Amateurs often do professional quality work in artistic endeavors, and amateurs are no less entitled to copyright protection than professionals, so it is not apparent why professionalism matters.
Ventura could have indicated its ownership by watermarking its videos as copyrighted, but it did not. And Ventura could have notified Motherless that the clips infringed on its copyright when it discovered them on Motherless’s site, but it did not. Ventura’s “decision to forgo the DMCA notice protocol stripped it of the most powerful evidence of a service provider’s knowledge—actual notice of infringement from the copyright holder.” If Ventura had notified Motherless about these 33 infringing videos before filing this lawsuit and Motherless had not taken them down, then Motherless would have lost its safe harbor. On the facts of this record, however, Ventura did not establish a genuine issue of fact as to actual knowledge. The statutory phrase “actual knowledge” means what it says: knowledge that is actual, not merely a possible inference from ambiguous circumstances.
ii. Apparent Knowledge
Actual knowledge is not necessary to deprive an infringer of safe harbor. Motherless would also lose its safe harbor if it was “aware of facts or circumstances from which infringing activity is apparent” and did not “act[ ] expeditiously to remove, or disable access to, the material.” This is different from actual knowledge because instead of looking at subjective thoughts, we look at objective facts and circumstances from which the specific infringement would be obvious to a reasonable person. The statutory term “apparent” is often described, in the cases and secondary literature, as “red flag” knowledge. The sports metaphor is no more helpful than the statutory word “apparent,” and we use the words interchangeably.
Ventura’s arguments for “apparent” awareness are similar to its arguments for actual knowledge. And the same reasons for absence of knowledge apply. There is nothing about the Ventura clips that would make infringement apparent. That is not to say that Motherless did not know that infringement was probably occurring on its website. It is hard to imagine that a site with 12.6 million pictures and video clips uploaded by users would not contain some material that users had uploaded without authorization. It is also hard to imagine that Lange and his contractor would have spotted all the infringing videos with the few seconds of viewing they gave to each one.
Nevertheless, we held in UMG that hosting material capable of copyright protection, with the general knowledge that the site could be used to share infringing material, is not enough to impute knowledge. The material in UMG was much more likely to arouse awareness of infringement than the material in this case, because it included music videos by well-known celebrities like 50 Cent, Avril Lavigne, and Britney Spears. We held that this sort of knowledge was not enough to amount to red flag knowledge.
Similarly, in Capitol Records, LLC v. Vimeo, LLC, the Second Circuit addressed whether a service provider may be found to have apparent knowledge because it relies on mass uploading by users. Its reasoning is instructive. The service provider in Capitol Records was Vimeo, which operates a website that enables members to post videos that they created. As of 2012, Vimeo had more than 31 million videos and 12.3 million registered users. Nearly 43,000 videos were uploaded to Vimeo daily. Capitol Records sued Vimeo for copyright infringement because 199 videos on the website contained recordings to which Capitol Records held the copyright. The Second Circuit explained that the copyright holder must demonstrate that the service provider had actual knowledge of facts “that would make the specific infringement claimed objectively obvious to a reasonable person.” Capitol Records further explained that “suspicion of infringement” is not the same as “facts making infringement obvious.” Requiring service providers to investigate potential copyright infringement whenever they were suspicious would undermine “an important part of the compromise embodied in the safe harbor.”
We agree. The copyright owner must show knowledge, actual or red flag, for the videos that infringed its copyright and are the subject of its claim. And for red flag knowledge, infringement must be apparent, not merely suspicious. Congress used the word “apparent,” not “suspicious” or some equivalent. Ventura, not Lange, is in charge of policing Motherless for its copyrighted material. Congress could have put the burden of policing infringement in suspicious circumstances on the provider, but it instead put it on the copyright holder.
Because the facts and circumstances from which a reasonable person might suspect infringement were much more substantial in UMG than in this case, and because there we held that the infringement was not “apparent,” we must reach the same conclusion here. As UMG implies, and as the Second Circuit in Capitol Records expressly stated, even if it were obvious to a reasonable person that some of the material on the site must be infringing, that is not enough to lose the safe harbor. It must be obvious that the particular material that is the subject of the claim is infringing. Here, it would not be obvious to a reasonable person that the clips excerpted from Ventura movies were infringing.
Ventura argues that we should infer apparent knowledge under Columbia Pictures Industries, Inc. v. Fung. Fung’s website enabled users to download popular movies and television shows, not just clips but entire movies. For example, users downloaded over 1.5 million copies of the James Bond movie Casino Royale. The website included categories such as “Top 20 TV Shows” and “Top 20 Movies,” so it was obvious that using it would enable the user to get this obviously infringing content in its entirety. Fung solicited users to upload and download copyrighted material and assisted those seeking to watch copyrighted material, including helping downloaders burn DVDs of the infringing material. We held that Fung had apparent knowledge, because “[t]he material in question was sufficiently current and well-known that it would have been objectively obvious to a reasonable person that the material solicited and assisted was both copyrighted and not licensed to random members of the public.”
This case is very much like UMG and not at all similar to Fung. In Fung, the site marketed itself as a pirate site for free access to feature movies and top television shows, and no one could mistake the material on it for anything but infringing material. Fung had complete, long movies, but Motherless limited uploads to 500 megabytes, which would be around half or three-quarters of an hour at standard density, and much less at high density. The Ventura clips had no indication that Ventura owned the copyright—or was associated with the videos at all. Fung had “current and well-known” material, like Casino Royale. Whoever the actors in the Ventura material may have been, they are not as famous as the actors who have played James Bond. No one could mistake Casino Royale for a couple of amateurs filming their own activities and purposely posting them for exhibition, but an ordinary person could mistake the Ventura clips for just that.
In Fung, we noted that “the record is replete with instances of Fung actively encouraging infringement, by urging his users to both upload and download particular copyrighted works.” Lange did not do that. His posted Terms of Use prohibited posting copyrighted material without prior written consent from the copyright owner, and he invited takedown notices for infringing material. While such posted language could be merely for appearances’ sake if it were not followed by action, Lange estimates that he has deleted over 180,000 videos and pictures for copyright infringement. He has removed an estimated 1,320 to 1,980 users from the site for repeat copyright infringement. His software stops users from re-uploading previously deleted material. Fung’s was fairly explicitly a pirate website. Motherless, though, appears to be managing the website to make money while avoiding legal trouble from users posting child pornography, bestiality, or copyright-infringing material.
Lastly, Ventura makes the policy argument that “[i]t is exceedingly difficult for [Ventura]—or any adult Web site, for that matter—to convince customers to pay for content that is readily available for free on the adult tube sites” such as Motherless. That may be so, but Congress, not judges, makes the policy decision on whether to offer a safe harbor from suit.
iii. Expeditious Takedown
An additional requirement for the accidental infringer’s safe harbor relief is expeditious removal of the infringing material once there is actual or red flag notice of the infringement. The statutory wording is that “upon obtaining such knowledge or awareness,” the service provider must “act[ ] expeditiously to remove, or disable access to, the material.” To trigger the expeditious removal requirement, a copyright owner’s notification must substantially comply with the requirements of subsection (c)(3)(A) of the safe harbor statute. Among other things, the notification must identify the infringing material with “information reasonably sufficient to permit the service provider to locate the material.”
In this case, the infringing videos had no Ventura identification, and the site had more than a half-million videos, so as a practical matter what Motherless needed to remove them was a URL for each. Ventura did not send Motherless a statutory notification before filing suit. When Lange was served with Ventura’s complaint, he asked Ventura to provide him with the URLs to the infringing clips so that he could delete them. Ventura did not respond to his initial request; only after Lange followed up did Ventura provide the URLs. Lange deleted the 33 infringing clips the same day. That satisfied the “responds expeditiously to remove” requirement.
C. Right and Ability to Control
Even if subsection (c)(1)(A) is satisfied (no actual or red flag knowledge, expeditious removal), a service provider still loses its safe harbor under subsection (c)(1)(B) if it receives “a financial benefit directly attributable to the infringing activity, in a case in which the service provider has the right and ability to control such activity.” This raises two questions: did Motherless have “the right and ability to control” the infringing activity, and if so, did it receive a financial benefit “directly attributable to the infringing activity”?
Motherless certainly had the physical ability to control any and all infringing activity. Lange could take down all of Motherless’s content, infringing or not, and bar any uploads, infringing or not. We have held, however, that the right and ability to control involves something more than the ability to remove or block access to materials posted on a service provider’s website. To have the right and ability to control, a service provider must be able to exert “substantial influence” on its users’ activities.
We held in UMG that the service provider did not have the “ability to control.” In Fung, it did. This case is like UMG and not like Fung. Nothing in the record suggests that Motherless told its users what to upload. Its homepage welcomed users to “a moral free zone where anything legal stays.” It did not curate uploaded content in any meaningful way, nor did it reject unpopular groups or content. Motherless deleted only user-created groups that contained little or no content, and it started deleting bestiality content due to legality issues raised by European advertisers.
Nor was there any evidence that Motherless received “a financial benefit directly attributable to the infringing activity.” Unlike the site in Fung, Motherless did not advertise itself as a place to get pirated materials. Of course, the more pornography Motherless had, the more users it would attract, and more views would lead to more advertising revenue. The words “the” and “directly” in the statute, though, must mean that some revenue has to be distinctly attributable to the infringing material at issue. There is no evidence that Motherless made any money directly from the Ventura clips.
D. Repeat Infringer Termination
So far, we have examined the specifics of the safe harbor as applied to Ventura’s movie clips. Ventura did not submit cognizable evidence establishing a genuine issue of fact as to whether Motherless was entitled to safe harbor. The evidence is uncontradicted that Motherless did not know, nor was it apparent, that its site included infringing Ventura clips. Motherless immediately removed them on the day that Ventura gave Motherless enough information to do so. And Motherless did not control what users uploaded. These conditions are necessary to enjoy the safe harbor. However, they are not sufficient.
Basically, subsection (c) of the safe harbor provision aims at individual infringements, not the service as a whole. It uses the phrase “the material”—that is, the material for which an infringement remedy is sought—in the context of setting out what a service provider needs to do to avoid liability for the infringement of the copyrighted material at issue. Our sister circuit and we both read it this way.8 If subsection (c) were read to apply to all the material on the website, instead of the material for which a remedy was sought by the victim of infringement, then no large site would be protected by the safe harbor. It is unimaginable that any website with hundreds of thousands or millions of user uploads could successfully screen out all of the copyright infringing uploads, or even all of the uploads where infringement was apparent.
But Congress promulgated subsection (i) to limit the eligibility for safe harbor treatment. Even if a website deletes infringing material as soon as it learns about it, the safe harbor is unavailable unless the site has a policy of excluding repeat infringers. This ineligibility provision is a prophylactic against future acts of infringement by actors whose past conduct renders them suspect.
This repeat infringer policy requirement does not focus on the particular infringement at issue. Instead, subsection (i) bars use of the subsection (c) safe harbor unless the service provider adopts and “reasonably” implements a policy of terminating repeat infringers in “appropriate” circumstances:
(1) ACCOMMODATION OF TECHNOLOGY.—The limitations on liability established by this section shall apply to a service provider only if the service provider—
(A) has adopted and reasonably implemented, and informs subscribers and account holders of the service provider’s system or network of, a policy that provides for the termination in appropriate circumstances of subscribers and account holders of the service provider’s system or network who are repeat infringers; and
(B) accommodates and does not interfere with standard technical measures.
Unlike subsection (c), subsection (i) addresses how the site is generally managed, not just how the site responds to notice of a particular infringement. Without subsection (i), an unscrupulous website might take down infringing material as soon as it received a proper takedown notice identifying it, yet still operate as a pirate site. Subsection (i) obliges the provider to exclude repeat infringers, subject to its qualifications: “reasonably” and “in appropriate circumstances.” In this case, subsection (i) means that if Motherless did not reasonably implement a policy of terminating in appropriate circumstances users who were repeat infringers, then innocence in hosting Ventura’s works and promptness in removing them once notified would not shield Motherless from infringement remedies.
The “standard technical measures” referenced in subsection (i)(1)(B) enable copyright owners to establish some technical means so that service providers can spot and exclude infringing material without substantial expense. One can imagine a digital version of the old c in a circle (©) automatically triggering the uploading software to exclude material so marked by the copyright owner. But subsection (i)(1)(B) is not at issue in this case. The evidence establishes, without any genuine issue of fact, that Ventura did not in any way mark its material so that infringement could be spotted and the material excluded by some standard technical measure.
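The court’s hypothetical — a digital © mark that automatically triggers exclusion — could be sketched as follows. This is purely illustrative: the statute does not prescribe any particular mechanism, and the marker name and metadata format below are invented for the example.

```python
# Hypothetical sketch of a "standard technical measure": an upload
# filter that screens out files whose metadata carries a copyright
# marker set by the owner. The marker key ("rights-reserved") and the
# metadata format are invented for illustration; the DMCA does not
# specify any such mechanism.

COPYRIGHT_MARKER = "rights-reserved"

def accept_upload(metadata: dict) -> bool:
    """Return True if the upload may be hosted, False if the owner's
    copyright marker indicates it should be excluded automatically."""
    return metadata.get(COPYRIGHT_MARKER) is None

# A marked file is screened out; an unmarked one passes through.
print(accept_upload({"title": "clip.mp4"}))                      # True
print(accept_upload({"title": "clip.mp4",
                     COPYRIGHT_MARKER: "Ventura Content"}))      # False
```

As the court notes, nothing like this could have helped here, because Ventura did not mark its material in any way.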
However, the inapplicability of subsection (B) to this case does not free Motherless from the burden of subsection (A). The service provider must satisfy both. Motherless has a written policy of excluding infringing material, stated on its membership sign-up page:
• In connection with User-Submitted Content, you affirm, represent, and/or warrant that: you own or have the necessary licenses, rights, consents and permissions to use and authorize [Motherless] to use all … copyright … rights in and to any and all User-Submitted Content to enable inclusion and use of the User-Submitted Content in the manner contemplated by the [Motherless] website and these Terms of Use.
• [Motherless] and its administrators reserve the right (but not the obligation) in their sole discretion to refuse, delete, move or edit any and all Content that it deems is in violation of the law (including … copyright law)….
• A partial list of content that is illegal or prohibited includes content that … Promotes an illegal or unauthorized copy of another’s copyrighted work, such as pirated computer programs or links to them, or providing information to circumvent manufacturer-installed copy-protect devices, or providing pirated music or links to pirated music files….
• You agree that you will not post, or otherwise distribute or facilitate distribution of any Content that … infringes on any … copyright … of any party….
• You may not post, distribute, or reproduce in any way, any copyrighted material … without obtaining the prior written consent of the owner of such proprietary rights or otherwise have a valid basis under the law, including “fair use.”
And Motherless has a written policy of terminating repeat infringers. On its page entitled “DMCA Notice & Takedown Policy and Procedures,” Motherless said that “[it] is the firm policy of the [site] to terminate the account of repeat copyright infringers, when appropriate.”
Lange described how he applies Motherless’s repeat infringer policy in his deposition testimony. He testified that he excludes infringing material by looking for an identifying watermark in the corner, the usual way owners identify their copyrighted material. If he receives a DMCA takedown notice (the form designated in subsection (c)(3)(A) ), he also uses “hashing” software so that copies of the image or clip will be removed and will be screened out if anyone tries to post them again. Ordinarily, he will not terminate a user because of one takedown notice, but he will if there are two or more, which is to say, “repeated” instances of infringement. Before removing a user, Lange considers multiple factors, including the number of complaints arising from the user’s uploads, the amount of infringing content in the complaint he received, and whether he thinks the user had maliciously or intentionally uploaded infringing content.
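The “hashing” approach Lange describes can be sketched in a few lines. This is a minimal illustration under assumed details, not the site’s actual software: it supposes that each file deleted after a takedown notice has a cryptographic digest recorded, and that any later upload whose bytes match a recorded digest is screened out.

```python
import hashlib

# Minimal sketch of hash-based re-upload blocking (illustration only).
# Deleting a file for infringement records its SHA-256 digest; any
# later upload hashing to a recorded digest is rejected automatically.

banned_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def remove_for_infringement(data: bytes) -> None:
    """Delete a file and remember its digest so it cannot return."""
    banned_hashes.add(fingerprint(data))

def allow_upload(data: bytes) -> bool:
    """Reject any upload identical to previously deleted material."""
    return fingerprint(data) not in banned_hashes

clip = b"...video bytes..."
remove_for_infringement(clip)
print(allow_upload(clip))           # False: the same bytes are blocked
print(allow_upload(b"other clip"))  # True: unrelated material passes
```

Note the limitation of exact-byte hashing: a re-encoded or trimmed copy produces a different digest and slips through, which is why production systems typically use perceptual fingerprinting rather than plain cryptographic hashes.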
Perfect 10, Inc. v. CCBill LLC holds that a service provider implements a policy if it has a working notification system, a procedure for dealing with DMCA-compliant notifications, and if it does not actively prevent copyright owners from collecting information needed to issue such notifications. The implementation is reasonable if, under appropriate circumstances, the service provider terminates users who repeatedly or blatantly infringe copyright.
Various factors may bear on whether a service provider has adopted and reasonably implemented its policy for terminating, in appropriate circumstances, repeat infringers. Certain factors work in favor of the service provider, including: a DMCA log, as discussed in CCBill; blocking a subscriber’s name and email address from uploads; putting email addresses from terminated accounts on a banned list; and prohibiting a banned user from reopening a terminated account. Other factors cut against the service provider, including: changing the email address to which takedown notices are sent without providing notice of the change; participating in copyright infringement; allowing terminated users to rejoin the site; and refusing to terminate known repeat infringers. Congress did not require that, to be eligible for safe harbor, a provider must maintain a logbook of infringers which it consults whenever it receives a DMCA notice. Congress required that the provider reasonably implement a policy of terminating repeat infringers, and the use of such a logbook and procedure would be good evidence that it did.
We conclude that on this record, there was no triable issue of fact as to whether Motherless, when it infringed on Ventura’s copyrighted material, had adopted and reasonably implemented its policy of terminating repeat infringers in appropriate circumstances. No trier of fact could conclude from the evidence in the record that Motherless had failed to reasonably implement a repeat infringer policy.
As the district court pointed out, there is a paucity of proven failures to terminate. Safe harbor eligibility does not require perfection, just reasonable implementation of the policy in appropriate circumstances. Eligibility for the safe harbor is not lost just because some repeat infringers may have slipped through the provider’s net for screening them out and terminating their access. The evidence in the record shows that Motherless terminated between 1,320 and 1,980 users for alleged copyright infringement and that only nine alleged repeat infringers had slipped through. Of those nine, only six were before Ventura filed its lawsuit, and only four of the six had been the subject of more than one DMCA notice. That suggests that less than one repeat infringer in 100,000 active users was missed. If that is the extent of failure, there could be no genuine issue of material fact as to whether Motherless “reasonably implemented” its termination policy. Congress used the word “reasonable” to modify “implemented,” so the phrase cannot be construed to require perfect implementation.
The absence of any significant number of repeat infringers who escaped termination compels the conclusion that a trier of fact could not conclude, on the record before us, that Motherless failed to meet the repeat infringer eligibility requirement for safe harbor. Motherless and Lange are therefore entitled to claim the protection of the safe harbor.
RAWLINSON, Circuit Judge, dissenting:
I respectfully dissent from my colleagues’ conclusion that Motherless, Inc. and Joshua Lange qualified for the safe harbor provided for in the Digital Millennium Copyright Act (the Act).
It is important to remember that this case was resolved on summary judgment. Therefore, if a material issue of fact was raised by Ventura Content, Ltd. (Ventura), entry of summary judgment in favor of Motherless, Inc. and Lange was in error. From my reading of the record, a gargantuan issue of fact was raised by Ventura regarding Motherless’/Lange’s compliance with the requirement that the service provider adopt, implement, and inform subscribers and account holders of the policy providing for termination of repeat infringers to merit safe harbor protection from copyright infringement.
…
Viewing the evidence in the light most favorable to Ventura, material issues of fact remain regarding the existence of a policy as defined in the Act, and the reasonableness of actions taken by Motherless/Lange to terminate repeat infringers. I would reverse that portion of the district court’s ruling, and I respectfully dissent from the majority’s contrary ruling.
__________
Check Your Understanding – Ventura Content
Question 1. According to the plaintiff in Ventura Content, why did the fact that the defendant screened for child pornography deprive the website of the DMCA safe harbor?
Question 2. How did the court distinguish between the facts of Ventura and Fung?
Some things to consider when reading BMG Rts. Mgmt.:
- Excerpts from this 2018 decision appeared earlier in this casebook in the section on indirect infringement.
- The excerpts from the decision reproduced below address the § 512(a) safe harbor (as opposed to the § 512(c) safe harbor), which can be available for a service provider that acts as a “conduit” for the transmission and/or transient storage of infringing material, such as a provider of high-speed Internet access like Cox Communications.
- In the language of the statute, § 512(a) applies to “infringement of copyright by reason of the provider’s transmitting, routing, or providing connections for, material through a system or network controlled or operated by or for the service provider, or by reason of the intermediate and transient storage of that material in the course of such transmitting, routing, or providing connections.”
- What do you think about Cox’s “thirteen-strike policy” as a deterrent to copyright infringement?
- Note that the problem for Cox was not its repeat infringer policy per se, but its failure to actually follow through on it.
- Be sure you understand that failure to qualify for a DMCA safe harbor does not result in infringement liability; it just removes the shield and allows the plaintiff to present evidence to prove infringement. That is what happened in this case—see Sony Music Ent. v. Cox Commc’ns, Inc., which appears in the indirect infringement section of this casebook.
BMG Rts. Mgmt. (US) LLC v. Cox Commc’ns, Inc.
881 F.3d 293 (4th Cir. 2018)
DIANA GRIBBON MOTZ, Circuit Judge:
BMG Rights Management (US) LLC (“BMG”), which owns copyrights in musical compositions, filed this suit alleging copyright infringement against Cox Communications, Inc. and CoxCom, LLC (collectively, “Cox”), providers of high-speed Internet access. BMG seeks to hold Cox contributorily liable for infringement of BMG’s copyrights by subscribers to Cox’s Internet service. Following extensive discovery, the district court held that Cox had not produced evidence that it had implemented a policy entitling it to a statutory safe harbor defense and so granted summary judgment on that issue to BMG. Cox appeals, asserting that the district court erred in denying it the safe harbor defense. We hold that Cox is not entitled to the safe harbor defense and affirm the district court’s denial of it.
I.
A.
Cox is a conduit Internet service provider (“ISP”), providing approximately 4.5 million subscribers with high-speed Internet access for a monthly fee. Some of Cox’s subscribers shared and received copyrighted files, including music files, using a technology known as BitTorrent. BitTorrent is not a software program, but rather describes a protocol—a set of rules governing the communication between computers—that allows individual computers on the Internet to transfer files directly to other computers. This method of file sharing is commonly known as “peer-to-peer” file sharing, and contrasts with the traditional method of downloading a file from a central server using a Web browser.
As a conduit ISP, Cox only provides Internet access to its subscribers. Cox does not create or sell software that operates using the BitTorrent protocol, store copyright-infringing material on its own computer servers, or control what its subscribers store on their personal computers.
Cox’s agreement with its subscribers reserves the right to suspend or terminate subscribers who use Cox’s service “to post, copy, transmit, or disseminate any content that infringes the patents, copyrights … or proprietary rights of any party.” To enforce that agreement and protect itself from liability, however, Cox created only a very limited automated system to process notifications of alleged infringement received from copyright owners. Cox’s automated system rests on a thirteen-strike policy that determines the action to be taken based on how many notices Cox has previously received regarding infringement by a particular subscriber. The first notice alleging a subscriber’s infringement produces no action from Cox. The second through seventh notices result in warning emails from Cox to the subscriber. After the eighth and ninth notices, Cox limits the subscriber’s Internet access to a single webpage that contains a warning, but the subscriber can reactivate complete service by clicking an acknowledgement. After the tenth and eleventh notices, Cox suspends services, requiring the subscriber to call a technician, who, after explaining the reason for suspension and advising removal of infringing content, reactivates service. After the twelfth notice, the subscriber is suspended and directed to a specialized technician, who, after another warning to cease infringing conduct, reactivates service. After the thirteenth notice, the subscriber is again suspended, and, for the first time, considered for termination. Cox never automatically terminates a subscriber.
The effectiveness of Cox’s thirteen-strike policy as a deterrent to copyright infringement has several additional limitations. Cox restricts the number of notices it will process from any copyright holder or agent in one day; any notice received after this limit has been met does not count in Cox’s graduated response escalation. Cox also counts only one notice per subscriber per day. And Cox resets a subscriber’s thirteen-strike counter every six months.
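The graduated response the court describes amounts to a lookup from a subscriber’s cumulative notice count to an action, constrained by the counting limits in the preceding paragraph. The sketch below paraphrases the opinion’s description for illustration only; it is not Cox’s actual system, and the counting rules are modeled in simplified form.

```python
# Illustrative paraphrase of Cox's thirteen-strike graduated response,
# as the opinion describes it. Not Cox's actual software.

def action_for_notice(strike: int) -> str:
    """Map a subscriber's cumulative counted notices (1-based) to the
    response described in the opinion."""
    if strike <= 0:
        raise ValueError("strike counts start at 1")
    if strike == 1:
        return "no action"
    if strike <= 7:
        return "warning email"
    if strike <= 9:
        return "single-webpage block; self-service reactivation"
    if strike <= 11:
        return "suspension; technician call to reactivate"
    if strike == 12:
        return "suspension; specialized technician reactivates"
    return "suspension; considered for termination"  # 13th and later

def count_notice(strikes: int, counted_today: bool,
                 months_since_first: int) -> int:
    """Simplified model of the counting limits: at most one notice per
    subscriber per day counts, and the counter resets every six months
    (daily per-copyright-holder caps are omitted)."""
    if months_since_first >= 6:
        strikes = 0                # six-month reset
    if counted_today:
        return strikes             # only one notice per day counts
    return strikes + 1
```

Even at the thirteenth counted notice the subscriber is only “considered” for termination; as the court emphasizes, no step in the process terminates a subscriber automatically.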
BMG, a music publishing company, owns copyrights in musical compositions. To protect this copyrighted material, BMG hired Rightscorp, Inc., which monitors BitTorrent activity to determine when infringers share its clients’ copyrighted works. When Rightscorp identifies such sharing, it emails an infringement notice to the alleged infringer’s ISP (here, Cox). The notice contains the name of the copyright owner (here, BMG), the title of the copyrighted work, the alleged infringer’s IP address, a time stamp, and a statement under penalty of perjury that Rightscorp is an authorized agent and the notice is accurate.
Rightscorp also asks the ISP to forward the notice to the allegedly infringing subscriber, since only the ISP can match the IP address to the subscriber’s identity. For that purpose, the notice contains a settlement offer, allowing the alleged infringer to pay twenty or thirty dollars for a release from liability for the instance of infringement alleged in the notice. Cox has determined to refuse to forward or process notices that contain such settlement language. When Cox began receiving Rightscorp notices in the spring of 2011 (before Rightscorp had signed BMG as a client), Cox notified Rightscorp that it would process the notices only if Rightscorp removed the settlement language. Rightscorp did not do so. Cox never considered removing the settlement language itself or using other means to inform its subscribers of the allegedly infringing activity observed by Rightscorp.
Rightscorp continued to send Cox large numbers of settlement notices. In the fall of 2011, Cox decided to “blacklist” Rightscorp, meaning Cox would delete notices received from Rightscorp without acting on them or even viewing them. BMG hired Rightscorp in December 2011—after Cox blacklisted Rightscorp. Thus, Cox did not ever view a single one of the millions of notices that Rightscorp sent to Cox on BMG’s behalf.
B.
On November 26, 2014, BMG initiated this action against Cox. BMG alleged that Cox was vicariously and contributorily liable for acts of copyright infringement by its subscribers.
At the conclusion of discovery, the parties filed multi-issue cross-motions for summary judgment, which the district court resolved in a careful written opinion. Among these issues, BMG asserted that Cox had not established a policy entitling it to the safe harbor defense contained in the Digital Millennium Copyright Act (“DMCA”), 17 U.S.C. § 512(a). The court granted summary judgment to BMG on Cox’s safe harbor defense.
II.
We first address Cox’s contention that the district court erred in denying it the § 512(a) DMCA safe harbor defense.
A.
The DMCA provides a series of safe harbors that limit the copyright infringement liability of an ISP and related entities. As a conduit ISP, Cox seeks the benefit of the safe harbor contained in 17 U.S.C. § 512(a). To fall within that safe harbor, Cox must show that it meets the threshold requirement, common to all § 512 safe harbors, that it has “adopted and reasonably implemented … a policy that provides for the termination in appropriate circumstances of subscribers … who are repeat infringers.” 17 U.S.C. § 512(i)(1)(A).
Cox’s principal contention is that “repeat infringers” means adjudicated repeat infringers: people who have been held liable by a court for multiple instances of copyright infringement. Cox asserts that it complied with § 512(i)(1)(A)’s requirement and is therefore entitled to the § 512(a) DMCA safe harbor because BMG did not show that Cox failed to terminate any adjudicated infringers. BMG responds that Cox’s interpretation of “repeat infringers” is contrary to “the DMCA’s plain terms.”
[Editor’s note: The court rejected Cox’s argument that the term “repeat infringers” in § 512(i) is limited to adjudicated infringers.]
B.
Section 512(i) thus requires that, to obtain the benefit of the DMCA safe harbor, Cox must have reasonably implemented “a policy that provides for the termination in appropriate circumstances” of its subscribers who repeatedly infringe copyrights. 17 U.S.C. § 512(i)(1)(A). We are mindful of the need to afford ISPs flexibility in crafting repeat infringer policies, and of the difficulty of determining when it is “appropriate” to terminate a person’s access to the Internet. At a minimum, however, an ISP has not “reasonably implemented” a repeat infringer policy if the ISP fails to enforce the terms of its policy in any meaningful fashion. See In re Aimster Copyright Litig., 252 F.Supp.2d 634, 659 (N.D. Ill. 2002), aff’d, 334 F.3d 643 (7th Cir. 2003) (“Adopting a repeat infringer policy and then purposely eviscerating any hope that such a policy could ever be carried out is not an ‘implementation’ as required by § 512(i).”). Here, Cox formally adopted a repeat infringer “policy,” but, both before and after September 2012, made every effort to avoid reasonably implementing that policy. Indeed, in carrying out its thirteen-strike process, Cox very clearly determined not to terminate subscribers who in fact repeatedly violated the policy.
The words of Cox’s own employees confirm this conclusion. In a 2009 email, Jason Zabek, the executive managing the Abuse Group, a team tasked with addressing subscribers’ violations of Cox’s policies, explained to his team that “if a customer is terminated for DMCA, you are able to reactivate them,” and that “[a]fter you reactivate them the DMCA ‘counter’ restarts.” The email continued, “This is to be an unwritten semi-policy.” Zabek also advised a customer service representative asking whether she could reactivate a terminated subscriber that “[i]f it is for DMCA you can go ahead and reactivate.” Zabek explained to another representative: “Once the customer has been terminated for DMCA, we have fulfilled the obligation of the DMCA safe harbor and can start over.” He elaborated that this would allow Cox to “collect a few extra weeks of payments for their account. ;-).” Another email summarized Cox’s practice more succinctly: “DMCA = reactivate.” As a result of this practice, from the beginning of the litigated time period until September 2012, Cox never terminated a subscriber for infringement without reactivating them.
Cox nonetheless contends that it lacked “actual knowledge” of its subscribers’ infringement and therefore did not have to terminate them. That argument misses the mark. The evidence shows that Cox always reactivated subscribers after termination, regardless of its knowledge of the subscriber’s infringement. Cox did not, for example, advise employees not to reactivate a subscriber if the employees had reliable information regarding the subscriber’s repeat infringement. An ISP cannot claim the protections of the DMCA safe harbor provisions merely by terminating customers as a symbolic gesture before indiscriminately reactivating them within a short timeframe.
In September 2012, Cox abandoned its practice of routine reactivation. An internal email advised a new customer service representative that “we now terminate, for real.” BMG argues, however, that this was a change in form rather than substance, because instead of terminating and then reactivating subscribers, Cox simply stopped terminating them in the first place. The record evidence supports this view. Before September 2012, Cox was terminating (and reactivating) 15.5 subscribers per month on average; after September 2012, Cox abruptly began terminating less than one subscriber per month on average. From September 2012 until the end of October 2014, the month before BMG filed suit, Cox issued only 21 terminations in total. Moreover, at least 17 of those 21 terminations concerned subscribers who had either failed to pay their bills on time or used excessive bandwidth (something that Cox subjected to a strict three-strike termination policy). Cox did not provide evidence that the remaining four terminations were for repeat copyright infringement. But even assuming they were, they stand in stark contrast to the over 500,000 email warnings and temporary suspensions Cox issued to alleged infringers during the same time period.
Moreover, Cox dispensed with terminating subscribers who repeatedly infringed BMG’s copyrights in particular when it decided to delete automatically all infringement notices received from BMG’s agent, Rightscorp. As a result, Cox received none of the millions of infringement notices that Rightscorp sent to Cox on BMG’s behalf during the relevant period. Although our inquiry concerns Cox’s policy toward all of its repeatedly infringing subscribers, not just those who infringed BMG’s copyrights, Cox’s decision to categorically disregard all notices from Rightscorp provides further evidence that Cox did not reasonably implement a repeat infringer policy.
BMG also provided evidence of particular instances in which Cox failed to terminate subscribers whom Cox employees regarded as repeat infringers. For example, one subscriber “was advised to stop sharing … and remove his P2P programs,” and a Cox employee noted that the subscriber was “well aware of his actions” and was “upset that ‘after years of doing this’ he is now getting caught.” Nonetheless, Cox did not terminate the subscriber. Another customer was advised that “further complaints would result in termination” and that it was the customer’s “absolute last chance to … remove ALL” file-sharing software. But when Cox received another complaint, a manager directed the employee not to terminate, but rather to “suspend this Customer, one LAST time,” noting that “[t]his customer pays us over $400/month” and that “[e]very terminated Customer becomes lost revenue.”
Cox responds that these post-September 2012 emails do not necessarily “prove actual knowledge of repeat infringement.” Again, that argument is misplaced. Cox bears the burden of proof on the DMCA safe harbor defense; thus, Cox had to point to evidence showing that it reasonably implemented a repeat infringer policy. The emails show that Cox internally concluded that a subscriber should be terminated after the next strike, but then declined to do so because it did not want to lose revenue. In other words, Cox failed to follow through on its own policy. Cox argues that these emails only concerned “four cases,” and that “occasional lapses” are forgivable. But even four cases are significant when measured against Cox’s equally small total number of relevant terminations in this period—also four. More importantly, Cox did not produce any evidence of instances in which it did follow through on its policy and terminate subscribers after giving them a final warning to stop infringing.
In addition, Cox suggests that because the DMCA merely requires termination of repeat infringers in “appropriate circumstances,” Cox decided not to terminate certain subscribers only when “appropriate circumstances” were lacking. But Cox failed to provide evidence that a determination of “appropriate circumstances” played any role in its decisions to terminate (or not to terminate). Cox did not, for example, point to any criteria that its employees used to determine whether “appropriate circumstances” for termination existed. Instead, the evidence shows that Cox’s decisions not to terminate had nothing to do with “appropriate circumstances” but instead were based on one goal: not losing revenue from paying subscribers.
Cox failed to qualify for the DMCA safe harbor because it failed to implement its policy in any consistent or meaningful way—leaving it essentially with no policy. Accordingly, the district court did not err in holding that Cox failed to offer evidence supporting its entitlement to the § 512(a) safe harbor defense and therefore granting summary judgment on this issue to BMG.
__________
Check Your Understanding – BMG Rts. Mgmt.
Question 1. Why was Cox denied the benefit of the DMCA safe harbor?
FOOTNOTES:
1 Sometimes referred to as online service providers (“OSPs”).
2 As discussed in Skidmore as Tr. for Randy Craig Wolfe Tr. v. Led Zeppelin, a decision that appears earlier in this casebook, sound recordings did not become subject to federal copyright protection until 1972, and then only for the sound recordings fixed on or after February 15, 1972. 17 U.S.C. § 301(c).
3 676 F.3d 19 (2d Cir. 2012).
4 718 F.3d 1006 (9th Cir. 2013).
5 873 F.3d 1045, 1052–57 (9th Cir. 2017).
6 676 F.3d 19, 39 (2d Cir. 2012).
7 UMG Recordings, Inc. v. Shelter Capital Partners, LLC, 718 F.3d 1006, 1020 (9th Cir. 2013) (quoting Corbis Corp. v. Amazon.com, Inc., 351 F.Supp.2d 1090, 1107 (W.D. Wash. 2004)) (citing Io Grp., Inc. v. Veoh Networks, Inc., 586 F.Supp.2d 1132, 1148 (N.D. Cal. 2008)); see also 4 MELVILLE B. NIMMER & DAVID NIMMER, NIMMER ON COPYRIGHT § 12B.04[A][3], at 12B-94 (rev. ed. 2017) (“NIMMER”).
8 See UMG Recordings, Inc. v. Shelter Capital Partners LLC, 718 F.3d 1006, 1021–22 (9th Cir. 2013); Capitol Records, LLC v. Vimeo, LLC, 826 F.3d 78, 93 (2d Cir. 2016); Viacom Int’l, Inc. v. YouTube, Inc., 676 F.3d 19, 31 (2d Cir. 2012).