Tag Archives: DMCA

Must a service provider remove all content a repeat infringer uploaded to qualify for the DMCA safe harbor?

Does an online service provider forfeit the safe harbor protections of the Digital Millennium Copyright Act if, when terminating the account of a repeat infringer, it does not delete all content the repeat infringer uploaded — infringing and noninfringing alike? A recent decision involving the antique internet technology Usenet sheds light on an answer.

Active copyright plaintiff Perfect 10 sued Usenet provider Giganews for direct and secondary liability for hosting allegedly infringing materials on the Giganews servers. Giganews asserted the safe harbor of the DMCA (17 U.S.C. § 512) as an affirmative defense. Perfect 10 moved for summary judgment that the safe harbor did not apply; Giganews argued that it did. The court denied Perfect 10’s motion.

Perfect 10 asserted that Giganews had not reasonably implemented a policy to terminate the accounts of repeat infringers as required by 17 U.S.C. § 512(i)(1)(A). One of Perfect 10’s arguments was that Giganews did not reasonably implement its repeat infringer policy because it terminated the accounts of the infringers but did not also delete all the content those infringers had uploaded.

The court was not persuaded that § 512(i)(1)(A) requires a service provider to disable or delete all content a repeat infringer has ever posted. The plain language of the statute requires “termination … of subscribers and account holders,” not the deletion of content. And because a requirement of taking down all content, not just infringing content, would serve no infringement-preventing purpose, the court held that there was no justification for reading such a requirement into the statute.
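The distinction the court drew is easy to model. Here is a hypothetical sketch (mine, not the court’s or Giganews’s) of a repeat infringer policy that terminates accounts under § 512(i)(1)(A) without deleting every item the user ever uploaded:

```python
# Hypothetical sketch of a repeat infringer policy consistent with the
# court's reading of section 512(i)(1)(A): the provider terminates the
# repeat infringer's *account*, but is not required to delete every
# item of content that user ever uploaded.

class Provider:
    def __init__(self):
        self.accounts = {}   # user -> account active?
        self.content = {}    # content_id -> (user, flagged_infringing)
        self.strikes = {}    # user -> number of infringement notices

    def register(self, user):
        self.accounts[user] = True
        self.strikes[user] = 0

    def upload(self, user, content_id, infringing=False):
        self.content[content_id] = (user, infringing)

    def takedown_notice(self, content_id, repeat_threshold=2):
        user, _ = self.content[content_id]
        del self.content[content_id]        # remove the noticed item
        self.strikes[user] += 1
        if self.strikes[user] >= repeat_threshold:
            self.accounts[user] = False     # terminate the account...
        # ...but the user's other, un-noticed content stays in place
```

On this model, a terminated user’s noninfringing uploads survive the termination — which is exactly the reading of the statute the court adopted.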

Perfect 10, Inc. v. Giganews, Inc., — F.Supp.2d —, 2014 WL 323655 (C.D.Cal. January 29, 2014)

Court refuses to help author who was victim of alleged bogus DMCA takedown notices

Author and her publisher disagreed on the content of two of the author’s new books. (As an aside, this author is very prolific — she alleges that she publishes a new book every two weeks!) So rather than deal with publisher, author self-published the works on Amazon. Publisher sent DMCA takedown notices to Amazon, with which Amazon complied. Author sued publisher under Section 512(f) of the DMCA, which provides penalties against senders of DMCA takedown notices that knowingly materially misrepresent claims of infringement. She sought a temporary restraining order (TRO), asking the court to instruct publisher to tell Amazon to make the works available.

The court denied the TRO motion. It found that author had failed to show she would suffer irreparable harm if the works were not put back on the market. In the court’s view, author failed to show how a temporary delay in sales would affect her reputation or goodwill.

The case presents an interesting issue concerning a party’s right to send a DMCA takedown notice. Author alleges that her agreement with publisher provided she owns the copyright in her works, and that publisher merely has a right of first refusal to publish any “sequels” to her previous works. So if what author is saying is true, that publisher has merely a contract interest in the books and not a copyright interest, she stands a good chance of ultimately succeeding on her 512(f) claim.

Flynn v. Siren-BookStrand, Inc., 2013 WL 5315959 (September 20, 2013)

Court rules against Ripoff Report in copyright case

Xcentric Ventures, LLC v. Mediolex Ltd., 2012 WL 5269403 (D.Ariz. October 24, 2012)

Plaintiff Xcentric Ventures provides the infamous Ripoff Report, a website where consumers can go to complain about businesses they have dealt with. Defendant ComplaintsBoard.com is a similar kind of website.

Ripoff Report’s Terms of Service provide that users grant Ripoff Report an exclusive license in the content they post to the site. Based on this right, Xcentric sued various defendants associated with ComplaintsBoard for “encourag[ing] and permit[ting] consumers to post content that has been exclusively licensed to Xcentric.”

Defendants moved to dismiss the copyright infringement claim, asserting they were protected by the safe harbor provision of the Digital Millennium Copyright Act (“DMCA”). The court granted the motion to dismiss, but not because of the DMCA.

DMCA Analysis

The safe harbor provision of the DMCA states that a “service provider shall not be liable for monetary relief” if all of the following requirements are met:

(1) it does not have actual knowledge that the material on its network is infringing;

(2) it is not aware of facts or circumstances that would make the infringing activity apparent; and

(3) upon obtaining knowledge or awareness of such infringing activity, it acts expeditiously to remove or disable access to the copyrighted material.

In this case, Xcentric alleged that defendants actively “encouraged and permitted” copyright infringement by ComplaintsBoard users. The court held that this allegation, if taken as true, could be sufficient to preclude defendants from taking advantage of the DMCA’s safe harbor provisions.

But the court went on to hold that Xcentric had failed to state a copyright claim on which relief may be granted.

Secondary Liability Insufficiently Pled

Xcentric did not allege that defendants directly infringed copyright. Instead, it alleged that by encouraging and permitting users to copy and republish material, ComplaintsBoard was engaged in secondary infringement — either vicarious or contributory infringement.

To state a claim for contributory copyright infringement, Xcentric had to plead that ComplaintsBoard had knowledge of the infringing activity and induced, caused, or materially contributed to the infringing conduct of its users. The court found that Xcentric had not alleged any facts that would lead to a reasonable inference that defendants knew of their users’ republishing Xcentric’s copyrighted content or that defendants had induced, caused, or materially contributed to such republication.

To successfully plead vicarious infringement, Xcentric had to show that defendants had the right and ability to supervise the infringing activity and also had a direct financial interest in those activities. The court found that Xcentric had not put forward enough facts to show that defendants had the right and ability to supervise the infringing activity.

DMCA takedown notices are not just for content

Apple using the DMCA to stop early sales of iOS 6.

The infamous Digital Millennium Copyright Act takedown process gets quite a bit of press when content owners such as movie studios and record companies use it to take infringing copies of films or music offline. The safe harbor provisions of the DMCA are at the heart of content-distribution platforms’ defenses against infringement occasioned by users of the platform. (Think Viacom v. YouTube.)

Apple reminds us, however, that the DMCA gives all copyright owners — not just those who own copyrights in content — a mechanism for getting infringing works off the internet. According to this piece on Engadget, Apple has been contacting hosting providers of sites that offer unauthorized copies of the forthcoming iOS 6 for sale.

So the DMCA, acting in the name of copyright protection, provides a remedy for software providers to keep the clamps on parties who may have access to software for their own use (in this case, iOS developers) but go outside the bounds of such use and offer the technology for sale to others.

Social media legal best practices: some problems and solutions with uploading photos and tagging people

Facebook, Flickr, 500px and mobile sharing applications such as Instagram have replaced the hard copy photo album as the preferred method for letting others see pictures you have taken. Now photos are easy to take and easy to share. This easiness makes a number of legal questions potentially more relevant.

Embarrassing photos

Let’s be honest — every one of us has been in photos that we do not want others to see. It may be just bad lighting, an awkward angle, or something more sinister such as nudity or drug use, but having a photo like that made public would cause embarrassment or some other type of harm. Sometimes the law affords ways to get embarrassing photos taken down. And to the same extent the law can help the subject, the one posting the embarrassing photo puts himself at risk of legal liability.

Invasion of privacy. If someone takes a picture of another in a public place, or with a bunch of other people, the subject of the photo probably does not have a right of privacy in whatever he or she is doing in the photo. So the law will not be helpful in getting that content off the internet. But there are plenty of situations where the subject of a photo may have a privacy interest that the law will recognize.

  • “Intrusion upon seclusion,” as its name suggests, is a legal claim that one can make when someone has intentionally intruded — physically or otherwise — upon their solitude or seclusion. Surreptitiously taken photos of a person in her own home, or in a place where she expected privacy (e.g., in a hotel room or dressing room) would likely give rise to an unlawful intrusion upon seclusion.
  • “Publication of private facts” is another form of invasion of privacy. A person commits this kind of invasion of privacy by publishing private, non-newsworthy facts about another person in a way that would be offensive to a reasonable person. Posting photos of one’s ex-girlfriend engaged in group sex would be considered publication of private facts. Posting family pics of one’s nephew when he was a kid would not.

Photoshop jobs

Some people enjoy using Photoshop or a similar advanced photo editing application to paste the head of one person onto the body of another. (Reddit has an entire category devoted to Photoshop requests.) This can have drastic, negative consequences on the person who — through this editing — appears to be in the photo doing something he or she did not and would not do. This conduct might give rise to legal claims of infliction of emotional distress and defamation.

Infliction of emotional distress. We expect our fellow members of society to be somewhat thick-skinned, and courts generally do not allow lawsuits over hurt feelings. But when it’s really bad, the law may step in to help. One may recover for infliction of emotional distress (sometimes called “outrage”) against another person who acts intentionally, and in a way that is extreme and outrageous, to cause emotional distress that is severe. Some states require there to be some associated physical harm. A bride who sued her photographer over the emotional distress she suffered when the photographer posted pictures of her in her underwear lost her case because she alleged no fear that she was exposed to physical harm.

Defamation. A person can sue another for defamation over any “published” false statement that harms the person’s reputation. Some forms of defamation are particularly bad (they are called defamation per se), and are proven when, for example, someone falsely states that a person has committed a crime, has engaged in sexually immoral behavior, or has a loathsome disease. A realistic Photoshop job could effectively communicate a false statement about someone that is harmful to his or her reputation.

Copied photos

Since copying and reposting images is so easy, a lot of people do it. On social media platforms, users often do not mind if a friend copies the photos from last night’s dinner party and reuploads them to another account. In situations like these, it’s simply “no harm, no foul.” Technically there is copyright infringement going on, but what friend is going to file a lawsuit against another friend over this socially-acceptable use? The more nefarious situations illustrate how copyright can be used to control the display and distribution of photos.

In most instances, the person who takes a photo owns the copyright in that photo. A lot of people believe that if you appear in a photo, you own the copyright. That’s not true unless the photo is a self-portrait (e.g., camera held at arm’s length and turned back toward the person, or shot into a mirror), or unless the person in the photo has otherwise gotten ownership of the copyright through a written assignment (a much rarer situation).

A person who finds that his or her copyrighted photos have been copied and reposted without permission has a number of options available. In the United States, a quick remedy is available under the notice and takedown provisions of the Digital Millennium Copyright Act. The copyright owner sends a notice to the platform hosting the photos and demands the photos be taken down. The platform has an incentive to comply with that demand, because if it does, it cannot be held responsible for the infringement. Usually a DMCA takedown notice is sufficient to solve the problem. But occasionally one must escalate the dispute into copyright infringement litigation.
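For the curious, the statute (17 U.S.C. § 512(c)(3)) spells out the elements a takedown notice must contain. Here is a loose sketch, with hypothetical field names of my own choosing, of checking a draft notice against those elements — an illustration only, not legal advice or a complete notice:

```python
# Illustrative sketch of the elements 17 U.S.C. section 512(c)(3)
# requires in a takedown notice. Field names are hypothetical.

REQUIRED_FIELDS = [
    "signature",                # physical or electronic signature
    "work_identified",          # the copyrighted work claimed infringed
    "infringing_material",      # the material to be removed or disabled
    "contact_information",      # how the provider can reach the sender
    "good_faith_statement",     # belief the use is not authorized
    "accuracy_statement",       # accuracy and authority to act, under
                                # penalty of perjury
]

def validate_notice(notice: dict) -> list:
    """Return the required elements missing from a draft notice."""
    return [f for f in REQUIRED_FIELDS if not notice.get(f)]
```

A platform that receives a notice with all of these elements has every incentive to comply, for the safe harbor reasons described above.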

Some other things to keep in mind

With the protections afforded by free speech and the difficulties involved in winning an invasion of privacy or infliction of emotional distress lawsuit, one can get away with quite a bit when using photos in social media. One court has even observed that you do not need a person’s permission before tagging him or her in a photo.

A person offended by the use of a photo (whether a photo by him or of him) may have recourse even in situations not egregious enough to give him a right to sue. Social media platforms have terms of service that prohibit users from harassing others, impersonating others, or otherwise engaging in harmful conduct. The site will likely remove content once it is made aware of it. (I have sent many requests to Facebook’s legal department asking that content be removed — and it has been removed.) The norms of social media communities play an important role in governing how users treat one another, and that principle extends to the notions of civility as played out through the use of photos online.

Evan Brown is a Chicago technology and intellectual property attorney. He advises businesses and individuals in a wide range of situations, including social media best practices. 

Photo credit: AForestFrolic

Fair use, the DMCA, and presidential politics

The 2012 presidential election cycle is already giving internet law enthusiasts things to talk about. Last week it was Ron Paul’s grumblings about an unauthorized campaign ad on YouTube. Now NBC is moaning about a Mitt Romney ad comprised almost entirely of Tom Brokaw on the Nightly News in 1997.

NBC has asked the ad be pulled, claiming it is a copyright infringement. Smart people are already saying the ad is fair use. It probably is fair use.

And NBC knows that. Romney’s campaign posted the ad on YouTube five days ago, and it has yet to be the subject of a DMCA takedown notice. Though such a notice would be easy to draft and send, NBC is aware that the fallout could be expensive. Section 512(f) of the DMCA penalizes the senders of bogus takedown notices. And the courts have not taken kindly to purported victims of infringement who do not fully consider fair use before having content taken off YouTube.

With the election still months away, we may yet see controversial moves like those the news media made in 2008 to disable political content. These situations underscore the problem presented by how long it takes to process DMCA counternotifications and 512(f) actions.

A candidate’s defeat makes these processes moot. So maybe we should hope for a longer Republican primary season just so we can see some good DMCA and fair use litigation. Come on, NBC, send that takedown notice!

YouTube victorious in copyright case brought by Viacom

District court grants summary judgment, finding YouTube protected by DMCA safe harbor.

Viacom v. YouTube, No. 07-2103, (S.D.N.Y. June 23, 2010)

The question of whether and to what extent a website operator should be liable for the copyright infringement occasioned by the content uploaded by the site’s users is one of the central problems of internet law. In talks I’ve given on this topic of “secondary liability,” I’ve often referred to it simply as “the YouTube problem”: should YouTube be liable for the infringing content people upload, especially when it knows that there is infringing material?

Today was a big day in the history of that problem. The district court granted summary judgment in favor of YouTube in the notorious billion dollar copyright lawsuit brought against YouTube by Viacom way back in 2007.

The court held that the safe harbor provisions of the Digital Millennium Copyright Act (“DMCA”) (at 17 USC 512) protected YouTube from Viacom’s direct and secondary copyright claims.

Simply stated, the DMCA protects online service providers from liability for copyright infringement arising from content uploaded by end users if a number of conditions are met. Among those conditions are that the service provider “not have actual knowledge that the material or an activity using the material on the system or network is infringing,” or in the absence of such actual knowledge, “is not aware of facts or circumstances from which infringing activity is apparent.”

The major issue in the case was whether YouTube met these conditions of “non-knowledge” (that’s my term, not the court’s) so that it could be in the DMCA safe harbor. Viacom argued that the infringement was so pervasive on YouTube that the site should have been aware of the infringement and thus not in the safe harbor. YouTube of course argued otherwise.

The court sided with YouTube:

Mere knowledge of prevalence of such activity in general is not enough. . . . To let knowledge of a generalized practice of infringement in the industry, or of a proclivity of users to post infringing materials, impose responsibility on service providers to discover which of their users’ postings infringe a copyright would contravene the structure and operation of the DMCA.

Given the magnitude of the case, there’s little doubt this isn’t the end of the story — we’ll almost certainly see the case appealed to the Second Circuit Court of Appeals. Stay tuned.

Should ISPs get paid to respond to DMCA takedown notices?

CNET News is running a story about how Jerry Scroggin, the owner of Louisiana’s Bayou Internet and Communications, expects big media to pay him for complying with DMCA takedown notices. No doubt Scroggin gets a little PR boost for his maverick attitude, and CNET keeps its traffic up by covering a provocative topic. After all, people love to see the little guy stick it to the man.

Here is something from the article that caught my attention:

Small companies like [Bayou] are innocent bystanders in the music industry’s war on copyright infringement. Nonetheless, they are asked to help enforce copyright law free of charge.

A couple of assumptions in this statement need addressing. I submit that:

ISPs are not innocent bystanders.

As much as one may disdain the RIAA, the organization is enforcing legitimate copyright rights. Though an ISP may have no bad intent to help people infringe (i.e., the “innocent” part), infringing content does pass through their systems. And few would disagree that the owner of a system is in the best position to control what happens in that system. So unless we’re going to turn the entire network over to a government, we must rely on the ISPs at the lower levels of the network to comply with the DMCA. They owe a duty. It’s in this way that the ISPs are anything but innocent bystanders in the copyright wars. In fact, they’re soldiers (albeit perhaps drafted ones).

Though the administrative burdens of DMCA compliance fall on the ISPs, the work is not undertaken for free.

The safe harbor that ISPs enjoy in return for compliance is huge compensation. An entity in the safe harbor has more certainty that a suit for infringement would be unsuccessful. Were there more doubt about the outcome, there would be more litigation. More litigation equals more cost. And I guarantee you that those litigation costs would dwarf the administrative costs associated with taking down content identified in a notice. So subtract the administrative costs from the hypothetical litigation costs, and there you have the compensation paid to ISPs for compliance.

What do you think?

Pirate Christmas photo courtesy Flickr user Ross_Angus under this Creative Commons license.

DMCA reaches the decade mark

My friend Kevin Thompson over at Cyberlaw Central reminded me this morning in this post that President Clinton signed the Digital Millennium Copyright Act ten years ago today. Tempus fugit. It’s interesting to reflect on how this critical piece of legislation has affected (I think fostered) the growth of the online infrastructure with its safe harbor provisions found at 17 U.S.C. 512.

DMCA at 10 years

Simply stated, the DMCA at section 512 gives safe harbor protections to providers of interactive computer services (like ISPs and websites hosting user generated content) from liability when users upload content that infringes on another’s copyright rights. To sail its ship into the safe harbor, the provider has to take certain affirmative steps, like registering an agent with the Copyright Office, terminating the accounts of repeat infringers, and, most importantly, responding appropriately to “takedown notices” sent by copyright owners identifying infringing content on the provider’s system.

Though few could disagree with the principle of protecting service providers from infringement liability occasioned by the conduct of third party users (i.e., stemming from user generated content), the DMCA has its critics. And the actual mechanism has some bugs.

A big factor in the problem is the sheer volume of user generated content that’s put online. How can an operator like YouTube, who gets hours of new content loaded to its servers every minute, reasonably be expected to give meaningful review to every takedown notice that comes its way? It can’t.

So for practical reasons, big providers (and smaller ones alike) take down accused content essentially with a rubber stamp. And who can blame them? It saves administrative time and helps ensure safe harbor protection. But there are negative consequences to users and to the public. These consequences on the First Amendment and other rights are well-exemplified by the recent correspondence between the McCain-Palin campaign and YouTube, with amicus-like voices joining the chorus.

Like any ten-year old, the DMCA shows signs of maturity. It has withstood a decade of scrutiny, all the while giving service providers peace of mind, along with relatively efficient mechanisms for copyright owners to get infringing material taken down quickly. But also like a ten-year-old, the challenging years of adolescence — and the accompanying rudimentary changes — are around the corner. It’ll still be the DMCA, but I wouldn’t be surprised to see some transformation going on as user generated content becomes less a novelty and more a standard.

Birthday cake photo courtesy of Flickr user “juverna” via this Creative Commons license.

Veoh eligible for DMCA Safe Harbor

[Brian Beckham is a contributor to Internet Cases and can be contacted at brian.beckham [at] gmail dot com.]

Io Group, Inc. v. Veoh Networks, Inc., 2008 WL 4065872 (N.D.Cal. Aug. 27, 2008)

The U.S. District Court for the Northern District of California ruled that Veoh’s hosting of user-provided content is protected by the DMCA safe harbor provision, and that Veoh does not have a duty to police for potential copyright infringement on behalf of third parties, but rather must act to remove infringing content when put on notice.

IO produces adult films; Veoh hosts, inter alia, its own “Internet TV channels” and user-posted content (much like YouTube). In June 2006, IO discovered clips from ten (10) of its copyrighted films, ranging from 6 seconds to 40 minutes in length, hosted on Veoh. Rather than sending Veoh a DMCA notice-and-takedown letter, IO filed the instant copyright infringement suit. (Coincidentally, Veoh had already removed all adult content, including IO’s, sua sponte prior to the suit.) Had Veoh received such a notice, so the story goes, it would have removed the content and terminated the posting individual’s account.

When a user submits a video for posting, Veoh’s system extracts certain metadata (e.g., file format and length), assigns a file number, extracts several still images (seen on the site as an icon), and converts the video to Flash. Prior to posting, Veoh’s employees randomly spot check the videos for compliance with Veoh’s policies (i.e., that the content is not infringing third-party copyrights). On at least one occasion, such a spot check revealed infringing content (an unreleased movie) which was not posted.
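In pseudocode terms, the upload pipeline the court describes might look something like this — a loose sketch under my own assumptions, not Veoh’s actual system:

```python
import itertools
import random

# Loose sketch (not Veoh's actual code) of the upload pipeline the
# opinion describes: extract metadata, assign a file number, and route
# a random sample of uploads to a human spot check before posting.

_file_numbers = itertools.count(1)

def process_upload(filename, duration_seconds, spot_check_rate=0.1, rng=random):
    video = {
        "file_number": next(_file_numbers),          # assigned file number
        "format": filename.rsplit(".", 1)[-1],       # metadata extraction
        "duration": duration_seconds,
        "status": "posted",
    }
    # Employees randomly spot-check some uploads for compliance with
    # site policies (e.g., no third-party copyright infringement).
    if rng.random() < spot_check_rate:
        video["status"] = "pending_review"
    return video
```

The legally interesting point is the sampling step: the spot check is random rather than exhaustive, which is part of why the duty-to-police question mattered.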

Veoh moved for summary judgment under the DMCA’s safe harbors, which “provide protection from liability for: (1) transitory digital network communications; (2) system caching; (3) information residing on systems or networks at the direction of users; and (4) information location tools.” Ellison, 357 F.3d at 1076-77. Finding that Veoh is a service provider under the DMCA, the Court had little trouble in finding that it qualified for the safe harbors. IO admitted that Veoh “(a) has adopted and informed account holders of its repeat infringer policy and (b) accommodates, and does not interfere with, ‘standard technical measures’ used to protect copyrighted works,” but took issue with the manner in which Veoh implemented its repeat infringer policy.

Veoh clearly established that it had a functioning DMCA Notice & Takedown system:

  • Veoh has identified its designated Copyright Agent to receive notification of claimed violations and included information about how and where to send notices of claimed infringement.
  • Veoh often responds to infringement notices the same day they are received.
  • When Veoh receives notice of infringement, after a first warning, the account is terminated and all content provided by that user disabled.
  • Veoh terminates access to other identical infringing files and permanently blocks them from being uploaded again.
  • Veoh has terminated over 1,000 users for copyright infringement.

The Court held that Veoh did not have a duty to investigate whether terminated users were re-appearing under pseudonyms, but that as long as it continued to effectively address alleged infringements, it continued to qualify for the DMCA Safe Harbors; moreover, it did not have to track users’ IP addresses to readily identify possibly fraudulent new user accounts.

The Court further noted that: “In essence, a service provider [Veoh] is eligible for safe harbor under section 512(c) if it (1) does not know of infringement; or (2) acts expeditiously to remove or disable access to the material when it (a) has actual knowledge, (b) is aware of facts or circumstances from which infringing activity is apparent, or (c) has received DMCA-compliant notice; and (3) either does not have the right and ability to control the infringing activity, or – if it does – that it does not receive a financial benefit directly attributable to the infringing activity.”
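Read as a logic formula, the Court’s formulation reduces to a pair of conditions. Here is a rough sketch of that structure — an illustration of the quoted language, not a statement of the law:

```python
# Rough logical sketch of the Court's section 512(c) formulation.
# An illustration only; eligibility in a real case turns on the facts.

def eligible_for_512c_safe_harbor(
    knows_of_infringement: bool,
    acts_expeditiously_once_aware: bool,
    right_and_ability_to_control: bool,
    direct_financial_benefit: bool,
) -> bool:
    # (1) does not know of infringement, OR
    # (2) acts expeditiously to remove or disable access once it has
    #     knowledge, awareness, or a DMCA-compliant notice; AND
    # (3) either lacks the right and ability to control the infringing
    #     activity, or does not receive a direct financial benefit
    #     attributable to it.
    prong_1_or_2 = (not knows_of_infringement) or acts_expeditiously_once_aware
    prong_3 = (not right_and_ability_to_control) or (not direct_financial_benefit)
    return prong_1_or_2 and prong_3
```

On the Court’s findings below, Veoh satisfied both halves of this test.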

The Court found that (1) there was no question that Veoh did not know of the alleged infringement — since IO did not file a DMCA Notice (2) it acted expeditiously to remove user-posted infringing content, (3) it did not have actual knowledge of infringement, (4) it was not aware of infringing activity, and (5) it did not have the right and ability to control the infringing activity (the Court did not address any financial benefit).

In sum: the Court “[did] not find that the DMCA was intended to have Veoh shoulder the entire burden of policing third-party copyrights on its website (at the cost of losing its business if it cannot). Rather, the issue [was] whether Veoh [took] appropriate steps to deal with [alleged] copyright infringement.”

There is much speculation as to how, if at all, this case will affect the Viacom / YouTube case. YouTube praised the decision, Viacom noted the differences. Each case turns on its own facts, but to the extent there are similarities, this decision is wind in YouTube’s sails.

Case is: IO Group, Inc. (Plaintiff) v. Veoh Networks, Inc. (Defendant)