Newspaper not liable for alleged defamatory letter to editor published online

The Appellate Court of Illinois has sided with a local newspaper in a defamation lawsuit brought against the paper over a reader’s allegedly defamatory letter to the editor. The court held that the Communications Decency Act (at 47 U.S.C. §230) “absolved” the newspaper of liability for this appearance of third party content on the newspaper’s website.

Plaintiff — a lawyer and self-identified civil rights advocate — sent several letters to local businesses claiming those businesses did not have enough handicapped parking spaces. Instead of merely asking the businesses to create those parking spaces, he demanded each one pay him $5,000 or face a lawsuit.

One local resident thought plaintiff’s demands were greedy and extortionate, and wrote a letter to the editor of the local newspaper covering the story. The newspaper posted the letter online. Both the newspaper and the letter’s author found themselves as defendants in plaintiff’s defamation lawsuit.

The letter-writer settled with plaintiff, but the newspaper stayed in as a defendant and moved to dismiss, arguing that federal law immunized it from liability for content provided by the third party letter-writer.

The lower court dismissed the defamation claim against the newspaper, holding that the Communications Decency Act (at 47 U.S.C. §230) protected the newspaper from liability for the third party letter-writer’s comments posted on the newspaper’s website.

Plaintiff sought review with the Appellate Court of Illinois. On appeal, the court affirmed the dismissal.

The Communications Decency Act (at 47 U.S.C. §230(c)(1)) says that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The appellate court found that the letter-writer was another information content provider that placed comments on the newspaper’s website. Therefore, it held that the Communications Decency Act “absolved” the newspaper from responsibility.

Straw v. Streamwood Chamber of Commerce, 2015 IL App (1st) 143094-U (September 29, 2015)

Evan Brown is an attorney in Chicago helping clients manage issues involving technology and new media.

Third Circuit upholds Communications Decency Act immunity for Google, Yahoo and others

Plaintiff filed suit against Google, Yahoo and some unknown (John Doe) defendants for defamation, tortious interference with contract, and negligent and intentional infliction of emotional distress based on various online postings. The district court dismissed the complaint, holding that the Communications Decency Act (47 U.S.C. §230) provided immunity to defendants over the third party content giving rise to the complaint. Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because defendants were not the creators of the information, and the claims attempted to treat them as the publisher or speaker of that content, Section 230 barred the claims.

Kabbaj v. Google, Inc., 2015 WL 534864 (3rd Cir. Feb. 10, 2015)

About the Author: Evan Brown is a Chicago technology and intellectual property attorney. Call Evan at (630) 362-7237, send email to ebrown [at] internetcases.com, or follow him on Twitter @internetcases. Read Evan’s other blog, UDRP Tracker, for information about domain name disputes.

When is it okay to use social media to make fun of people?

There is news from California that discusses a Facebook page called 530 Fatties that was created to collect photos of and poke fun at obese people. It’s a rude project, and sets the context for discussing some intriguing legal and normative issues.

Apparently the site collects photos that are taken in public. One generally doesn’t have a privacy interest in being photographed while in public places. And that seems pretty straightforward if you stop and think about it — you’re in public after all. But should technology change that legal analysis? Mobile devices with good cameras connected to high-speed broadband networks make creation, sharing and shaming much easier than they used to be. A population equipped with these means essentially turns all public space into a panopticon. Does that mean the individual should be given more of something-like-privacy when in public? If you think that’s crazy, consider it in light of what Justice Sotomayor wrote in her concurrence in the 2012 case of U.S. v. Jones: “I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables [one] to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.”

Apart from privacy harms, what else is at play here? For the same reasons that mobile cameras + social media jeopardize traditional privacy assurances, the combination can magnify the emotional harms against a person. The public shaming that modern technology occasions can inflict deeper wounds because of the medium’s greater spatial and temporal reach. One can now easily distribute a photo or other content to countless individuals, and since the web means the end of forgetting, that content may be around for much longer than the typical human memory.

Against these concerns are the free speech interests of the speaking parties. In the U.S. especially, it’s hardwired into our sensibilities that each of us has great freedom to speak and otherwise express ourselves. The traditional First Amendment analysis will protect speech — even if it offends — unless there is something truly unlawful about it. For example, there is no free speech right to defame, to distribute obscene materials, or to use “fighting words.” Certain forms of harassment fall into the category of unprotected speech. How should we examine the role that technology plays in moving what would otherwise be playground-like bullying (like calling someone a fatty) to unlawful speech that can subject one to civil or even criminal liability? Is the impact of technology’s use even a valid issue to discuss?

Finally, we should examine the responsibility of the intermediaries here. A social media platform generally is going to be protected by the Communications Decency Act at 47 USC 230 from liability for third party content. But we should discuss the roles of the intermediary in terms other than pure legal ones. Many social media platforms are proactive in taking down otherwise lawful content that has the tendency to offend. The pervasiveness of social media underscores the power that these platforms have to shape normative values around what is appropriate behavior among individuals. This power is indeed potentially greater than any legal or governmental power to constrain the generation and distribution of content.

Evan Brown is an attorney in Chicago advising clients on matters dealing with technology, the internet and new media.

No Section 230 immunity for healthcare software provider

Company could be liable for modifications made to its software that provided abbreviated third-party warnings for prescription drugs.

Cases dealing with the Communications Decency Act often involve websites. See, for example, the recent decision from the Sixth Circuit involving thedirty.com, and earlier cases about Roommates.com and Amazon. But this case considered a rather novel proposed application of Section 230 immunity. The question was whether a provider of software that facilitated the delivery of prescription monographs (including warning information) could claim immunity. It’s unusual for Section 230 to show up in a products liability/personal injury action, but that is how it happened here.

Plaintiff suffered blindness and other injuries allegedly from taking medication she says she would not have taken had it been accompanied with certain warnings. She sued several defendants, including a software company that provided the technology whereby warnings drafted by third parties were provided to pharmacy retailers.

Defendant software company moved to dismiss on several grounds, including immunity under the Communications Decency Act, 47 U.S.C. 230. The trial court denied the motion to dismiss and defendant sought review. On appeal, the court affirmed the denial of the motion to dismiss, holding that Section 230 immunity did not apply.

At the request of the retailer that sold plaintiff her medicine, defendant software company modified its software to provide only abbreviated product warnings. Plaintiff’s claims against defendant arose from that modification.

Defendant argued that Section 230 immunity should protect it because defendant did not play any role in deciding the content of the product warnings. Instead, defendant was an independent provider of software that distributed drug information to pharmacy customers. Its software enabled pharmacies to access a third party’s database of product warnings. Defendant did not author the warnings but instead provided the information under an authorization in a data license agreement. Defendant thus functioned as a pass-through entity, distributing warnings that were prepared by third parties to retailers selling prescription drugs; those warnings were then printed and distributed to the individual customer when a prescription was filled.

The court found unpersuasive defendant’s claim that Section 230 immunized it from liability for providing electronic access to third party warnings. Section 230 provides, in relevant part, that (1) “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” and (2) “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

It held that plaintiff’s claim against defendant did not arise from defendant’s role as the software or service provider that enabled the retailer to access the third-party drafted warnings. Instead, the court found that plaintiff’s claim arose from defendant’s modification of its software to allow the retailer to distribute abbreviated drug monographs that automatically omitted warnings of serious risks. The appellate court agreed with the trial court, which found that “this is not a case in which a defendant merely distributed information from a third party author or publisher.” Instead, in the court’s view, by modifying the software so that only abbreviated warnings would appear, defendant participated in creating or modifying the content.

Hardin v. PDX, Inc., 2014 WL 2768863 (Cal. App. 1st June 19, 2014)

Sixth Circuit holds thedirty.com entitled to Section 230 immunity

Plaintiff Jones (a high school teacher and Cincinnati Bengals cheerleader) sued the website thedirty.com and its operator for defamation over a number of third party posts that said mean things about plaintiff. Defendants moved for summary judgment, arguing that the Communications Decency Act — 47 USC § 230(c)(1) — afforded them immunity from liability for the content created by third parties. Articulating a “goofy legal standard,” the district court denied the motion, and the case was tried twice. The first trial ended in a mistrial, and the second time the jury found in favor of plaintiff.

Defendants sought review with the Sixth Circuit Court of Appeals on the issue of whether the district court erred in denying defendants’ motion for judgment as a matter of law by holding that the CDA did not bar plaintiff’s state tort claims. On appeal, the court reversed the district court and ordered that judgment as a matter of law be entered in defendants’ favor.

Section 230(c)(1) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” At its core, § 230 grants immunity to defendant service providers in lawsuits seeking to hold the service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content.

But the grant of immunity is not without limits. It applies only to the extent that an interactive computer service provider is not also the information content provider of the content at issue. A defendant is not entitled to protection from claims based on the publication of information if the defendant is responsible, in whole or in part, for the creation or development of the information.

The district court held that “a website owner who intentionally encourages illegal or actionable third-party postings to which he adds his own comments ratifying or adopting the posts becomes a ‘creator’ or ‘developer’ of that content and is not entitled to immunity.” Thus, the district court concluded that “[d]efendants, when they re-published the matters in evidence, had the same duties and liabilities for re-publishing libelous material as the author of such materials.”

The appellate court held that the district court’s test for what constitutes “creation” or “development” was too broad. Instead, the court looked to the Ninth Circuit’s decision in Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) and adopted the material contribution test from that opinion:

[W]e interpret the term “development” as referring not merely to augmenting the content generally, but to materially contributing to its alleged unlawfulness. In other words, a website helps to develop unlawful content, and thus falls within the exception to section 230, if it contributes materially to the alleged illegality of the conduct.

In the Sixth Circuit’s language, “[A] material contribution to the alleged illegality of the content does not mean merely taking action that is necessary to the display of allegedly illegal content. Rather, it means being responsible for what makes the displayed content allegedly unlawful.”

In this case, the defendants did not author the statements at issue. But they did select the statements for publication. The court held that defendants did not materially contribute to the defamatory content of the statements simply because those posts were selected for publication. Moreover, the website did not require users to post illegal or actionable content as a condition of use. The website’s content submission form simply instructed users generally to submit content. The court found the tool to be neutral (both in orientation and design) as to what third parties submit. Accordingly, the website design did not constitute a material contribution to any defamatory speech that was uploaded.

Jones v. Dirty World, No. 13-5946 (6th Cir. June 16, 2014)

Evan Brown is an attorney in Chicago advising clients on matters dealing with technology, the internet and new media. Contact him.

Website operators not liable for third party comments

Spreadbury v. Bitterroot Public Library, 2012 WL 734163 (D. Montana, March 6, 2012)

Plaintiff was upset at some local government officials, and ended up getting arrested for allegedly trespassing at the public library. Local newspapers covered the story, including on their websites. Some online commenters said mean things about plaintiff, so plaintiff sued a whole slew of defendants, including the newspapers (as website operators).

The court threw out the claims over the online comments. It held that the Communications Decency Act at 47 U.S.C. 230 immunized the website operators from liability over the third party content.

Plaintiff argued that the websites were not protected by Section 230 because they were not “providers of interactive computer services” of the same ilk as AOL and Yahoo. The court soundly rejected that argument. It found that the websites provided a “neutral tool” and offered a “simple generic prompt” for subscribers to comment about articles. The website operators did not develop or select the comments, require or encourage readers to make defamatory statements, or edit comments to make them defamatory.

Six interesting technology law issues raised in the Facebook IPO

Patent trolls, open source, do not track, SOPA, PIPA and much, much more: Facebook’s IPO filing has a real zoo of issues.

The securities laws require that companies going public identify risk factors that could adversely affect the company’s stock. Facebook’s S-1 filing, which it sent to the SEC today, identified almost 40 such factors. A number of these risks are examples of technology law issues that almost any internet company would face, particularly companies whose product is the users.

(1) Advertising regulation. In providing detail about the nature of this risk, Facebook mentions “adverse legal developments relating to advertising, including legislative and regulatory developments” and “the impact of new technologies that could block or obscure the display of our ads and other commercial content.” Facebook is likely concerned about the various technological and legal restrictions on online behavioral advertising, whether in the form of mandatory opportunities for users to opt out of data collection or the more aggressive “do not track” idea. The value of the advertising is of course tied to its effectiveness, and any technological, regulatory or legislative measure to enhance user privacy is a risk to Facebook’s revenue.

(2) Data security. No one knows exactly how much information Facebook has about its users. Not only does it have all the content uploaded by its 845 million users, it has the information that could be gleaned from the staggering 100 billion friendships among those users. A data breach puts Facebook at risk of a PR backlash, regulatory investigations from the FTC, and civil liability to its users for negligence and other causes of action. But Facebook would not be left without remedy, having in its arsenal civil actions under the Computer Fraud and Abuse Act and the Stored Communications Act (among other laws) against the perpetrators. It is also likely the federal government would step in to enforce the criminal provisions of these acts as well.

(3) Changing laws. The section of the S-1 discussing this risk factor provides a laundry list of the various issues that online businesses face. Among them: user privacy, rights of publicity, data protection, intellectual property, electronic contracts, competition, protection of minors, consumer protection, taxation, and online payment services. Facebook is understandably concerned that changes to any of these areas of the law, anywhere in the world, could make doing business more expensive or, even worse, make parts of the service unlawful. Though not mentioned by name here, SOPA, PIPA, and do-not-track legislation are clearly in Facebook’s mind when it notes that “there have been a number of recent legislative proposals in the United States . . . that would impose new obligations in areas such as privacy and liability for copyright infringement by third parties.”

(4) Intellectual property protection. The company begins its discussion of this risk with a few obvious observations, namely, how the company may be adversely affected if it is unable to secure trademark, copyright or patent registration for its various intellectual property assets. Later in the disclosure, though, Facebook says some really interesting things about open source:

As a result of our open source contributions and the use of open source in our products, we may license or be required to license innovations that turn out to be material to our business and may also be exposed to increased litigation risk. If the protection of our proprietary rights is inadequate to prevent unauthorized use or appropriation by third parties, the value of our brand and other intangible assets may be diminished and competitors may be able to more effectively mimic our service and methods of operations.

(5) Patent troll lawsuits. Facebook notes that internet and technology companies “frequently enter into litigation based on allegations of infringement, misappropriation, or other violations of intellectual property or other rights.” But it goes on to give special attention to those “non-practicing entities” (read: patent trolls) “that own patents and other intellectual property rights,” which “often attempt to aggressively assert their rights in order to extract value from technology companies.” Facebook believes that as its profile continues to rise, especially in the glory of its IPO, it will increasingly become the target of patent trolls. For now it does not seem worried: “[W]e do not believe that the final outcome of intellectual property claims that we currently face will have a material adverse effect on our business.” Instead, those endeavors are a drain on resources: “[D]efending patent and other intellectual property claims is costly and can impose a significant burden on management and employees….” And there is also the risk that these lawsuits might turn out badly, and Facebook would have to pay judgments, get licenses, or develop workarounds.

(6) Tort liability for user-generated content. Facebook acknowledges that it faces, and will face, claims relating to information that is published or made available on the site by its users, including claims concerning defamation, intellectual property rights, rights of publicity and privacy, and personal injury torts. Though it does not specifically mention the robust immunity from liability over third party content provided by 47 U.S.C. 230, Facebook indicates a certain confidence in the protections afforded by U.S. law from tort liability. It is the international scene that gives Facebook concern here: “This risk is enhanced in certain jurisdictions outside the United States where our protection from liability for third-party actions may be unclear and where we may be less protected under local laws than we are in the United States.”

You have to hand it to the teams of professionals who have put together Facebook’s IPO filing. I suppose the billions of dollars at stake can serve as a motivation for thoroughness. In any event, the well-articulated discussion of these risks in the S-1 is an interesting read, and can serve to guide the many lesser-valued companies out there.

Video: my appearance on the news talking about isanyoneup.com

Last night I appeared in a piece that aired on the 9 o’clock news here in Chicago, talking about the legal issues surrounding isanyoneup.com. (That site is definitely NSFW and I’m not linking to it because it doesn’t deserve the page rank help.) The site presents some interesting legal questions, like whether and to what extent it is shielded by Section 230 of the Communications Decency Act for the harm that arises from the content it publishes (I don’t think it is shielded completely). The site also engages in some pretty blatant copyright infringement, and does not enjoy safe harbor protection under the Digital Millennium Copyright Act.

Here’s the video:

Amazon and other booksellers off the hook for sale of Obama drug use book

Section 230 of the Communications Decency Act shields Amazon, Barnes & Noble and Books-A-Million from some, but not all claims brought over promotion and sale of scandalous book about presidential candidate.

Parisi v. Sinclair, — F.Supp.2d —, 2011 WL 1206193 (D.D.C. March 31, 2011)

In 2008, Larry Sinclair made the ultra-scandalous claim that he had done drugs and engaged in sexual activity with then-presidential candidate Barack Obama. Daniel Parisi, owner of the infamous Whitehouse.com website, challenged Sinclair to take a polygraph test.

Not satisfied with the attention his outlandish claims had garnered, Sinclair self-published a book detailing his alleged misadventures. The book was available through print-on-demand provider Lightning Source.

Amazon, Barnes & Noble, and Books-A-Million (“BAM”) each offered Sinclair’s book for sale through their respective websites. (Barnes & Noble and BAM did not sell the book at their brick and mortar stores.) Each company’s website promoted the book using the following sentence:

You’ll read how the Obama campaign used internet porn king Dan Parisi and Ph.D. fraud Edward I. Gelb to conduct a rigged polygraph exam in an attempt to make the Sinclair story go away.

Parisi and his Whitehouse Network sued for, among other things, defamation and false light invasion of privacy. BAM moved to dismiss pursuant to Rule 12(b)(6) while Amazon and Barnes & Noble moved for summary judgment. The court granted the booksellers’ motions.

Section 230 applied because booksellers were not information content providers

The booksellers’ primary argument was that Section 230 of the Communications Decency Act shielded them from liability for plaintiffs’ claims concerning the promotional sentence. The court found in defendants’ favor on this point.

Section 230 provides in relevant part that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The major issue in this case was whether the online booksellers had provided the information comprising the promotional sentence. The court found that the pleadings (as to BAM) and the evidence (as to Amazon and Barnes & Noble) established, without any credible dispute, that the booksellers did not create or develop the promotional sentence.

But not so fast, Section 230, on some of those other claims!

The court’s treatment of Section 230 in relation to plaintiffs’ false light claim and the claims relating to the actual sale of the book was even more intriguing.

Plaintiffs argued that their false light claim was essentially a right of publicity claim. And Section 230(e)(2) says that immunity does not apply to claims pertaining to intellectual property. There is some confusion as to whether this exception to immunity applies only to federal intellectual property claims or to both federal and state IP claims. On one hand, Perfect 10, Inc. v. CCBill says that only federal intellectual property claims are excepted from immunity (which would mean that state law IP claims would be barred by Section 230). On the other hand, cases like Atlantic Recording Corp. v. Project Playlist, Doe v. Friendfinder Network and Universal Communication System v. Lycos suggest that both state and federal IP claims should withstand a Section 230 challenge.

In this case, the court indicated that it would have sided with the cases that provide for both federal and state claims making it past Section 230: “I am not inclined to extend the scope of the CDA immunity as far as the Ninth Circuit. . . . ”

But ultimately the court did not need to take sides as to the scope of Section 230(e)(2), as it found the use of plaintiff Parisi’s name fit into the newsworthiness privilege. One cannot successfully assert a misappropriation claim when his name or likeness is used in a newsworthy publication unless the use has “no real relationship” to the subject matter of the publication.

The court also seemed to constrain Section 230 immunity as it related to the online booksellers’ liability for selling the actual book. (Remember, the discussion above, in which the court found immunity to apply, dealt with the promotional sentence.) The court rejected defendants’ arguments that the reasoning of Gentry v. eBay should protect them. In Gentry, eBay was afforded immunity from violation of a warranty statute. But it merely provided the forum for the sale of goods, unlike the online booksellers in this case, which were the distributors of the actual allegedly defamatory book.

Even though Section 230 did not serve to protect BAM, Barnes & Noble and Amazon from liability for defamation arising from sales of the book, the court dismissed the defamation claim because of the lack of a showing that the booksellers acted with actual malice. It was undisputed that the plaintiffs were limited-purpose public figures. Persons with that status must show that the defendant acted with actual malice. That standard was not met here.

Section 230 shields Google from liability for anonymous defamation

Black v. Google Inc., 2010 WL 3746474 (N.D.Cal. September 20, 2010)

Back in August, the U.S. District Court for the Northern District of California dismissed a lawsuit against Google brought by two pro se plaintiffs, holding that the action was barred under the immunity provisions of 47 USC 230. That section says that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Plaintiffs had complained about a comment on Google (probably a review) disparaging their roofing business.

Plaintiffs filed an “objection” to the dismissal, which the court read as a motion to alter or amend under Fed. R. Civ. P. 59. The court denied plaintiffs’ motion.

In their “objection,” plaintiffs claimed — apparently without much support — that Congress did not intend Section 230 to apply in situations involving anonymous speech. The court did not buy this argument.

The court looked to the Ninth Circuit case of Carafano v. Metrosplash as an example of a website operator protected under Section 230 from liability for anonymous content: “To be sure, the website [in Carafano] provided neutral tools, which the anonymous dastard used to publish the libel, but the website did absolutely nothing to encourage the posting of defamatory content.” As in Carafano, Google was a passive conduit and could not be liable for failing to detect and remove the allegedly defamatory content.
