Tag Archives: Privacy

Casual website visitor who watched videos was not protected under the Video Privacy Protection Act

A recent federal court decision from the Southern District of New York sheds light on what is required to be considered a “consumer” who is protected under the Video Privacy Protection Act (VPPA). The court held that a website visitor who merely visited a website once in a while to watch videos — without establishing a more “deliberate and durable” affiliation with the website — was not a “subscriber” to the website’s services and thus the VPPA did not prohibit the alleged disclosure of information about the website visitor’s viewing habits.

Defendant was a television network that maintained a website offering video clips and episodes of many of its television shows. The website also incorporated Facebook’s software development kit which, among other things, let visitors log into websites using their Facebook credentials. This mechanism relied on cookies. If a person had chosen to remain logged into Facebook by checking the “keep me logged in” button on Facebook’s homepage, the relevant cookie would continue to operate, regardless of what the user did with the web browser. Plaintiff alleged that this mechanism caused defendant AMC to transmit information to Facebook about the video clips she watched on the AMC site.
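The cookie mechanism the plaintiff described can be sketched in a few lines of Python. This is a simplified model, not Facebook’s actual SDK: the endpoint, cookie name, and field names are all hypothetical. It only illustrates how a persistent third-party cookie, combined with a header identifying the referring page, can link viewing activity to a logged-in identity.

```python
# Hypothetical model of a browser fetching a third-party embed.
# When the page loads the embed, any persistent cookie for that third
# party rides along automatically, together with a Referer header that
# reveals which page (and thus which video) is being viewed.

def build_embed_request(page_url: str, cookies: dict) -> dict:
    """Simulate the request a browser sends to a third-party SDK endpoint."""
    return {
        "url": "https://thirdparty.example/sdk/impression",  # invented endpoint
        # The Referer header discloses the page hosting the video.
        "headers": {"Referer": page_url},
        # Only cookies scoped to the third party are sent with the request.
        "cookies": {k: v for k, v in cookies.items() if k == "session_id"},
    }

browser_cookies = {"session_id": "abc123", "unrelated": "x"}
req = build_embed_request("https://video.example/watch/clip-42", browser_cookies)
```

The point of the sketch is that the website operator need not explicitly “send” a user’s identity; the combination of a still-valid login cookie and the referring URL can be enough for the third party to make the connection.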

Plaintiff sued under the VPPA. Defendant moved to dismiss, arguing that plaintiff lacked standing under the statute and that she was not a protected “consumer” as required by the statute.

The court found that plaintiff had standing. It rejected defendant’s argument that a VPPA plaintiff must allege some injury in addition to asserting that defendant had violated the statute. “It is true . . . that Congress cannot erase Article III’s standing requirements by statutorily granting the right to sue to a plaintiff who would not otherwise have standing.” But Congress “can broaden the injuries that can support constitutional standing.”

The court next looked to whether plaintiff was a “consumer” protected under the statute. The VPPA defines the term “consumer” to include “any renter, purchaser, or subscriber of goods or services from a video tape service provider.” Absent any assertion that plaintiff was a renter or purchaser of AMC’s goods, the parties and the court focused on whether she was a “subscriber” (a term not defined in the statute).

Because plaintiff’s allegations failed to establish a relationship with defendant sufficient to characterize her as a subscriber of defendant’s goods or services, the court dismissed the VPPA claim with leave to amend. It observed: “Conventionally, ‘subscription’ entails an exchange between subscriber and provider whereby the subscriber imparts money and/or personal information in order to receive a future and recurrent benefit, whether that benefit comprises, for instance, periodical magazines, club membership, cable services, or email updates.” In this case, “[s]uch casual consumption of web content, without any attempt to affiliate with or connect to the provider, exhibit[ed] none of the critical characteristics of ‘subscription’ and therefore [did] not suffice to render [plaintiff] a subscriber of [defendant’s] services.”

Austin-Spearman v. AMC Network Entertainment LLC, 2015 WL 1539052 (S.D.N.Y. April 7, 2015)

Evan Brown is an attorney in Chicago helping clients manage issues involving technology and new media.

Best practices for providers of goods and services on the Internet of Things

Today the United States Federal Trade Commission issued a report in which it detailed a number of consumer-focused issues arising from the growing Internet of Things (IoT). Companies should pay attention to the portion of the report containing the Commission’s recommendations on best practices to participants (such as device manufacturers and service providers) in the IoT space.

The Commission structured its recommendations around four of the “FIPPs” – the Fair Information Practice Principles – which first appeared in the 1970s and which inform much of the world’s regulation geared to protect personal data. The recommendations focused on data security, data minimization, notice, and choice.

DATA SECURITY

IoT participants should implement reasonable data security. The Commission noted that “[o]f course, what constitutes reasonable security for a given device will depend on a number of factors, including the amount and sensitivity of data collected and the costs of remedying the security vulnerabilities.” Nonetheless, companies should:

  • Implement “security by design”
  • Ensure their personnel practices promote good security
  • Retain and oversee service providers that provide reasonable security
  • Implement a “defense-in-depth” approach where appropriate
  • Implement reasonable access control measures
  • Monitor products in the marketplace and patch vulnerabilities

Security by Design

Companies should implement “security by design” into their devices at the outset, rather than as an afterthought, by:

  • Conducting a privacy or security risk assessment to consider the risks presented by the collection and retention of consumer information.
  • Incorporating the use of “smart defaults” such as requiring consumers to change default passwords during the set-up process.
  • Considering how to minimize the data collected and retained.
  • Testing security measures before launching products.
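The “smart defaults” bullet above can be sketched as a setup routine that refuses to complete until the factory-default password has been changed. The names and the minimum-length rule are illustrative assumptions, not anything drawn from the Commission’s report:

```python
# A minimal sketch of a "smart default": device setup cannot finish
# while the factory-default password is still in place.

FACTORY_DEFAULT_PASSWORD = "admin"  # hypothetical shipped default
MIN_LENGTH = 8                      # illustrative strength floor

def complete_setup(new_password: str) -> bool:
    """Accept setup only with a changed, minimally strong password."""
    if new_password == FACTORY_DEFAULT_PASSWORD:
        return False  # default password must be replaced
    if len(new_password) < MIN_LENGTH:
        return False  # trivially weak replacements are also rejected
    return True
```

A check like this costs almost nothing at design time, which is the Commission’s point: the decision belongs in the product from the outset, not in a later patch.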

Personnel Practices and Good Security

Companies should ensure their personnel practices promote good security by making security an executive-level concern and training employees about good security practices. A company should not assume that the ability to write code is equivalent to an understanding of the security of an embedded device.

Retain and Oversee Service Providers That Provide Reasonable Security

The Commission urged IoT participants to retain service providers that are capable of maintaining reasonable security and to oversee those companies’ performance to ensure that they do so. On this point, the Commission specifically noted that failure to do so could result in FTC law enforcement action. It pointed to a recent (non-IoT) case in which a medical transcription company outsourced its services to independent typists in India who stored their notes in clear text on an unsecured server. Patients in the U.S. were shocked to find their confidential medical information showing up in web searches.

The “Defense-in-Depth” Approach

The Commission urged companies to take additional steps to protect particularly sensitive information (e.g., health information). For example, instead of relying on the user to ensure that data passing over his or her local wireless network is encrypted using the Wi-Fi password, companies should undertake additional efforts to ensure that data is not publicly available.
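One way to sketch this defense-in-depth idea in code is a device that establishes its own TLS session rather than trusting whatever security the user’s Wi-Fi network provides. This is a minimal illustration using Python’s standard ssl module; a real product would also manage or pin certificates appropriately:

```python
import ssl

# Sketch: the device encrypts its own traffic end-to-end with TLS,
# so sensitive data stays protected even on an open or weakly
# secured local network.

def make_device_tls_context() -> ssl.SSLContext:
    """Build a TLS context that insists on certificate verification."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocols
    ctx.check_hostname = True                     # verify the server name
    ctx.verify_mode = ssl.CERT_REQUIRED           # no unverified peers
    return ctx

ctx = make_device_tls_context()
```

The design choice being modeled is layering: the application-level encryption holds even if the outer layer (the home Wi-Fi password) fails.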

Reasonable Access Control Measures

While tools such as strong authentication can be used to control whether IoT devices interact with other devices or systems, the Commission cautioned that companies should ensure these measures do not unduly impede the usability of the device.

Monitoring of Products and Patching of Vulnerabilities

Companies may reasonably decide to limit the time during which they provide security updates and software patches, but must weigh these decisions carefully. IoT participants should also be forthright in their representations about providing ongoing security updates and software patches to consumers. Disclosing the length of time companies plan to support and release software updates for a given product line will help consumers better understand the safe “expiration dates” for their commodity internet-connected devices.

DATA MINIMIZATION

Data minimization refers to the concept that companies should limit the data they collect and retain, and dispose of it once they no longer need it. The Commission acknowledged the concern that requiring data minimization might curtail innovative uses of data. A new enterprise may not be able to reasonably foresee the types of uses it may have for information gathered in the course of providing a connected device or operating a service in conjunction with connected devices. Despite these concerns, the Commission recommended that companies consider reasonably limiting their collection and retention of consumer data.

The Commission observed how data minimization mitigates risk in two ways. First, the less information in a database, the less attractive the database is as a target for hackers. Second, having less data reduces the risk that the company providing the device or service will use the information in a way that the consumer does not expect.

The Commission provided a useful example of how data minimization might work in practice. It discussed a hypothetical startup that develops a wearable device, such as a patch, that can assess a consumer’s skin condition. The device does not need to collect precise geolocation information in order to work, but it has that capability. The device manufacturer believes that such information could be useful for a future product feature that would enable users to find treatment options in their area. The Commission observed that as part of a data minimization exercise, the company should consider whether it should wait to collect geolocation information until after it begins to offer the new product feature, at which time it could disclose the new collection and seek consent. The company should also consider whether it could offer the same feature while collecting less information, such as by collecting zip code rather than precise geolocation. If the company does decide it needs the precise geolocation information, the Commission would recommend that the company provide a prominent disclosure about its collection and use of this information, and obtain consumers’ affirmative express consent. And the company should establish reasonable retention limits for the data it does collect.
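The Commission’s skin-patch hypothetical can be sketched as a small data-minimization routine that drops precise coordinates unless the location feature is live and the user has given express consent. The field names and the consent flag are invented for illustration:

```python
# Hypothetical sketch of the FTC's wearable-patch example: precise
# geolocation is stripped before storage unless (a) the feature that
# needs it has launched and (b) the user has affirmatively consented.

def minimize_record(record: dict, feature_live: bool = False,
                    has_consent: bool = False) -> dict:
    """Return a copy of the record with unneeded geolocation removed."""
    minimized = dict(record)
    if not (feature_live and has_consent):
        minimized.pop("lat", None)
        minimized.pop("lon", None)
    return minimized

reading = {"skin_reading": 0.42, "zip": "60601",
           "lat": 41.8853, "lon": -87.6216}
stored = minimize_record(reading)  # coarse zip survives; coordinates do not
```

The same pattern captures the Commission’s fallback suggestion: when a coarser value (here, the zip code) serves the feature, the precise value never needs to be collected at all.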

As an aspect of data minimization, the Commission also discussed de-identification as a “viable option in some contexts” to help minimize data and the risk of potential consumer harm. But as with any conversation about de-identification, the Commission addressed the risks associated with the chances of re-identification. On this note, the Commission referred to its 2012 Privacy Report in which it said that companies should:

  • take reasonable steps to de-identify the data, including by keeping up with technological developments;
  • publicly commit not to re-identify the data; and
  • have enforceable contracts in place with any third parties with whom they share the data, requiring the third parties to commit not to re-identify the data.

This approach ensures that if the data is not reasonably de-identified and then is re-identified in the future, regulators can hold the company responsible.
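One common technical step toward de-identification, keyed hashing of direct identifiers, can be sketched with Python’s standard library. The key and the identifier field are hypothetical, and, as the Commission’s three-part approach implies, keyed hashing alone does not guarantee that data cannot be re-identified:

```python
import hashlib
import hmac

# Illustrative de-identification step: replace a direct identifier with
# a keyed hash so the raw value never appears in the shared data set.
# The key must be stored separately from the data it protects.

SECRET_KEY = b"kept-separately-and-rotated"  # hypothetical key

def pseudonymize(identifier: str) -> str:
    """Return a stable, non-reversible token for the identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```

Because the same input always yields the same token, records can still be linked for analysis; the contractual commitments the Commission describes are what backstop the technical measure.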

NOTICE AND CHOICE

Giving consumers notice that information is being collected, and the ability to make choices about that collection, is problematic in many IoT contexts. Data is collected continuously, by many integrated devices and systems, and getting a consumer’s consent in each context might discourage use of the technology. Moreover, often there is no easy user interface through which to provide notice and offer choice.

With these concerns in mind, the Commission noted that “not every data collection requires choice.” As an alternative, the Commission acknowledged the efficacy of a use-based approach. Companies should not be compelled, for example, to provide choice before collecting and using consumer data for practices that are consistent with the context of a transaction or the company’s relationship with a consumer. By way of example, the Commission discussed a hypothetical purchaser of a “smart oven”. The company could use temperature data to recommend another of the company’s kitchen products. The consumer would expect that. But a consumer would not expect the company to disclose information to a data broker or an ad network without having been given notice of that sharing and the ability to choose whether it should occur.

Given the practical difficulty of notice and choice on the IoT, the Commission acknowledged there is no one-size-fits-all approach. But it did suggest a number of mechanisms for communications of this sort, including:

  • Choices at point of sale
  • Tutorials (like the one Facebook uses)
  • QR codes on the device
  • Choices during setup
  • Management portals or dashboards
  • Icons
  • Out-of-band notifications (e.g., via email or text)
  • User-experience approach – “learning” what the user wants, and adjusting automatically

Conclusion

The Commission’s report does not have the force of law, but is useful in a couple of ways. From a practical standpoint, it serves as a guide for how to avoid engaging in flagrant privacy and security abuses on the IoT. But it also serves to frame a larger discussion about how providers of goods and services can and should approach the innovation process for the development of the Internet of Things.

Company facing liability for accessing employee’s Twitter and Facebook accounts

While plaintiff was away from the office recovering from a serious brain injury she suffered in a work-related auto accident, some of her co-workers accessed and posted, allegedly without authorization, from her Twitter and Facebook accounts. (There was some dispute as to whether those accounts were personal to plaintiff or whether they were intended to promote the company.) Plaintiff sued, alleging several theories, including violations of the Lanham Act and the Stored Communications Act. Defendants moved for summary judgment. The court dismissed the Lanham Act claim but did not dismiss the Stored Communications Act claim.

Plaintiff had asserted a Lanham Act “false endorsement” claim, which occurs when a person’s identity is connected with a product or service in such a way that consumers are likely to be misled about that person’s sponsorship or approval of the product or service. The court found that although plaintiff had a protectable interest in her “personal brand,” she had not properly put evidence before the court that she suffered the economic harm necessary for a Lanham Act violation. The record showed that plaintiff’s alleged damages related to her mental suffering, something not recoverable under the Lanham Act.

As for the Stored Communications Act claim, the court found that the question of whether defendants were authorized to access and post using plaintiff’s social media accounts should be left up to the jury (and not determined on summary judgment). Defendants had also argued that plaintiff’s Stored Communications Act claim should be thrown out because she had not shown any actual damages. But the court held plaintiff could be entitled to the $1,000 minimum statutory damages under the act even without a showing of actual harm.

Maremont v. Susan Fredman Design Group, Ltd., 2014 WL 812401 (N.D.Ill. March 3, 2014)

Massachusetts supreme court says cops should have gotten warrant before obtaining cell phone location data

Court takes a “different approach” with respect to one’s expectation of privacy

After defendant’s girlfriend was murdered in 2004, the police got a “D order” (an order authorized under 18 U.S.C. 2703(d)) from a state court to compel Sprint to turn over historical cell site location information (“CSLI”) showing where defendant placed telephone calls around the time of the girlfriend’s murder. Importantly, the government did not get a warrant for this information. After the government indicted defendant seven years later, he moved to suppress the CSLI evidence, arguing a violation of his Fourth Amendment rights. The trial court granted the motion to suppress, and the government sought review with the Massachusetts supreme court. That court affirmed, holding that a search warrant based on probable cause was required.

The government invoked the third party doctrine, arguing that no search in the constitutional sense occurred because CSLI was a business record of the defendant’s cellular service provider, a private third party. According to the government, the defendant could thus have no expectation of privacy in location information — i.e., information about his location when using the cell phone — that he voluntarily revealed.

The court concluded that although the CSLI at issue was a business record of the defendant’s cellular service provider, he had a reasonable expectation of privacy in it, and in the circumstances of this case — where the CSLI obtained covered a two-week period — the warrant requirement of the Massachusetts constitution applied. The court made a qualitative distinction in cell phone location records to reach its conclusion:

No cellular telephone user . . . voluntarily conveys CSLI to his or her cellular service provider in the sense that he or she first identifies a discrete item of information or data point like a telephone number (or a check or deposit slip…) … In sum, even though CSLI is business information belonging to and existing in the records of a private cellular service provider, it is substantively different from the types of information and records contemplated by [the Supreme Court’s seminal third-party doctrine cases]. These differences lead us to conclude that for purposes of considering the application of [the Massachusetts constitution] in this case, it would be inappropriate to apply the third-party doctrine to CSLI.

To get to this conclusion, the court avoided the question of whether obtaining the records constituted a “search” under the Fourth Amendment, but focused instead on the third party doctrine (and the expectation of privacy one has in information stored on a third party system) in relation to the Massachusetts constitution.

In a sense, though, the court gave the government another bite at the apple. It remanded the case to the trial court where the government could seek to establish that the affidavit submitted in support of its application for an order under 18 U.S.C. § 2703(d) demonstrated probable cause for the CSLI records at issue.

Commonwealth v. Augustine, — N.E.3d —, 2014 WL 563258 (Mass. February 18, 2014)

Hulk Hogan sex tape redux: Another court holds Gawker had First Amendment right to publish video excerpts

As we discussed here on internetcases back in November 2012, someone surreptitiously filmed Hulk Hogan engaged in sex acts with someone other than his wife. When Gawker posted an article and video excerpts about that, Hulk sued in federal court for invasion of privacy. The federal court denied his motion for a preliminary injunction, holding that to bar Gawker from publishing the information would be an unconstitutional prior restraint on speech.

A few weeks after the federal court denied his motion for preliminary injunction, Hulk voluntarily dismissed the federal case and filed a new case in state court. Unlike the federal court, the state court granted a preliminary injunction against Gawker publishing the information and the video excerpts. Gawker sought review with the Court of Appeal of Florida. On appeal, the court reversed the lower court’s order granting the preliminary injunction.

The state appellate court’s decision closely tracked the federal court’s reasoning from 2012. The court observed that where matters of purely private significance are at issue, First Amendment protections are often less rigorous. But speech on matters of public concern is “at the heart of the First Amendment’s protection.”

The court found that the sex tape excerpts and information that Gawker published were matters of public concern. Much of this was from Hulk’s own doing — he injected himself into the public spotlight not only as a professional wrestler, but also through books detailing his sexual indiscretions, radio interviews, and other public pronouncements about his “conquests.”

In arguing that Gawker’s speech was not of public concern, Hulk looked to Michaels v. Internet Entertainment Group, Inc., 5 F.Supp.2d 823 (C.D.Cal.1998), a case that dealt with the infamous sex tape that Bret Michaels and Pamela Anderson made. In that case, the court found defendant’s redistribution of the video was not protected by the First Amendment, in part because the distribution was purely commercial. The court here didn’t buy the comparison.

But wasn’t Gawker’s use commercial as well? The court drew a distinction:

We are aware that Gawker Media is likely to profit indirectly from publishing the report with video excerpts to the extent that it increases traffic to Gawker Media’s website. However, this is distinguishable from selling the [Hulk] Sex Tape purely for commercial purposes.

So the court found that despite his brawn, Hulk failed to carry his “heavy burden” of overcoming the presumption that a preliminary injunction would violate the First Amendment in this situation.

Gawker Media, LLC v. Bollea, 2014 WL 185217 (Fla.App. 2 Dist., January 17, 2014)

Evan Brown is a Chicago attorney helping businesses and individuals identify and manage issues dealing with technology development, copyright, trademarks, software licensing and many other matters involving the internet and new media. Call him at (630) 362-7237 or email ebrown@internetcases.com.

Is the future a trade between convenience and privacy?

This TechCrunch piece talks about how (predictably) Google wants to build the “ultimate personal assistant.” With Google collecting user preferences across platforms and applying algorithms to ascertain intentions, getting around in the world, purchasing things, and interacting with others could get a lot easier.

But at what cost? The success of any platform that becomes a personal assistant in the cloud would depend entirely on the collection of vast amounts of information about the individual. And since Google makes its fortunes on advertising, there is no reason to be confident that the information gathered will not be put to uses other than adding conveniences to the user’s life. Simply stated, the platform is privacy-destroying.

What if one wants to opt out of this utopically convenient future? Might such a person be unfairly disadvantaged by, for example, choosing to undertake tasks the “old fashioned” way, unassisted by the privacy-eviscerating tools? This points to larger questions about augmented reality. As a society, will we implement regulations to level the playing field between those who are augmented and those who are not? Questions of social justice in the future may take a different tone.

Can an LLC member violate the Stored Communications Act by accessing other members’ email?

Yes.

Two members of an LLC sued another member and the company’s manager of information services alleging violation of the Stored Communications Act, 18 U.S.C. § 2701 et seq. Defendants moved to dismiss for failure to state a claim. The court denied the motion.

Plaintiffs alleged that the LLC’s operating agreement required “Company decisions” to be made based on four of the five members voting in favor. The company had no policy in place authorizing the search and review of employees’ email messages, nor did it inform employees that their email may be accessed. Plaintiffs did not consent to their emails being searched and reviewed.

In connection with a dispute among the LLC members, one of them allegedly (in cooperation with the manager of information services) accessed the company’s email server using administrative credentials. She allegedly performed over 2,000 searches, retrieving other members’ communications of a personal nature, as well as communications with those members’ legal counsel.

Defendants moved to dismiss under Rule 12(b)(6), arguing that plaintiffs could not show the access was unauthorized. Defendants argued there was no electronic trespass, as the access was accomplished through ordinary company procedures.

The court rejected defendants’ arguments, finding that plaintiffs had sufficiently alleged an SCA violation, since plaintiffs had not consented to the access, and because no policy existed permitting an individual to search and review emails of members or employees absent the four-fifths approval required by the operating agreement.

Joseph v. Carnes, 2013 WL 2112217 (N.D.Ill. May 14, 2013)

Jury finds in favor of IMDb in case brought by actress over published age

Hoang v. IMDb.com, No. 11-1709, W.D.Wash. (Jury verdict April 11, 2013)

Actress Junie Hoang was upset that IMDb published her real age (she was born in 1971). She sued IMDb, claiming it breached its Subscriber Agreement (particularly its privacy policy) by using information she provided to cross-reference public records, thereby ascertaining her correct age.

The case went to trial on the breach of contract claim. The jury returned a verdict in favor of IMDb.

Though we don’t know the jury’s thinking (we only have a simple verdict form), IMDb had argued, among other things, that its investigation of plaintiff’s birthdate was in response to requests she had made. In 2008, plaintiff had asked IMDb to remove a false (1978) birthdate she had submitted a few years earlier. When IMDb conducted its own research, it found plaintiff’s real birthdate in public records, and published that. The jury found this did not violate IMDb’s Subscriber Agreement.

Email privacy is weak even with court oversight

Huntington Ingalls Inc. v. Doe, 2012 WL 5897483 (N.D. Cal. November 21, 2012)

A federal court in California has allowed a party to subpoena Google to learn the identity of a Gmail account owner, even though that owner did nothing to involve himself in the dispute.

A contractor that plaintiff hired accidentally emailed “property” belonging to plaintiff to the wrong email address. (The court’s opinion is not clear on the nature of this “property,” but we are safe in assuming it was some sort of proprietary information.) Plaintiff sent messages to the Gmail account seeking return of the property, but the unknown account owner did not respond.

Plaintiff filed suit in federal court against the anonymous account holder (John Doe) seeking declaratory and injunctive relief (i.e., to get the property back). Since plaintiff did not know Doe’s identity, it sought expedited discovery so that it could subpoena Google for the identifying information.

The court granted the motion for leave to send the subpoenas. It found that:

  • without the subpoena, plaintiff would have no other way to obtain “this most basic information”
  • the subpoena was the exclusive means available to plaintiff to protect its property interest
  • plaintiff’s proposed procedure guarded Doe’s due process rights by requiring Google to give Doe notice of the subpoena and an opportunity to object

The court’s opinion shows that any privacy interest in one’s email account information is tenuous at best. In this situation, the target of the unmasking efforts was, as they say, minding his own business, not doing anything to inject himself into any dispute.

Moreover, unlike many previous cases in which courts have required the party seeking discovery of an anonymous party’s identity to put forth facts showing it has a good case, there was no claim here that Doe did anything wrong. Instead, it was the sender’s mistake. One could find it unsettling to know that other people’s errors could cause a court to order his or her identity to be publicly revealed.

Photo courtesy Flickr user Bart Heird under this Creative Commons license.