Company facing liability for accessing employee’s Twitter and Facebook accounts

While plaintiff was away from the office recovering from a serious brain injury she suffered in a work-related auto accident, some of her co-workers accessed her Twitter and Facebook accounts and posted from them, allegedly without authorization. (There was some dispute as to whether those accounts were personal to plaintiff or whether they were intended to promote the company.) Plaintiff sued, alleging several theories, including violations of the Lanham Act and the Stored Communications Act. Defendants moved for summary judgment. The court granted the motion as to the Lanham Act claim but denied it as to the Stored Communications Act claim.

Plaintiff had asserted a Lanham Act “false endorsement” claim, which arises when a person’s identity is connected with a product or service in such a way that consumers are likely to be misled about that person’s sponsorship or approval of the product or service. The court found that although plaintiff had a protectable interest in her “personal brand,” she had not properly put before the court evidence that she suffered the economic harm necessary for a Lanham Act violation. The record showed that plaintiff’s alleged damages related to her mental suffering, something not recoverable under the Lanham Act.

As for the Stored Communications Act claim, the court found that the question of whether defendants were authorized to access and post using plaintiff’s social media accounts should be left up to the jury (and not determined on summary judgment). Defendants had also argued that plaintiff’s Stored Communications Act claim should be thrown out because she had not shown any actual damages. But the court held plaintiff could be entitled to the $1,000 minimum statutory damages under the act even without a showing of actual harm.

Maremont v. Susan Fredman Design Group, Ltd., 2014 WL 812401 (N.D. Ill. March 3, 2014)

Massachusetts supreme court says cops should have gotten warrant before obtaining cell phone location data

Court takes a “different approach” with respect to one’s expectation of privacy

After defendant’s girlfriend was murdered in 2004, the police got a “D order” (an order authorized under 18 U.S.C. § 2703(d)) from a state court to compel Sprint to turn over historical cell site location information (“CSLI”) showing where defendant placed telephone calls around the time of the girlfriend’s murder. Importantly, the government did not get a warrant for this information. After the government indicted defendant seven years later, he moved to suppress the CSLI evidence, arguing a violation of his Fourth Amendment rights. The trial court granted the motion to suppress, and the government sought review with the Massachusetts Supreme Judicial Court. That court agreed that a search warrant based on probable cause was required.

The government invoked the third party doctrine, arguing that no search in the constitutional sense occurred because CSLI was a business record of the defendant’s cellular service provider, a private third party. According to the government, the defendant could thus have no expectation of privacy in location information — i.e., information about his location when using the cell phone — that he voluntarily revealed.

The court concluded that although the CSLI at issue was a business record of the defendant’s cellular service provider, he had a reasonable expectation of privacy in it, and in the circumstances of this case — where the CSLI obtained covered a two-week period — the warrant requirement of the Massachusetts constitution applied. The court made a qualitative distinction in cell phone location records to reach its conclusion:

No cellular telephone user . . . voluntarily conveys CSLI to his or her cellular service provider in the sense that he or she first identifies a discrete item of information or data point like a telephone number (or a check or deposit slip…) … In sum, even though CSLI is business information belonging to and existing in the records of a private cellular service provider, it is substantively different from the types of information and records contemplated by [the Supreme Court’s seminal third-party doctrine cases]. These differences lead us to conclude that for purposes of considering the application of [the Massachusetts constitution] in this case, it would be inappropriate to apply the third-party doctrine to CSLI.

To reach this conclusion, the court avoided the question of whether obtaining the records constituted a “search” under the Fourth Amendment, focusing instead on the third party doctrine (and the expectation of privacy one has in information stored on a third party’s system) in relation to the Massachusetts constitution.

In a sense, though, the court gave the government another bite at the apple. It remanded the case to the trial court where the government could seek to establish that the affidavit submitted in support of its application for an order under 18 U.S.C. § 2703(d) demonstrated probable cause for the CSLI records at issue.

Commonwealth v. Augustine, — N.E.3d —, — Mass. —, 2014 WL 563258 (Mass. February 18, 2014)

Is the future a trade between convenience and privacy?

This TechCrunch piece discusses how (predictably) Google wants to build the “ultimate personal assistant.” With Google collecting user preferences across platforms and applying algorithms to ascertain users’ intentions, getting around in the world, purchasing things, and interacting with others could become a lot easier.

But at what cost? The success of any platform that becomes a personal assistant in the cloud would depend entirely on the collection of vast amounts of information about the individual. And since Google makes its fortune on advertising, there is no reason to be confident that the information gathered will not be put to uses other than adding conveniences to the user’s life. Simply stated, the platform is privacy-destroying.

What if one wants to opt out of this utopically convenient future? Might such a person be unfairly disadvantaged by, for example, choosing to undertake tasks the “old-fashioned” way, unassisted by the privacy-eviscerating tools? This points to larger questions about augmented reality. As a society, will we implement regulations to level the playing field between those who are augmented and those who are not? Questions of social justice in the future may take a different tone.
