Tuesday, January 24, 2017

Intimate large parties and the duty to protect privacy



I had to take a blog break over the holidays in order to get a hefty book read and to write a review of it.  I’ll post on that when it comes closer to publication.  Meanwhile, my, how the world has changed!  Let me kick off the new year with a look at some related developments in privacy law.

As Marion Oswald of the University of Winchester wrote recently in the journal Information & Communications Technology Law (open access), to paraphrase, privacy ain’t what it used to be.  Oswald opened with a quote from The Great Gatsby, so it goes without saying that the quote must be reiterated here.  She wrote,

At one of the Great Gatsby’s spectacular parties, the golf champion Jordan Baker remarked to Nick Carraway that she liked large parties: “They’re so intimate. At small parties there isn’t any privacy.”

From that paradox, Oswald builds the case that privacy must be redefined to protect individuals in the digital world.  She observes the inadequacy of the “reasonable expectation of privacy” (REP) test—the U.S. Fourth Amendment standard—given the objective test’s tendency to drive itself to extinction in a world of objectively diminishing privacy.  Kade Crockford with the ACLU of Massachusetts articulates this point brilliantly in her lectures.  Oswald is not the first to reach her conclusion, but she does so compellingly.

Two recent cases, from Pennsylvania and Massachusetts, reached different conclusions on the question of a corporate defendant’s duty to safeguard private data.  The cases show the struggle under way in U.S. courts to do just what Oswald proposed—to redefine privacy in the digital age.  The United States is increasingly at odds with Europe, and for that matter the rest of the world, on this question.  Heralded as a modern human right in Europe, data protection is a burgeoning global legal field—and corporate obligation.

Duty

First, a quick primer on duty in U.S. tort law.

Tort law in the United States usually provides for a “duty” by “default” in negligence—that is, all persons owe to all other persons a duty to exercise reasonable care (or not to act negligently), to avert harm to all others.  But the default rule of duty is subject to some important limitations.

One limitation is the economic loss rule, which circumscribes negligence liability.  The rule precludes a plaintiff’s action for nonphysical, economic injury alone.  There are plenty of exceptions to the rule, and some scholars even think it’s not really a rule at all.  For example, negligent misrepresentation, which is like fraud but without intent, can be supported by economic loss within the context and expectations of a business relationship.

Defamation and privacy torts can generate what looks like economic injury, but really they are animated by their own, sui generis classes of damages to reputation and personality.  U.S. privacy torts push in the European direction, but generally do not protect data voluntarily disclosed to third parties, such as employers and banks—a close relative of the REP problem.  That means no protection in privacy torts for financial data, even though it’s the stuff of identity theft.

The other limitation on duty by default is that U.S. law imposes no affirmative duty to protect, or to render aid.  This rule, too, is subject to many exceptions, such as a parent’s duty to protect a child, contractual and statutory duties to protect, and a duty not to abandon a rescue undertaken.

Here, as in privacy law, European legal codes diverge from U.S. common law in their greater willingness to impose affirmative duties.  In the United States, the affirmative-duty limitation also can relieve a corporate entity of a duty to safeguard data when the injury to the plaintiff is caused much more immediately by an intervening bad actor, such as a hacker or identity thief.  (The problem of proximate causation is integrally related.)

So on to the cases.  Remember, "[i]t takes two to make an accident."

Pennsylvania

A January 12 Pennsylvania court decision, Dittman v. UPMC (Leagle), held that an employer had no duty to safeguard employees’ private information on a workplace computer.  (Hat tip to Richard Borden at Robinson + Cole.)  Some 62,000 University of Pittsburgh Medical Center (UPMC) employees alleged disclosure of personal information in a data breach, resulting in the theft of identities and tax refunds.

The court applied a five-factor test for duty: 

1. the relationship between the parties;
2. the social utility of the actor's conduct;
3. the nature of the risk imposed and foreseeability of the harm incurred;
4. the consequences of imposing a duty upon the actor; and,
5. the overall public interest in the proposed solution.

UPMC prevailed in the common pleas and superior courts, the latter 2-1, arguing that it owed no duty to protect the plaintiffs’ interests.  On the affirmative-duty question, the court pointed to attenuated causation and professed willingness to defer to the state legislature.  As summarized by Brian J. Willett for the Reed Smith Technology Law Dispatch:

The Superior Court observed that the social utility of electronic information storage is high, and while harm from data breaches is foreseeable, an intervening third party stealing data is a superseding cause.

Additionally, the Court explained that a judicially created duty of care would be unnecessary to motivate employers to protect employee information, as “there are still statutes and safeguards in place to prevent employers from disclosing confidential information” in addition to business considerations.

Finally, the Court agreed with the trial court’s conclusion that creating a duty in this context would not serve the public interest; rather, it would interrupt the deliberative legislative process and expend judicial resources needlessly.

The court then bolstered its conclusion by pointing to the economic loss rule as well. 

Massachusetts

Just before the holiday break in December, the Massachusetts Appeals Court also decided a case in which the plaintiff alleged an employer’s negligence in safeguarding private data—though the plaintiff was a client of the employer rather than an employee.

The facts recited by the court in Adams v. Congress Auto Insurance Agency, Inc. (Justia), have the makings of a docudrama.  According to the court, Thomas was fleeing police at high speed when he crashed his car into Adams's.  Thomas was driving the car of his girlfriend, Burgos, so Adams claimed against Burgos’s auto insurance.  Meanwhile Burgos was both customer and customer service manager of defendant insurance agency Congress.  She reported her car stolen and filed her own insurance claim. 

Adams could identify Thomas.  So Burgos used her computer access at work to identify Adams and passed his identity to Thomas.  Thomas then phoned Adams, impersonated a state police officer, and threatened Adams: “‘Shut the F up and get your car fixed or you will have issues,’” the court purported to quote.  Though I bet Thomas didn’t say just “F.”

Adams sued Congress on multiple theories, including negligent failure to safeguard private data.  At the trial level, according to the appeals court, “the motion judge . . . rul[ed] that expert testimony was required to establish whether the agency owed a duty to Adams to safeguard his personal information, what that duty entailed, and whether the agency breached that duty.”

It’s odd that the motion judge sought expert testimony, because, as the appeals court aptly observed, duty is unique among the four elements of negligence—duty, breach, proximate cause, and injury—for being purely a question of law, guided by public policy.  Courts do not ordinarily hear expert testimony on what the law is.  The theory goes that figuring that out is the judge’s main job.  (Too bad, or being a law professor would be more lucrative.  I was gently tossed from the witness stand once when a lawyer made a valiant but futile attempt to squeeze me past the rule.)

Unlike the Pennsylvania Superior Court, the Massachusetts Appeals Court found its way to a legal duty.  The court held “that the agency had a legal duty to Adams, a member of a large but clearly defined class of third parties, to prevent its employee’s foreseeable misuse of the information that Adams provided to process his automobile insurance claim.”  Where the Pennsylvania court had pointed to statute to justify judicial restraint, the Massachusetts court pointed to state data breach law to show that the legislature had green-lighted legal duty (albeit "a single green light, minute and far away").

“Just as those with physical keys to the homes of others have a duty of reasonable care to preserve their security,” the Massachusetts court reasoned, “companies whose employees have access to the confidential data of others have a duty to take reasonable measures to protect against the misuse of that data.”  Indeed, the court cited a keys case as applicable precedent.  The court made no fuss over the rule of affirmative duty or the rule of economic loss.  In a discussion of causation, the court seemed content to resort to foreseeability on the facts.

Summary judgment for defendant Congress was vacated, and the case was remanded for trial.

Conclusion

Advocates who wish to block European-style data protection in the United States use the availability of state tort law remedies as one tool in the toolbox, arguing that U.S. law already safeguards personal data well enough to satisfy both sides of the Atlantic.  That’s not true.  Not yet.

Data protection in the United States is confounded by the rules of affirmative duty and economic loss.  And that’s not bad; those rules exist for sound public policy reasons.  They also are excepted for sound reasons.

I’ve written before (e.g., here and here) that popular thinking and expectations with respect to individual privacy are converging in the United States and Europe, even if a legal bridge lags behind.  Common law negligence can be a vital building block of that bridge.  But it’s a work in progress.

“‘Don’t believe everything you hear, Nick.’”
