
Thursday, February 1, 2024

Naming rape suspects may draw criminal charges for journalists under Northern Ireland privacy law

Bernard Goldbach via Flickr CC BY 2.0
In Northern Ireland, it's a crime for a journalist to identify a rape suspect.

The relevant provision is part of the country's Justice (Sexual Offences and Trafficking Victims) Act 2022. Attorney Fergal McGoldrick of Carson McDowell in Belfast detailed the law for The International Forum for Responsible Media Blog in October 2023, just after the law took effect.

The law applies to a range of sexual offenses, including rape. The prohibition on identification expires upon an arrest warrant, criminal charge, or indictment. If no such step in prosecution lifts the prohibition, it remains in force until 25 years after the suspect's death. The act also amended preexisting privacy law to afford comparable anonymity to victims.

I have deep experience with this issue, and it is fraught. Despite my strong preference for transparency in government, especially in policing, the law has merit.

I was a university newspaper editor back in ye olden days of paper and ink. My newspaper reported vigorously on a nearby university student's accusations of sexual assault against a student at our own university. The accusations and ensuing criminal investigation gripped the campus.

We learned the identity of both suspect and accuser. We reported the former and concealed the latter. Discussing the matter as an editorial board, we were uncomfortable with this disparity. Having the suspect be a member of our own community and the accuser an outsider amplified our sensitivity to a seeming inequity. We did take measures to minimize use of the suspect's name in the reporting.

These were the journalistic norms of our time. Naming the accuser was unthinkable. This was the era of "the blue dot woman," later identified as Patricia Bowman (e.g., Seattle Times). The nation was enthralled by her allegation of rape against American royalty, William Kennedy Smith. In the 1991 televised trial, Bowman, a witness in court, was clumsily concealed by a floating blue dot, the anonymizing technology of the time.

Smith was acquitted. The case was a blockbuster not only for TV news, but for journalism, raising a goldmine of legal and ethical issues around criminal justice reporting and cameras in the courtroom.

There was no anonymity for Smith. I went to a Society of Professional Journalists (SPJ) conference around this time, and the issues were discussed in a huge plenary session in a ballroom. The crowd exuded self-loathing for the trauma journalism itself had piled on Bowman. Objectivity be damned, many speakers beat the drums for the pillorying of the acquitted Smith.

The calculation in journalism ethics with regard to Smith, and thus for my editorial board, was that police accountability (knowing who is being investigated, charged, or detained) and public security (alerting the public to a possible threat, or eliciting exonerating evidence from the public) outweighed the risk of reputational harm that reporting might cause to the accused. Moreover, ethicists of the time reasoned, it would be paternalistic to assume that the public doesn't understand the difference between a person accused and a person convicted.

Then, in my campus case, the grand jury refused to indict. Our reporting uncovered evidence that the accusation might have been exaggerated or fabricated.

Our editorial hearts sank. Had we protected the wrong person?

My co-editor and I discussed the case countless times in the years that followed. We agonized. It pains me still today. Thirty years later, I find myself still retracing the problem, second-guessing my choices. It's like a choose-your-own-adventure where you feel like you're making the right choice each time you turn the pages, yet your steps lead you inevitably to doom.

Idealistically committed as we were at that age to freedom-of-information absolutism, we were inclined to the anti-paternalistic argument and reasoned that probably we should have named everyone from the start and let the public sort it out.

In our defense, a prior and more absolutist generation of norms in journalism ethics prevailed at the time. I was there at SPJ in the following years as leading scholars worked out a new set of norms, still around today, that accepts the reality of competing priorities and evinces more flexible guidance, such as, "minimize harm." Absolutism yielded to nuance. Meanwhile, the internet became a part of our lives, and both publication and privacy were revolutionized.

So in our present age, maybe the better rule is the Northern Ireland rule: anonymize both sides from the start. 

I recognize that there is a difference in a free society between an ethical norm, by which persons decide not to publish, and a legal norm, which institutes a prior restraint. I do find the Northern Ireland rule troublesomely draconian. The law would run headlong into the First Amendment in the United States. Certainly, I am not prepared to lend my support to the imprisonment of journalists.

Yet the problem with the leave-it-to-ethics approach is that we no longer live in a world in which mass media equate to responsible journalism. From where we sit in the internet era, immersed in the streaming media of our echo chambers, the SPJ Code of Ethics looks ever more a relic hallowed by a moribund belief system.

In Europe, the sophisticated privacy-protective regime of the General Data Protection Regulation (GDPR) is more supportive of the Northern Ireland approach than the U.S. First Amendment is. The UK has continued to adhere to the GDPR regime since Brexit. The GDPR reflects the recognition in European law of privacy and data protection as human rights, to be held in balance with the freedoms of speech and press. Precisely this balance was at issue in 2022 in Bloomberg LP v. ZXC, in which the UK Supreme Court concluded that Bloomberg media were obligated to consider a suspect's privacy rights before publishing even an official record naming him in a criminal investigation.

McGoldrick wrote "that since Bloomberg most media organisations have, save in exceptional circumstances, elected not to identify suspects pre-charge, thus affording editors the discretion to identify a suspect, if such identification is in the public interest."

Maybe the world isn't the worse for it.

Wednesday, May 17, 2023

Mass., EU courts wrestle with requisite harm in defamation, data protection cases

The vexing problem of proof of damages in defamation and privacy has turned up recently in the Massachusetts Court of Appeals and the Court of Justice of the European Union. Meanwhile, the Massachusetts Gaming Commission borrowed European privacy principles for new data security rules.

Tiny turkey. Stéphanie Kilgast via Flickr CC BY-NC-ND 2.0
'Stolen' Turkey Money in Massachusetts

The Appeals Court in April vacated dismissal in a business dispute over turkeys. Nonprofit and business collaborators fell out over spending on variably sized turkeys for a charitable food event. The defendant wrote on social media that the plaintiff "stole" money intended for charitable purposes.

The complaint, which was filed by a Massachusetts lawyer, was messy—narrative in excess, numbering in disarray, and allegations jumbled between liability theories—so it was difficult for the trial court to parse the pleadings. With the aid of oral argument on appeal, the court teased out the defamation count and determined that it had been dismissed for want of pleaded loss.

However, Massachusetts is among jurisdictions that continue to recognize the historical doctrines of libel per se and slander per se. Those doctrines allow some pleadings to proceed without allegation of loss, and for good reason. Reputational harm is exceedingly difficult to prove, even when it seems self-evident. After all, whom should a plaintiff call to testify to prove her damaged reputation, people who now think an awful falsity about her? Witnesses will be less than eager. Even in case of a business plaintiff that suffers economic loss, it can be exceedingly difficult to tie specific losses to specific assertions of falsity.

The historical approach allows a plaintiff to demand presumed damages. That's a messy solution, because the jury is entrusted with broad discretion to assess the damages. On the plaintiff side, perhaps that's OK; we trust juries to measure intangible losses all the time, as in the case of general damages for injuries, or pain and suffering. The defense bar and allied tort reformers have rebelled against presumed damages, though, arguing that they afford juries a blank check. That unpredictability makes it difficult for defendants and insurers to assess their liability exposure. Defense-oriented tort reformers have been successful in extinguishing per se defamation actions in many U.S. states.

Massachusetts splits the difference, I think in a healthy way. Per se actions are preserved, but the plaintiff is entitled only to nominal damages plus proved actual losses, not presumed damages. I mentioned recently that the E. Jean Carroll case has spurred overblown commentary about the potential of defamation law to redress our misinformation problem. The unavailability of per se actions in many states is one reason that defamation is not up to the job. A defamation action for nominal damages helps, though, coming about as close as U.S. jurisdictional doctrine allows to a declaration of truth—which is what defamation plaintiffs usually most want.

Allegation of a crime, such as theft or misappropriation of charitable funds, fits the class of cases that qualify for per se doctrine, whether libel or slander. There is some room for debate about whether social media better fits the historical mold of libel or slander, but that's immaterial here. The allegation of "stolen" money fit the bill.

The Appeals Court thus vacated dismissal and remanded the claim for defamation and related statutory tort. The court clerk entered the Memorandum and Order for Judges Mary Thomas Sullivan, Peter Sacks, and Joseph M. Ditkoff in Depena v. Valdez, No. 22-P-659 (Mass. App. Ct. Apr. 28, 2023).

Austrian post box.
High Contrast via Wikimedia Commons CC BY 3.0 DE

Non-Consensual Political Analysis in Austria

The Court of Justice of the European Union (CJEU) also recently tussled with a problem of proof of damages. The court held early in May that a claimant under the EU General Data Protection Regulation (GDPR) must claim harm for a personal data processing violation, but need not meet any threshold of seriousness.

The court's press release summarized the facts in the case:

From 2017, Österreichische Post collected information on the political affinities of the Austrian population. Using an algorithm, it defined "target group addresses" according to socio-demographic criteria. The data thus collected enabled Österreichische Post to establish that a given citizen had a high degree of affinity with a certain Austrian political party. However, that data processed were not communicated to third parties.

The citizen in question, who had not consented to the processing of his personal data, claimed that he felt great upset, a loss of confidence and a feeling of exposure due to the fact that a particular affinity had been established between him and the party in question. It is in the context of compensation for the non-material damage which he claims to have suffered that he is seeking before the Austrian courts payment of the sum of €1,000.

The plaintiff endeavored to quantify his emotional upset, but in the absence of communication of the conclusions about the plaintiff to any third party, the claim of harm was thin. Emotional suffering resulting from the mere processing of personal data in contravention of one's advance permissions seems minimal. Accordingly, the Austrian courts, following the example of neighboring Germany, were inclined to disallow the plaintiff's action for failure to demonstrate harm.

Harm has been a sticking point in privacy law in the United States, too. Privacy torts are a relatively modern development in common law, and they don't import the per se notion of historical defamation doctrine. Tort law balances culpability with harm to patrol the borders of social contract. Thus, intentional battery is actionable upon mere unwanted touching, while merely accidental infliction of harm requires some degree of significance of injury. Defamation law arguably defies that dynamic, especially in per se doctrine, in part for the reasons I explained above, and in part because, for much of human history, personal integrity has been as essential for survival as physical security.

Not having inherited the paradigm-defying dynamic, privacy law has posed a puzzle. Scholars disagree whether damages in privacy should follow the example of business torts, requiring at least economic loss; the example of emotional distress torts, requiring some threshold of severity; or the example of defamation per se torts, recognizing some sui generis harm in the disruption of personal integrity. As personal data protection has grown into its own human right independent of privacy, the problem has been amplified, because, exactly as in the Austrian case, a right against the non-consensual processing of data that are personal, but not intimately personal, is even more difficult to generalize and quantify.

The problem is not only a European one. In the United States, courts and scholars have disagreed over when claims in the burgeoning wave of state data protection laws, such as the Illinois Biometric Information Privacy Act, can satisfy the "case or controversy" constitutional requirement of jurisdiction. Failure to see a sui generis harm in privacy violations means, arguably, that there is no "case or controversy" over which courts, particularly federal courts, have competence.

The CJEU balked at Austrian courts' unwillingness to see any wrong upon a claim of only intangible loss. But the court agreed that the plaintiff must demonstrate harm. Hewing to the text of the GDPR, the court reasoned that a plaintiff must show a violation of the regulation, a resulting harm, and a causal connection between the two. Thus, harm is required, but there is no requirement that the harm meet some threshold of seriousness or economic measure.

The CJEU decision was touted in headlines as "clarifying" the law of damages under the GDPR, while the stories beneath the headlines tended to do anything but. Some writers said that the court raised the bar for GDPR claims, and others said the court lowered it. The confusion stems from the fact that the court's decision spawns many subsequent questions. Conventionally, the GDPR leaves the quantum of damages to national courts. So how must a claim of de minimis harm be measured on remand? Are nominal damages sufficient compensation, or must the data protection right be quantified?

Moreover, Sara Khalil, an attorney with Schönherr in Vienna, observed that the court left out a component of tort liability that national courts sometimes require: culpability. Is there a minimal fault standard associated with recovery for mere data processing? Because tort law ties together the elements of harm and fault, at least in some jurisdictions, the one question necessarily begets the other.

RW v. Österreichische Post AG, No. C-154/21 (May 4, 2023), was decided in the First Chamber of the CJEU.

Data Security in Gambling in Massachusetts

Policymakers and courts on both sides of the Atlantic are wrestling with the problems of contemporary personal data protection. And while the gap between the GDPR and patchwork state and federal regulation in the United States has stressed international relations and commerce, it's no wonder that we see convergence in systems trying to solve the same problems.

To wit, the Massachusetts Gaming Commission has employed recognizably European privacy principles in new data security rules. For Israeli law firm Herzog Fox & Neeman, attorneys Ariel Yosefi, Ido Manor, and Kevin David Gampel described the overlap. The commission adopted the regulations for emergency effect in December 2022; final rules were published in April.

The attorneys detailed the requirements of gambling operators:

  • to establish and plainly disclose to players comprehensive data privacy policies, including measures regarding data collection, storage, processing, security, and disclosure, the latter including the specific identities of third-party recipients; 
  • to guarantee player rights including access, correction, objection, withdrawal of consent, portability, and complaint;
  • to eschew purely automated decision-making; and
  • to implement physical, technical, and organizational security practices.

The regulations are 205 CMR 138 and 205 CMR 248 (eff. Mar. 9, 2023, publ. Apr. 28, 2023).
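
For readers who think in data structures, the enumerated player rights translate naturally into code. What follows is a minimal, hypothetical Python sketch of how an operator might model and route the rights listed above; the names, routing, and structure are my illustrative assumptions, not anything drawn from the Massachusetts regulations or any operator's actual system.

# Hypothetical sketch only: models the player rights enumerated above
# (access, correction, objection, withdrawal of consent, portability,
# complaint) and routes a request to an internal handler.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum, auto


class PlayerRight(Enum):
    ACCESS = auto()
    CORRECTION = auto()
    OBJECTION = auto()
    WITHDRAW_CONSENT = auto()
    PORTABILITY = auto()
    COMPLAINT = auto()


@dataclass
class RightsRequest:
    player_id: str
    right: PlayerRight
    received: datetime
    details: str = ""


def route_request(req: RightsRequest) -> str:
    """Return the (hypothetical) team that would handle this request."""
    routing = {
        PlayerRight.ACCESS: "data-access team",
        PlayerRight.CORRECTION: "records team",
        PlayerRight.OBJECTION: "privacy office",
        PlayerRight.WITHDRAW_CONSENT: "privacy office",
        PlayerRight.PORTABILITY: "data-access team",
        PlayerRight.COMPLAINT: "compliance office",
    }
    return routing[req.right]


if __name__ == "__main__":
    req = RightsRequest("player-123", PlayerRight.PORTABILITY,
                        datetime.now(timezone.utc))
    print(route_request(req))  # -> data-access team

The point of the sketch is only that each regulatory right implies a concrete workflow an operator must build and disclose, which is much of what makes the rules recognizably European in flavor.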

Saturday, April 22, 2023

Lissens presents EU data protection, IoT research

Sylvia Lissens, a Ph.D. student and teaching assistant at the KU Leuven Centre for Global Governance Studies in Belgium, presented part of her doctoral research comparing U.S. and EU data protection law at a doctoral seminar in Lyon, France, in December.

In her research, Lissens focuses on the internet of things (IoT) to examine how American and European law protects the personal data that machines increasingly collect. She has a law degree from KU Leuven and a background in criminology, so she is especially interested in government access to personal data, which has been a sticking point in trans-Atlantic privacy negotiations.

Looking at the emerging norms in state legislation in the United States, on the one hand, and at developing data protection jurisprudence in the European Union, on the other hand, Lissens hopes to identify points of convergence and divergence that might smooth the way forward for agreement over data flows.

In Lyon, Lissens presented findings from the EU leg of her research at the International Doctoral Seminar in European and International Human Rights Law, hosted by the Université Jean Moulin Lyon 3. She explained how the broad range of data collected by devices in our homes, from phones to refrigerators, will confront national security and international trade regimes with new challenges in the protection of personal privacy.

Comparative law is among Lissens's teaching responsibilities at KU Leuven. She joined my Comparative Law class by Zoom this semester to provide an EU perspective on contemporary European legal issues. Students' experience was greatly enriched by both her professional expertise and her informed perspectives as a Belgian voter. I'm privileged to serve on Lissens's dissertation committee.

Wednesday, November 23, 2022

Anti-corruption law violates business-owner privacy, EU court holds with myopic appraisal of transparency

A key European Union transparency law that allows watchdogs to trace corporate ownership to combat corruption has been struck down by the EU high court for compromising personal privacy.
EU beneficial owner registry map from Transparency International, 2021. Read more.
CC BY-NC-ND 3.0

I'm not a hard skeptic on the personal privacy prerogative of the EU General Data Protection Regulation. To the contrary, I've written that there's a lot to like about the emerging global privacy norms embodied in the GDPR, and, contrary to conventional wisdom, American social expectations, if not yet federal law, are converging with Europe's.

That said, the EU Court of Justice yesterday announced a profoundly problematic decision at the junction of public access and personal privacy. The blanket disclosure requirements of a key anti-money-laundering law can't stand, the court held, because they don't calibrate the public need for access with the privacy of natural-person business owners with sufficient precision, that is, as a function of necessity and proportionality.

Troublingly, the court characterized transparency norms, which are grounded in treaty and law more firmly in the EU than in the United States, as especially relevant to the public sector and not fully implicated in the private sector, that is, in the context of business regulation.

The potential implication of this proposition is that access to information is limited to a requester learning "what the government is up to," to the exclusion of government oversight of the private sector. That's a cramped and problematic construction of access law that has dogged journalists and NGOs using the U.S. Freedom of Information Act (FOIA) for decades. Read more in Martin Halstuk and Charles Davis's classic 2002 treatment. As I have written in my comparative research on access to information, access to and accountability of the private sector is a problem of our times. We must solve it if we're to save ourselves from the maw of corporatocracy.

In my opinion, the CJEU decision fundamentally misunderstands and overstates the legitimate scope of data protection regulation with the effect of enervating transparency as a vital oversight tool. The impact is ironic, considering that data protection regulation came about as a bulwark to protect the public from private power. The court turned that logic on its head by using personal privacy to shield commercial actors from public scrutiny.

Unfortunately (for this purpose), I have my hands full in Europe (coincidentally) right now, and I lack time to write more. Fortunately, Helen Darbishire and the team at Access Info Europe already have written a superb summary. Their lede:

In a ruling that has sent shockwaves through Europe’s anti-corruption and transparency community, the Court found that the Fifth Anti-Money Laundering Directive (AMLD5, 2018) is too loosely framed and provides for overly-wide public access to the [ownership] registers without a proper justification of the necessity and proportionality of the interference with the rights to privacy and personal data protection of the beneficial owners.

A saving grace, Access Info observed, is that the court did not rule out transparency per se; rather, requesters will have to fight for access case by case on the facts, upon a properly narrowed regulation. In U.S. constitutional terms, it's like saying the one-size-fits-all law was struck for vagueness, but the regulatory objective still can be achieved under a narrower rule that works as applied. All the same, journalists and non-profit watchdogs are not famously well financed to fight for access on a case-by-case basis.

The case is No. C‑37/20 & No. C‑601/20 in the Grand Chamber of the CJEU.

Saturday, January 1, 2022

Code might inevitably regulate journalism in digital age

The U.K. Information Commissioner's Office is working on a "journalism code of practice" to legislate against defamation and invasion of privacy by mass media.

Principally and ostensibly, the code is intended to bring media law into conformity with U.K. data protection law, essentially the European General Data Protection Regulation (GDPR), including the storied "right to be forgotten," or right to erasure (RTBF). On the ground, the picture is more complicated. The British phone hacking scandal and the ensuing Leveson Inquiry constitute a strong causal thread in public receptiveness to media regulation.

Cambridge legal scholar David Erdos analyzed the draft code for the INFORRM public in part one and part two postings in October.  The code incorporates media torts such as defamation and misuse of private information (MOPI), the latter a common law innovation of British courts to facilitate enforcement of data protection rights. I have posited in other venues that common law tort similarly might provide a way forward to fill gaps in information privacy law in the United States.

Journalism and data protection rights have been on a collision course for a quarter century, like a slow-motion car wreck, and the draft journalism code is a harbinger of the long anticipated impact.  Back in 1995, when the EU GDPR-predecessor Data Protection Directive was brand new, the renowned media law scholar Jane Kirtley published an article in the Iowa Law Review, "The EU Data Protection Directive and the First Amendment: Why a 'Press Exemption' Won't Work."  Kirtley foresaw data protection and the First Amendment's arguably irreconcilable differences before most U.S. scholars had even heard of data protection.

In those innocent days, journalism ethics was reshaping itself to preserve professionalism in the newly realized and anxiety-inducing 24/7 news cycle.  A key plank in the new-ethics platform was that ethical self-regulation was essential to staving off legal regulation.  In 2000, media law attorney Bruce Sanford published the book Don't Shoot the Messenger: How Our Growing Hatred of the Media Threatens Free Speech for All of Us.  Then in 2001, everything changed, and mass media and their consumers became engrossed by new concerns over government accountability.

In a way, the consolidation of media regulation in a generation of code could be a relief for journalism, especially on the European continent.  In an age of ever more complex regulatory mechanisms, codification can offer bright lines and safe harbors to guard against legal jeopardy.  Information service providers from local newspapers to transnationals such as Google are struggling to comply with new legal norms such as the RTBF, and there is as yet little evidence of uniformity of norms, much less convergence. Yet even if industry ultimately embraces the security of code, what's good for business is not necessarily good for wide-ranging freedom of expression. 

Courts, too, are struggling with novel problems.  For example, in late November, the European Court of Human Rights ruled in Biancardi v. Italy that RTBF de-indexing orders extend beyond search engines and bind original news publishers.  Writing for Italian Tech and INFORRM, attorney Andrea Monti fairly fretted that the decision effectively compels journalistic organizations to expend resources in constant review of their archives, else face liability in data protection law.  The result, Monti reasoned, will be to discourage preservation, manifesting a threat to the very existence of historical record.

On the one hand, it's foolish to wring one's hands for fear that journalism is being newly subordinated to legal regulation.  Tort itself is a regulatory mechanism, and defamation has been around for a long time, notwithstanding the seeming absolutism of the First Amendment.  On the other hand, media regulation by law looks nothing like the punctilious supervision of regulated industries, including the practice of law.

In my own education, I found the contrast in approaches to ethics perplexing.  In journalism school, my ethics class had been taught aptly by a religion scholar who led impassioned discussions about handout hypotheticals.  In law school, the textbook in legal profession hit the desk with a thud for what was as much a study of model or uniform code as was crim or sales.

With no "First Amendment" per se, media regulation by code is not the novelty in the U.K. that it would be in the United States.  Still, with privacy and digital rights sweeping the globe, law is poised to regulate journalism in new ways everywhere, whether through the subtlety of common law or the coercive power of civil regulation.  American courts will not be able to escape their role in reshaping fundamental rights for the digital world, as European courts are at work doing now.  Kirtley foresaw the issues in 1995, and the chickens are slowly but surely turning up at the roost.

The present ICO consultation closes on January 10, 2022.

Wednesday, October 27, 2021

In parting meditation on pub gossip, Czech judge peels onion on privacy limits, judicial transparency

Does GDPR pertain to pub buzz? AG Bobek asks.
Earlier this month, Czech judge and legal scholar Michal Bobek rounded out a six-year term as an Advocate General (AG) of the European Court of Justice with a mind-bending meditation on the ultimate futility of enforcing data protection law as written and a confirmation of the essentiality of transparency in the courts.

The case on which Bobek opined hardly required a deep dive.  He said so: "This case is like an onion," he wrote.  "I believe that it would be possible, and in the context of the present case entirely justified, to remain at that outer layer.   No peeling of onions unless expressly asked for."

But the case provided Bobek an optimal diving board, and, on the penultimate day of his term as AG, he plunged and peeled.

Complainants in the case were litigants before the Dutch Council of State (Raad van State).  They asserted that disclosure to a journalist of summary case information, from which they could be identified and details of their personal lives worked out, violated their right of privacy under the General Data Protection Regulation (GDPR) of the European Union, as transposed into Dutch law.

The disclosures are permissible under a GDPR exemption for judicial activities, Bobek concluded.  But en route to that conclusion, he further opined that the potentially unbridled scope of the GDPR must be tamed to accord with social norms and democratic imperatives.

With remarkably plain reasoning, he framed the problem in a comfortable venue:

If I go to a pub one evening, and I share with four of my friends around the table in a public place (thus unlikely to satisfy the private or household activity exception of ... the GDPR) a rather unflattering remark about my neighbour that contains his personal data, which I just received by email (thus by automated means and/or is part of my filing system), do I become the controller of those data, and do all the (rather heavy) obligations of the GDPR suddenly become applicable to me? Since my neighbour never provided consent to that processing (disclosure by transmission), and since gossip is unlikely ever to feature amongst the legitimate grounds listed in ... the GDPR, I am bound to breach a number of provisions of the GDPR by that disclosure, including most rights of the data subject contained in Chapter III.

The pub might not be the only place where the GDPR runs up against a rule of reason.  Consider the more nuanced problem of footballers weighing a challenge against the processing of their performance stats.  Goodness; the pub convo will turn inevitably to football.

Let's step back for a second and take stock of the GDPR from the perspective of the American street.

Americans don't get many wins anymore.  We just retreated from a chaotic Afghanistan, despite our fabulously expensive military.  We resist socialized healthcare, but we make cancer patients finance their treatments on Go Fund Me.  We force families into lifelong debt to pay for education, undermining the social mobility it's supposed to provide.  We afford workers zero vacation days and look the other way from the exploitation of gig labor.  Our men's soccer team failed to qualify for the last World Cup and Olympics, while we're not sure why our women are rock stars; it can't be because we pay them fairly.  When it comes to personal privacy, we tend to want it, but our elected representatives seem eager to cede it to our corporate overlords.

Truth be confessed, then, Americans are willing to engage in a smidge of schadenfreude when Europeans—with their peace, their healthcare, their cheap college, their Ryanair Mediterranean vacations, their world-class football, and their g—d— G—D—P—R—get themselves tied up in regulatory knots over something like the sufficient size of a banana.  Ha.  Ha.

Therein lies the appeal, to me, of Judge Bobek's train of thought.  He finds inevitable the conclusion that posting case information is data processing within the purview of the GDPR.  The parties did not even dispute that.  For today, Bobek found an out through the GDPR exemption for the business of the courts in their "judicial capacity."

The out required a stretch to accommodate posting information for journalists, which is not, most strictly speaking, a judicial capacity.  Bobek reasoned by syllogism:  For the courts to do what they do, to act in the judicial capacity, they require judicial independence.  Judicial independence is maintained by ensuring public confidence in the judiciary.  Public confidence in the judiciary is bolstered by transparency in the courts.  Transparency in the courts is facilitated by the provision of case information to journalists.  Therefore, the judicial capacity requires publication of case information to journalists.

The problem, tomorrow, is that there is no answer in the case of pub gossip.  Bobek meditated on the human condition: "Humans are social creatures.  Most of our interactions involve the sharing of some sort of information, often at times with other humans. Should any and virtually every exchange of such information be subject to the GDPR?"

Bobek
Can't be, he concluded.

[I]n my view, I suspect that either the Court, or for that matter the EU legislature, might be obliged to revisit the scope of the GDPR one day. The current approach is gradually transforming the GDPR into one of the most de facto disregarded legislative frameworks under EU law. That state of affairs is not necessarily intentional. It is rather the natural by-product of the GDPR's application overreach, which in turn leads to a number of individuals being simply in blissful ignorance of the fact that their activities are also subject to the GDPR. While it might certainly be possible that such protection of personal data is still able to "serve mankind," I am quite confident that being ignored as a result of being unreasonable does not in fact serve well or even contribute to the authority or legitimacy of any law, including the GDPR.

While we await reassessment of the bounds of data privacy law in modern society, Bobek opined more and mightily on the importance of judicial transparency as a countervailing norm.  He opened the opinion with philosopher-jurist Jeremy Bentham:

"Publicity is the very soul of justice. It is the keenest spur to exertion, and the surest of all guards against impropriety.… It is through publicity alone that justice becomes the mother of security. By publicity, the temple of justice is converted into a school of the first order, where the most important branches of morality are enforced...."

Bobek later picked up the theme:

Judging means individualised detail brought to the public forum....

On the one hand, the basis for judicial legitimacy in an individual case are its facts and details. The judge settles an individual case. His or her job is not to draft abstract, general, and anonymous rules detached from individual facts and situations. That is the job of a legislature. The more a judicial decision departs from or hides the factual background to a public court case, or if it is later reported with significant limitations, the more often it becomes incomprehensible, and the less legitimate it becomes as a judicial decision.

On the other hand, ever since the Roman age, but presumably already earlier, if a claimant asked for the help of the community or later the State to have his claim upheld and enforced by the State, he was obliged to step into the public forum and let his case be heard there. In classical Roman times, the applicant was even entitled to use violence against the respondent who refused to appear in the public (the North Eastern part of the Roman Forum called comitium), before the magistrate (seated on a rolling chair on a tribune higher than the general public—hence indeed tribunal), when called before a court (in ius vocatione).

It is true that, later on, there were other visions of the proper administration of justice and its publicity. They are perhaps best captured by a quote from a judge in the Parlement de Paris writing in 1336 instructions to his junior colleagues, and explaining why they should never disclose either the facts found or the grounds for their decision: "For it is not good that anyone be able to judge concerning the contents of a decree or say 'it is similar or not'; but garrulous strangers should be left in the dark and their mouths closed, so that prejudice should not be caused to others.... For no one should know the secrets of the highest court, which has no superior except God...."

In the modern age, returning to the opening quote of Jeremy Bentham, it is again believed that even garrulous strangers should be allowed to see and understand justice. Certainly, with the arrival of modern technologies, a number of issues must continuously be re-evaluated so that garrulous strangers cannot cause prejudice to others....

Naturally, the publicity of justice is not absolute. There are well-grounded and necessary exceptions. The simple point to keep in mind here is: what is the rule and what is the exception. Publicity and openness must remain the rule, to which naturally exceptions are possible and sometimes necessary. However, unless the GDPR were to be understood as imposing a revival of the best practices of the Parlement de Paris of the 14th century, or other elements of the Ancien Régime or the Star Chamber(s) for that matter, it is rather difficult to explain why, in the name of the protection of personal data, that relationship must now be reversed: secrecy and anonymity were to become the rule, to which openness could perhaps occasionally become the welcome exception.

Bobek seems content with judicial exceptionalism in the GDPR framework.  I'm not so sure.  I rather think the problem of the courts points to the broader problem of GDPR scope.  Will there ultimately be a pub exception, too?  Stubborn American insistence on framing data protection as business regulation, as in California data protection law, suddenly exhibits some appeal.

The case is X v. Autoriteit Persoonsgegevens, No. C-245/20, Opinion of Advocate General Bobek (Oct. 6, 2021).  HT @ Edward Machin, writing in London for Ropes & Gray.

This is not Bobek's first high-profile opinion on the GDPR—even this year.  Read in Fortune about his January opinion in a Facebook case.

Friday, June 4, 2021

First Amendment advocate counsels caution, but doesn't rebuff, American right to be forgotten

Gene Policinski, Freedom Forum Senior Fellow for the First Amendment, published an op-ed last week for the "First Five" blog in which he counseled caution, but did not gainsay, newsroom "fresh start," or "right to be forgotten" (RTBF), programs.

Motivated in part by European notions of personal data protection, or informational privacy, especially RTBF, fresh start programs give persons covered in past news an opportunity to apply for the erasure of their coverage from online archives.  For NPR in February, David Folkenflik and Claire Miller reported on trending fresh start programs at major U.S. news outlets, such as The Boston Globe, "Revisiting the Past for a Better Future."  The NPR stories observed that these programs have come about in part because of European legal norms, even for newspapers beyond the reach of European legal jurisdiction.

In 2013, I wrote in a law review article that Americans' expectations of privacy, including RTBF, are in fact consonant with evolving European norms, but American law has been slow to keep pace.  The twin notions of finite punishment for past wrongs and of a second chance for persons who have paid their dues are quintessentially American, I wrote in a Washington Post op-ed in 2014.  Those values are reflected, for example, in Eighth Amendment jurisprudence and the Ban the Box campaign.

A prohibitive challenge to RTBF norms in the United States has been the First Amendment, which generally prohibits regulation of the republication of lawfully obtained and truthful information.  Sometimes for better and sometimes for worse, the free-speech absolutist bent of the First Amendment contrasts with a more flexible European approach to rights balancing.  Nothing about the First Amendment, however, precludes a private journalistic enterprise, such as the Globe, from erasing content voluntarily.

Like RTBF itself, fresh start programs have been criticized by free speech and mass communication scholars.  They remind us that journalism is the "first rough draft of history."  Tinkering with archives therefore vests private actors with a weighty, not to mention expensive, responsibility on behalf of the public.  Fresh start advocates point out that this work is not dissimilar to the exercise of news judgment in the first instance.  But the perspective problem is not eliminated by time.  There is no way to be sure that our present-day second-guessing of the historical record is more fair and objective than the original judgment, nor sufficiently preservationist for the future.

Old Slave Mart Museum, Charleston, S.C.
(RJ Peltz-Steele CC BY-NC-SA 4.0)
Just last week, I visited the Old Slave Mart Museum and other historical sites in Charleston, S.C.  To my eyes, the casual treatment of persons as property in the content of news media in times of slavery, as well as racism evident in later media during Jim Crow, is evidence of horrific injustice and a powerful reminder not to take for granted that one's present vision is free of bias.  What if that record had been erased, rather than preserved?  Could Henry Louis Gates Jr.'s "Finding Your Roots" have identified Ben Affleck's slave-owning ancestor (NPR) if history were redacted?

At the same time, I am an advocate for RTBF in some form, just as I support Ban the Box.  I am devoted to the First Amendment.  But digital media, that is, an internet that "never forgets," confronts our society with a new and qualitatively different challenge from any we have faced before.  Viktor Mayer-Schönberger well described in his 2011 book, Delete: The Virtue of Forgetting in the Digital Age, how forgetting, in addition to remembering, is an essential and well evolved part of human social culture.  A failure to forget is an existential threat.

Journalist and academic Deborah L. Dwyer has developed a useful and thought-provoking set of fresh start resources for journalists at her website, Unpublishing the News, cited by Policinski.  I don't pretend to know whether fresh start, or European RTBF, or some other approach is the best solution, nor whether any of these models will stand the test of time.  I do believe that feeling our way forward is fascinating and necessary.

The op-ed is Gene Policinski, Perspective: News Outlets Need Caution in Offering a "Fresh Start," Freedom Forum (May 26, 2021).

Monday, February 8, 2021

UK court: Long arm of GDPR can't reach California*

Image my composite of Atlantic Ocean by Tentotwo CC BY-SA 3.0
and "hand reach" from Pixabay by ArtsyBee, licensed

*[UPDATE, Jan. 30, 2022: On December 21, 2021, the Court of Appeal allowed service on U.S. defendants without ultimately resolving the GDPR territorial scope question.  Read more from Paul Kavanaugh, Dylan Balbirnie, and Madeleine White at Dechert LLP.]

A High Court ruling in England limited the long-arm reach of European (now British) privacy law in a suite of tort claims against Forensic News, a California-based web enterprise doing "modern investigative journalism."

The complainant is a security consultant investigated by Forensic News and a witness in the U.S. Senate Intelligence Committee probe into Russian interference in the 2016 U.S. elections.  A British national, he accused Forensic News of "malicious falsehood, libel, harassment and misuse of private information," the latter based on violation of the British enactment of the European General Data Protection Regulation (GDPR).

The extraterritorial reach of the GDPR has been a hot topic lately in privacy law circles, as U.S. companies struggle to comply simultaneously with foreign and burgeoning state privacy laws, such as the California Consumer Privacy Act (CCPA).  

Forensic News has no people or assets in the UK, but the complainant tried to ground GDPR application in the news organization's website, which accepts donations in, and sells merch for, pounds and euros.  No dice, said the court; it's journalism that links Forensic to the plaintiff and to the UK, not the mail-order side show.

The case is Soriano v. Forensic News LLC, [2021] EWHC 56 (QB) (Jan. 15, 2021).  Haim Ravia, Dotan Hammer, and Adi Shoval at Pearl Cohen have commentary.

Monday, October 5, 2020

U.S. White Paper on 'Schrems II': Emperor still clothed

A new U.S. white paper on data protection means to supplement, favorably to the U.S. position, the record on U.S. surveillance practices that, in part, fueled the European Court of Justice (ECJ) decision in "Schrems II," in July, rejecting the adequacy of the Privacy Shield Framework to secure EU-to-US data transfers.

From the U.S. Department of Commerce, Department of Justice, and Office of the Director of National Intelligence, the white paper suggests that the ECJ ruling was interim in nature, pending investigation of U.S. national security practices to better understand whether they comport with EU General Data Protection Regulation norms, such as data minimization, which means collecting only data necessary to the legitimate purpose at hand.  The paper states:

A wide range of information about privacy protections in current U.S. law and practice relating to government access to data for national security purposes is publicly available.  The United States government has prepared this White Paper to provide a detailed discussion of that information, focusing in particular on the issues that appear to have concerned the ECJ in Schrems II, for consideration by companies transferring personal data from the EU to the United States. The White Paper provides an up-to-date and contextualized discussion of this complex area of U.S. law and practice, as well as citations to source documents providing additional relevant information. It also provides some initial observations concerning the relevance of this area of U.S. law and practice that may bear on many companies’ analyses. The White Paper is not intended to provide companies guidance about EU law or what positions to take before European courts or regulators. 

Armed with this additional information, then, the message to the private sector seems to be, Keep Calm and Carry On, using the very same "standard contractual clauses" (SCCs) whose efficacy the ECJ called into question.  Yet if the information featured in the white paper has been publicly available, why assume that the ECJ was ill informed?  (Read more about SCC revisions under way, and their likely shortcomings, at IAPP.)

Unfortunately for the U.S. position, the ECJ opinion was not, to my reading, in any way temporary, or malleable, pending further development of the record.  The white paper comes off as another installment in the now quarter-century-old U.S. policy that the emperor is fully clothed.

I hope this white paper is only a stop-gap.  As I said in a Boston Bar CLE recently, no privacy bill now pending in Congress will bridge the divide between the continents on the subject of U.S. security surveillance.  A political negotiation, which might involve some give from the American side at least in transparency, seems now to be our only way forward.

The white paper is Information on U.S. Privacy Safeguards Relevant to SCCs and Other EU Legal Bases for EU-U.S. Data Transfers after Schrems II (Sept. 2020).

Friday, September 25, 2020

Boston Bar panel surveys landscape of privacy law, data protection policy, class action litigation

Attorneys Melanie Conroy, Marjan Hajibandeh, and Matthew M.K. Stein
We had great fun yesterday, as lawyer fun goes, talking about privacy law in the United States, from the impact of the Privacy Shield collapse to the latest litigation under California's groundbreaking consumer privacy protection law.  I was privileged to appear in a Boston Bar Association program on privacy class action litigation, led by attorney Melanie A. Conroy, CIPP/US, of Pierce Atwood LLP, alongside practicing-attorney panelists Matthew M.K. Stein, of Manatt, Phelps & Phillips, LLP, and Marjan Hajibandeh of CarGurus, Inc. 

Our topical reach was a breathless sprint across a dramatic landscape.  We opened with our respective thoughts on developments in privacy law, Conroy observing that the fast-paced field has undergone seismic shifts again and again in recent years, from the implementation of the California Consumer Privacy Act (CCPA) to the $18m Equifax data breach settlement in Massachusetts.

I spoke to the impact of the European Court of Justice decision ("Schrems II" (ECJ July 16, 2020)) invalidating the U.S.-EU Privacy Shield as a motivator for U.S. reform.  Besides the significance of the case in Europe and our foreign relations, the decision signals that a quarter century after adoption of the first European Data Protection Directive, Europe's patience with American recalcitrance has finally run out.

Julie Brill (MS CC) and William Kovacic
Former Federal Trade Commissioner Julie Brill told the Senate Commerce Committee this week that in two years, 65% of the world will be living under data protection laws, most of them modeled after the EU General Data Protection Regulation (GDPR).  As former Federal Trade Commission (FTC) Chairman William Kovacic put it, if we don't pass legislation in the United States, "we will get a national privacy policy: the GDPR."  As I tweeted this week, hearing testimony drove the usually cool and collected Senator Maria Cantwell (D-Wash.) to exclaim, "My God, this is clear, we need a strong privacy law." And Americans are ready; Brill said that nine out of ten Americans now believe that privacy is a human right.

Sen. Cantwell
Our panel ran down the latest developments in class action privacy litigation, loosely divided on the fronts of biometric data class actions, mostly arising under Illinois's pioneering Biometric Information Privacy Act; CCPA-related class actions in California; and data breach litigation.  I surveyed cases in the latter vein and talked some about the present circuit split over Article III standing.  Federal courts have divided over whether "theft alone" can constitute concrete injury for constitutionally minimal standing, or whether plaintiffs must show some subsequent misuse of their data.  This issue is not limited to the data breach area, but has implications across a wide range of statutory enforcement systems, including the Fair Credit Reporting Act.

For my part, I predict that our dawning, if belated, understanding of the monetary value of personally identifiable information (PII) will lead us to the inevitable conclusion that theft alone suffices.  This is evidenced, for example, in Hogan v. NBCUniversal (D.R.I. filed Aug. 27, 2020), over the sale of Golf Channel subscriber identities, which subsequently were associated with other PII and resold.  Though for the time being, my favored conclusion is arguably not the inclination evidenced by the U.S. Supreme Court in Spokeo, Inc. v. Robins, in 2016.  Senator Dick Blumenthal (D-Conn.) mentioned this week, apropos of current events, that Justice Ginsburg, joined by Justice Sotomayor, dissented in Spokeo on just this point.

The late Justice Ginsburg; Sen. Blumenthal
Our next panel focus was developments in the First Circuit and Massachusetts.  In Massachusetts Superior Court in Boston, data breach litigation, filed in May 2019, against Massachusetts General Hospital, Brigham & Women's Hospital, and the Dana-Farber Cancer Institute, over online patient-service communications occurring outside secure portals, raises the very question of concrete harm, which may be resolved differently at the state level than under the federal Constitution.  Meanwhile in federal court, the same issue in data breach litigation, filed in March 2020, in Hartigan v. Macy's, highlights the lack of First Circuit precedent on the question since Spokeo, while citing strong pre-Spokeo indications that the First Circuit would favor the misuse-required position.

In parting observations, I offered that we have a long road ahead.  Of all the bills pending in Congress (see EPIC's excellent April report), only some propose a private cause of action, and none attacks the problem of government surveillance, both purported prerequisites to European restoration of authorized trans-Atlantic data flow.  Within the U.S. Congress, there appears to be bipartisan support for some kind of nationwide privacy legislation.  But the questions of private or FTC enforcement, and whether preemption would mean a legislative floor or a ceiling, remain sticking points that could derail the process.

Saturday, September 28, 2019

EU court rules for Google, narrows French 'right to be forgotten' order to Europe

In the latest battle of the feud between Google and the French data protection authority (CNIL), the Court of Justice of the European Union ruled that the CNIL's "right to be forgotten" order should be limited to internet users in Europe.  However, the court did not rule out the possibility of a worldwide order if the facts warrant.

The court wrote:

[T]he right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality....  Furthermore, the balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world. 

While the EU legislature has, in Article 17(3)(a) of Regulation 2016/679 [GDPR], struck a balance between that right and that freedom so far as the Union is concerned ... it must be found that, by contrast, it has not, to date, struck such a balance as regards the scope of a de-referencing outside the Union.

"Proportionality" is a core principle of EU human rights law when regulation collides with individual rights, or, as here, state power is implicated to favor one individual's rights over those of others.  The same principle also constrains supra-national authority over member states.

The case arose from a CNIL fine of Google.  The French authority had ordered Google to de-list search results to protect certain individuals' privacy under the "right to be forgotten," or "right to erasure," when those individuals were searched by name.  "De-listing" or "de-referencing" search results is the front line of right-to-erasure court challenges today, though the specter of erasure orders that reach content providers directly looms on the horizon.

Google complied with the CNIL order only for European domains, such as "google.fr" for France, and not across Google domains worldwide.  Google employs geo-blocking to prevent European users from subverting de-listing simply by searching at "google.com" (United States) or "google.com.br" (Brazil).  Determined users still can beat geo-blocking with sly technocraft, so the CNIL was dissatisfied with the efficacy of Google's solution.  Undoubtedly, a dispute will yet arise in which the CNIL or another European data protection authority tests its might with a more persuasive case for global de-listing.
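
For readers curious about the mechanics, here is a minimal, hypothetical Python sketch of how geo-blocking of de-listed results might work, and why a determined user on a VPN can beat it. The country inference, the de-listing registry, and all names are illustrative assumptions on my part, not a description of Google's actual system.

# Illustrative sketch: suppress de-listed URLs only for users who appear,
# by IP geolocation, to be located in Europe. A user routing traffic
# through a non-European exit node (e.g., via VPN) is not filtered.
EU_EEA = {"FR", "DE", "IT", "ES", "NL", "BE", "PL", "SE", "IE", "PT"}  # abridged

# Hypothetical registry: name queries mapped to URLs ordered de-listed
# for European users.
DELISTED = {
    "jane doe": {"https://example.com/old-story"},
}


def infer_country(ip_address: str) -> str:
    """Stub for IP geolocation; a real system would consult a geo-IP database."""
    return "FR" if ip_address.startswith("10.") else "US"


def filter_results(query: str, results: list[str], ip_address: str) -> list[str]:
    """Drop de-listed URLs only when the user appears to be in Europe."""
    if infer_country(ip_address) not in EU_EEA:
        return results
    blocked = DELISTED.get(query.lower(), set())
    return [url for url in results if url not in blocked]


if __name__ == "__main__":
    hits = ["https://example.com/old-story", "https://example.com/unrelated"]
    print(filter_results("Jane Doe", hits, "10.0.0.1"))     # filtered (EU user)
    print(filter_results("Jane Doe", hits, "203.0.113.5"))  # unfiltered (non-EU)

The design choice the sketch highlights is that the filter keys on where the searcher appears to be, not on which national domain the searcher typed, which is why Google's approach reaches "google.com" for European users yet still leaves a gap the CNIL found unsatisfying.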

The case is Google, LLC v. Commission Nationale de L’informatique et des Libertés (CNIL), No. C-507/17 (E.C.J.), Sept. 24, 2019.  Several free speech and digital rights NGOs intervened on behalf of Google, including Article 19, the Internet Freedom Foundation, the Reporters Committee for Freedom of the Press, and the Wikimedia Foundation, as well as Microsoft Corp.  The case arose initially under the 1995 EU Data Protection Directive, but carries over to the new regime of the General Data Protection Regulation (GDPR).

Monday, September 23, 2019

EU frets over Privacy Shield adequacy, and NGO insists, emperor still naked

The Commission of the European Union is reviewing the U.S.-EU Privacy Shield framework for conformity with the General Data Protection Regulation (GDPR), and NGO AccessNow is again demanding an inadequacy finding.

A lot is at stake.  For the uninitiated, European regulators have a dramatically different take on the protection of personal information than the free-wheeling free marketeers of the United States.  I've written some about the problem here and elsewhere (e.g., here and here), arguing that the American people are not so far from European privacy norms, but it's our law that lags behind the democratic will.  For my money, the definitive macro analysis of why American and European approaches to privacy have differed is James Q. Whitman's.  Anyway, the GDPR does not allow the export from Europe of information to countries that do not comport with its privacy protections, and that creates a monumental problem for the trans-Atlantic flow of not only information, but commerce.

The problem is not new and existed under the GDPR's predecessor law, the 1995 Data Protection Directive (DPD).  A number of mechanisms were devised to work around the problem, and they were approved by European regulators under the umbrella of "the Safe Harbor agreement."  But it's widely understood, at least on the European side, that Safe Harbor was something of a sham: No one with a straight face could argue that U.S. law was comparable to the DPD.  Safe Harbor in practice comprised mostly industry standards, voluntarily adopted and barely enforced by U.S. regulators.  There's also an important piece of this problem in the vein of national security, government spying, and personal information; I'm not even getting into that.

Privacy Shield is stronger than Safe Harbor, but the GDPR is a lot stronger than the DPD.  There have been remarkable advancements in privacy law in some states, notably California, in the EU direction.  And quite a number of court challenges have followed, winding their way through the process, some derived from objections in the commercial sphere, some the civil rights sphere: you've probably heard of "the right to be forgotten."  But our patchwork of state laboratories hardly adds up to reassurance for Europe.  So in the absence of a comprehensive peace offering at the federal level, the debate over the EU's adequacy determination regarding Privacy Shield pretty much boils down to whether or not we're going to admit that the emperor is naked.

AccessNow, a global NGO and sponsor of RightsCon, has consistently called for honesty about the emperor's sorry state.  A recent memo calls on the Commission to rule Privacy Shield inadequate, and AccessNow has invited republication of a new infographic in support of its position.  I hereby oblige. It's past time we get serious about protecting personal information in the United States and stop the commercial exploitation of human identity premised on industry's abusive invocations of civil rights such as the freedom of speech and the freedom to contract.

[UPDATE, 23 Oct. 2019, at 13:53 U.S. EDT: Privacy Shield still good, per EC report issued today.]

Tuesday, December 11, 2018