
Monday, September 13, 2021

'Don't panic,' lawyers say, as Oz High Court clears way for website liability over defamatory user comments


The High Court of Australia last week greenlit defamation claims against website operators for user comments, the latest evidence of the crumbling global immunity doctrine represented in the United States by the ever more controversial section 230.

There is plenty of news online about the Aussie case, and I did not intend to comment.  For the academically inclined, social media regulation was the spotlight topic of the premiere issue of the Journal of Free Speech Law.

Yet I thought it worthwhile to share commentary from Clayton Utz, in which lawyers Douglas Bishop, Ian Bloemendal, and Kym Fraser evinced a mercifully less alarmist tone when they wrote, "don't panic just yet."

The Australian apex court extended the well-known and usual rule of common law defamation, when not statutorily suspended: that the tale bearer is as responsible as the tale maker.  In the tech context, in other words, "[b]y 'facilitating, encouraging and thereby assisting the posting of comments' by the public," the defendants, notwithstanding their actual knowledge or lack thereof, "became the publishers," Bishop, Bloemendal, and Fraser wrote.

But it's a touch more complicated than purely strict liability.  "What is relevant is an intentional participation in the process by which a posted comment may become available to be accessed by other Facebook users," Bishop, et al., opined.  "So does that mean you should take down your corporate social media pages? That would be an over-reaction to this decision."

The lawyers emphasized that this appeal was interlocutory.  On remand in New South Wales, the media defendants may assert defenses, including innocent dissemination, justification, and truth.  Bishop, et al., advise:

In the meantime, if your organisation maintains a social media page which allows comments on your posts, you should review your monitoring of third-party comments and the training of your social media team in flagging and (if necessary) escalating problems to ensure you can have respectful, non-defamatory conversation with stakeholders.

Funny they should say so.  Coincidentally, I gave "feedback" to Google Blogger just Friday that a new option should be added for comment moderation, something like "archive," or "decline to publish for now."  The only options Google offers are spam, trash, and publish.

I have two comments posted to this blog in recent years that I hold in "Awaiting Moderation" purgatory, because they fit none of my three options.  Every time I go to comment moderation, I have to see these two at the top.  The comments express possible defamation: allegations of criminality or otherwise ill character about third parties referenced on the blog.  I don't want to republish these comments, because I do not know whether they are true.  But I don't want to trash them, because they are not necessarily valueless.  Moreover, they might later be evidence in someone else's defamation suit.

I moderate comments for this blog, so I don't think it's too much to ask the same of anyone else who publishes comments, whether individual, small business, or the transnational information empires that peer over my shoulder.  

I do worry, though, about how that works out for the democratizing potential of the internet.  I'm trained to recognize potentially defamatory or privacy-invasive content; I've done it for a living.  Are we prepared to punish the blogger who contributes valuably to the information sphere, but lacks the professional training to catch a legal nuance?  Or to pay the democratic price of disallowing dialog on that writer's blog?  As a rule, ignorance of the law is no excuse, in defamation law no less than in any other area.  But understanding media torts asks a lot more of the average netizen than knowing not to jaywalk.

I don't profess answers, at least not today.  But I can tell that the sentiment of my law students, especially those a generation or more younger than I, is an unreserved willingness to hold corporations strictly liable for injurious speech on their platforms.  So if I were counsel to Google or Facebook, I would be planning for a radically changed legal future.