Tuesday, February 6, 2024

AI can make law better and more accessible; it won't

Gencraft AI image
Artificial intelligence is changing the legal profession, and the supply of legal services is growing even more disconnected from demand.

The latter proposition is my assessment, but experts agreed at a national bar conference last week that AI will change the face of legal practice for attorneys and clients, as well as law students and professors.

Lexis and Westlaw each recently launched a generative AI product: Lexis+ AI Legal Assistant and AI-Assisted Research on Westlaw Precision, respectively. One might fairly expect that these tools will make legal work faster and more efficient, which in turn would make legal services accessible to more people. I fear the opposite will happen.

The endangered first-year associate. The problem boils down to the elimination of entry-level jobs in legal practice. Panelists at The Next Generation and the Future of Business Litigation conference of the Tort Trial and Insurance Practice Section (TIPS) of the American Bar Association (ABA), held at the ABA Midyear Meeting in Louisville, Kentucky, last week, told audience members that AI now performs the work of first- and second-year associates.

The change might or might not be revolutionary. Popular wisdom routinely describes generative AI as a turning point on the evolutionary scale. But panelists pointed out that legal research has weathered sea changes before, and the sky did not fall. Indeed, doomsayers once predicted the end of responsible legal practice upon the very advent of Lexis and Westlaw, when they displaced books and paper, a transformation contemporary with my career. Law practice adapted, if not for the better in every respect.

It's in the work of junior attorneys that AI is having the greatest impact now. It can do the background legal research that a senior lawyer might assign to a junior lawyer upon acquisition of a new client or case. AI also can do the grunt work on which new lawyers cut their teeth, such as pleadings, motions, and discovery.

According to (aptly named) Oregon attorney Justice J. Brooks, lawyers are under huge pressure from clients and insurers to use AI, regardless of the opportunity cost in bringing up new attorneys. Fortune 500 companies are demanding that AI be part of a lawyer's services as a condition of retention. The corporate client will not pay for the five hours it takes an associate to draft discovery requests when AI can do it in an hour and a half.

Observers of law and technology, as well as the courts, have wrung their hands recently amid high-profile reports of AI-using lawyers behaving badly, for example, filing briefs citing sources that do not exist. Brooks said that a lawyer must review with a "critical eye" the research memorandum that AI produces. Insofar as there have been ethical lapses, "we've always had the problem of lawyers not reading cases," Illinois lawyer Jayne R. Reardon observed.

Faster and cheaper, but not always better, AI. There's the rub for newly minted associates: senior lawyers must bring the same scrutiny to bear on AI work that they bring to the toddling memo of the first-year associate. And AI works faster and cheaper.

Meanwhile, AI performs some mundane tasks better than a human lawyer. More than cutting corners, AI sometimes sees a new angle for interrogatories in discovery, Brooks said. Sometimes AI comes up with an inventive compromise for a problem in mediation, Kentucky attorney Stephen Embry said. AI can analyze dialogue to trace points of agreement and disagreement in negotiation, Illinois lawyer Svetlana Gitman reported.

AI does a quick and superb job on the odd request for boilerplate, North Carolina attorney Victoria Alvarez said. For example, "I need a North Carolina contract venue clause." And AI can quickly organize large data sets, she said, generating spreadsheets, tables, and graphics.

What AI cannot yet do well is good news for the job security of senior lawyers and professors such as me: AI cannot make complex arguments, Brooks said. In fact, he likes to receive AI-drafted memoranda from legal opponents. They're easily recognizable, he said, and it's easy to pick apart their arguments, which are on par with the sophistication of a college freshman.

Similarly, Brooks said, AI is especially bad at working out solutions to problems in unsettled areas of law. It is confused when its training materials—all of the law and most of the commentary on it—point in different directions. 

In a way, AI is hampered by its own sweeping knowledge. It has so much information that it cannot readily discern what is important and what is not. A lawyer might readily understand, for example, that a trending theory in Ninth Circuit jurisprudence is the peculiar result of concurring philosophical leanings among involved judges and likely will be rejected when the issue arises in the Fifth Circuit, where philosophical leanings tend to the contrary. AI doesn't see that. That's where human insight still marks a peculiar distinction—for now, at least, and until I retire, I hope.

It's that lack of discernment that has caused AI to make up sources, Brandeis Law Professor Susan Tanner said. AI wants to please its user, Oregon lawyer Laura Caldera Loera explained. So if a lawyer queries AI, "Give me a case that says X," AI does what was asked. The questioner presumes the case exists, and the AI follows that lead. If it can't find the case, it extrapolates from known sources. And weirdly, as Tanner explained it, "[AI] wants to convince you that it's right" and is good at doing so.

Client confidences. The panelists discussed other issues with AI in legal practice, such as the importance of protecting client confidences. Information fed into an open AI system in the course of asking a question can become part of the AI's knowledge base. A careless lawyer might reveal confidential information that the AI later discloses in response to someone else's different query.

Some law firms and commercial services are using closed AIs to manage the confidentiality problem. For example, a firm might train a closed AI system on an internal bank of previously drafted transactional documents. Lexis and Westlaw AIs are trained similarly on the full data sets of those proprietary databases, but not, like ChatGPT, on the open internet—Pornhub included, clinical psychologist Dan Jolivet said.

But any limited or closed AI system is then limited correspondingly in its ability to formulate responses. And closed systems still might compromise confidentiality around ethical walls within a firm. Tanner said that a questioner cannot instruct AI simply to disregard some information; such an instruction is fundamentally contrary to how generative AI works.
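To make the "closed" idea concrete, here is a toy sketch of my own, not any vendor's actual architecture: the document names and the crude keyword scoring are hypothetical, and a real product would layer a language model on top of the retrieval step to draft from what it finds. The point is only that the question is answered from the firm's own document bank and never leaves it.

    # A toy "closed" retrieval sketch (hypothetical, simplified): answers come
    # only from the firm's internal bank, never from the open internet.
    import string
    from collections import Counter

    # Hypothetical internal bank of previously drafted transactional documents.
    INTERNAL_BANK = {
        "nc_venue_clause.txt": (
            "Any action arising under this Agreement shall be brought in the "
            "state or federal courts sitting in Mecklenburg County, North Carolina."
        ),
        "confidentiality_clause.txt": (
            "Each party shall hold the other party's Confidential Information "
            "in strict confidence."
        ),
    }

    def tokenize(text):
        """Lowercase, split, and strip punctuation for naive matching."""
        return Counter(word.strip(string.punctuation) for word in text.lower().split())

    def retrieve(query, bank):
        """Rank internal documents by keyword overlap with the query."""
        query_terms = tokenize(query)
        scored = [(name, sum((query_terms & tokenize(text)).values()))
                  for name, text in bank.items()]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)

    # A lawyer's request is answered only from the closed corpus; in a real
    # system a language model would then draft from the retrieved documents.
    for name, score in retrieve("North Carolina contract venue clause", INTERNAL_BANK):
        print(f"{name}: overlap score {score}")

The narrower the bank, the less the system can say, which is exactly the trade-off the panelists described.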

Law schools in the lurch. Every panelist who addressed the problem of employment and training for new lawyers insisted that the profession must take responsibility for the gap that AI will create at the entry level. Brooks said he pushes back, if sometimes futilely, on client demands to eliminate people from the service chain. Some panelists echoed the Tantalean promise of billing models that will replace the billable hour. But no one could map a path forward that offers law firms anything other than idealistic incentives to hire and train new lawyers.

And that's a merry-go-round I've been on for decades. For the entirety of my academic career, the bar has bemoaned the lack of "practice ready" lawyers. And where have practitioners placed blame? Not on their bottom-line-driven, profit-making business models, but on law schools and law professors.

And law schools, under the yoke of ABA accreditation, have yielded. The law curriculum today is loaded with practice course requirements, bar prep requirements, field placement requirements, and pro bono requirements. We have as well, of course, dedicated faculty and administrative positions to meet these needs.

That's not bad in and of itself, of course. The problem arises, though, in that the curriculum and staffing are zero-sum games. When law students load up on practice-oriented hours, they're not doing things that law students used to do. When finite employment lines are dedicated to practice roles, other kinds of teachers who used to be there are absent.

No one pauses to ask what we're missing.

My friend and mentor Professor Andrew McClurg, retired from the University of Memphis, famously told students that they should make the most of law school, because for most of them, it would be the last time in their careers that they would be able to think about the law.

Take the elective in the thing that stimulates your mind, McClurg advised students (and I have followed suit as an academic adviser). Explore law with a not-nuts-and-bolts seminar, such as law and literature or international human rights. Embrace the theory and philosophy of law—even in, say, your 1L torts class.

When, as my wife once was, you're a legal services attorney struggling to pay down your educational debt and keep a home and a family, while trying to maintain some semblance of professional responsibility in managing an impossible load of 70 cases, with clients pulling you 24/7 in every direction, you're not going to have the luxury of thinking about the law.

Profit machines. What I learned from law's last great leap forward was that the "profession" will not take responsibility for training new lawyers. Lawyer salaries at the top will reach ever more for the heavens, while those same lawyers demand ever more of legal education, and of vastly less well-compensated legal educators: that they transform and give of themselves to become more trade school and less graduate education.

Tanner put words to what the powers-that-be in practice want law schools to do with law students today: "Train them so that they're profitable." In other words, make billing machines, not professionals.

Insofar as that has already happened, the result has been a widening, not a narrowing, of the gap between supply and demand for legal services. Wealthy persons and corporations have the resources to secure bespoke legal services. They always will. In an AI world, bespoke legal services mean humans capable of discernment and complex argument, "critical eyes."

Ordinary people have ever less access to legal services. What law schools have to do is expensive, and debt-burdened students cannot afford to work for what ordinary people are able to pay.

A lack of in-practice training, and a failure to inculcate the law as a historic profession rather than a workaday trade, will mean more lawyers who are minimally, but no more than minimally, competent; lawyers who can fill out forms but not conceive new theories; lawyers who have been trained on simulations and pro bono hours but were never taught, or afforded an opportunity, to think about the law.

These new generations of lawyers will lack discernment. They will not be able to make complex arguments or to pioneer understanding in unsettled areas of law. They will be little different from, and no more capable than, the AIs that clients pay them to access, little better than the human equivalent of a Staples legal form pack.

These lawyers will be hopelessly outmatched by their bespoke brethren. The ordinary person's lawyer will be employed only because the economically protectionist bar will forbid direct lay access to AI for legal services.

The bar will comprise two tribes: a sparsely populated sect of elite lawyer-professionals, and a mass of lawyer-tradespeople, churned out by the factory drums of legal education, cranking out form wills and contracts to keep the rabble at bay.

The haves and the have-nots.

It's a brave new world, and there is nothing new under the sun.

The first ABA TIPS panel comprised Victoria Alvarez, Troutman Pepper, Charlotte, N.C., moderator; Laura Caldera Loera and Amanda Bryan, Bullivant Houser Bailey, Portland, Ore.; Professor Susan Tanner, Louis D. Brandeis School of Law, Louisville, Ky.; and Justice J. Brooks, Foster Garvey, Portland, Ore. The second ABA TIPS panel referenced here comprised Svetlana Gitman, American Arbitration Association-International Center for Dispute Resolution, Chicago, Ill., moderator; Stephen Embry, EmbryLaw LLC and TechLaw Crossroads, Louisville, Ky.; Reginald A. Holmes, arbitrator, mediator, tech entrepreneur, and engineer, Los Angeles, Cal.; and Jayne R. Reardon, Fisher Broyles, Chicago, Ill.
