Welcome

Welcome to the CRL Blog of the Young Bar of Montreal (Jeune Barreau de Montréal, JBM)! This blog is dedicated to publishing legal content for all Quebec lawyers, and more specifically for members of the JBM. The content is provided by a community of volunteer lawyers involved in the JBM's Research and Legislation Committee. If you would like to become an author or contribute to the success of this blog, let us know of your interest.

Wednesday, March 13, 2019

AI-powered Legal Tech in a Human World: Should Justice be Served by a Robot?

By Anastasia Greenberg, Student, McGill University

With Katarina Daniels, Lawyer, Liaison Librarian at Nahum Gelber Law Library, McGill University
Imagine a world where robots are gainfully employed in the legal field. Traditionally, much legal associate work involves monotonous tasks such as sifting through hundreds of pages of documents to build a case, searching through thousands of results in a legal database, and drafting contract after contract in repetitive formats. While not exactly physical “robots”, artificial intelligence (AI) in the legal field promises to change all that by training computers to perform routine legal tasks with greater precision than human lawyers. One type of AI, machine learning, is the method behind Google’s search algorithms, driverless cars, and DeepMind’s AlphaGo, which beat the world champion at the game of Go. Now, AI has entered the legal field with great force. In 2018, investment in legal tech reached over $1.6 billion, representing a whopping 713% increase from the previous year. As of 2019, AngelList lists 948 legal tech startups. The United States’ online legal tech industry is now estimated at over $4 billion.


What Can AI do for Legal Practice?

AI legal tech tools can make legal practice more efficient and more accurate. ROSS Intelligence uses AI to interpret legal questions asked by the user in natural language and provides relevant search results. Where the platform is convinced that the result correctly answers the user’s query, it will label it a “deep match”. Similarly, Casetext leverages AI by allowing a user to upload a document that provides context for a search query, such as a fact pattern.

Other legal tech companies are using AI to go far beyond routine legal tasks, providing analytics that arguably exceed human capability. Lex Machina by LexisNexis applies natural language processing (a type of machine learning) to millions of legal decisions and provides critical insights such as which judges are more likely to favour plaintiffs, and what arguments are more likely to appeal to specific judges. Premonition goes even further by predicting outcomes of cases before trial, boldly stating: “Which attorneys win before which judges? Premonition knows”. The increasing sophistication of legal tech tools means that AI can not only do legal work faster, but perhaps better than biological lawyers.
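To make the underlying idea concrete — without claiming anything about how Lex Machina or Premonition actually work — outcome prediction from decision text can be sketched as a simple bag-of-words classifier. Everything below (the training "decisions" and their labels) is invented for illustration; a naive Bayes model stands in for the far more sophisticated natural language processing these vendors use.

```python
from collections import Counter
import math

def train(docs):
    """docs: list of (text, label). Returns per-label word counts and doc totals."""
    counts = {}          # label -> Counter of words seen in that label's docs
    totals = Counter()   # label -> number of training docs
    for text, label in docs:
        counts.setdefault(label, Counter()).update(text.lower().split())
        totals[label] += 1
    return counts, totals

def predict(text, counts, totals):
    """Naive Bayes with add-one smoothing over the training vocabulary."""
    vocab = {w for c in counts.values() for w in c}
    best, best_score = None, float("-inf")
    for label, c in counts.items():
        n = sum(c.values())
        score = math.log(totals[label] / sum(totals.values()))  # log prior
        for w in text.lower().split():
            score += math.log((c[w] + 1) / (n + len(vocab)))    # smoothed likelihood
        if score > best_score:
            best, best_score = label, score
    return best

# Invented toy "decisions": which way a hypothetical judge ruled.
training = [
    ("motion to dismiss granted for defendant", "defendant"),
    ("summary judgment granted for defendant", "defendant"),
    ("damages awarded plaintiff prevails", "plaintiff"),
    ("jury finds for plaintiff awards damages", "plaintiff"),
]
counts, totals = train(training)
print(predict("plaintiff awarded damages", counts, totals))  # → plaintiff
```

Real systems train on millions of decisions and far richer features, but the principle is the same: patterns in past rulings become probabilities about future ones.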

Good Robot, Bad Robot

Are these tools all they are cracked up to be? One study showed that compared to manual review of discovery documents, technology-assisted document review was not only significantly faster but also more accurate. However, AI-powered legal tech is far from perfect. Accurate results require training computers on high-quality, comprehensive data. The records of legal disputes that settle out of court are proprietary and inaccessible to AI developers. Likewise, case decisions that are publicly available are published in a variety of formats, making it challenging to standardize them for training AI algorithms.

Issues with improper AI development can have serious consequences. COMPAS is used by judges across several US states to support sentencing and bail decisions. The algorithm takes into account individual factors such as gender and criminal history, and makes predictions about future recidivism. It turns out that black defendants are twice as likely to be misclassified as high-risk for reoffending compared to white defendants – a faulty outcome known as “algorithmic bias”. AI development is designed to be an iterative process: troubling outcomes must be taken into account to continuously improve the technology.
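The “twice as likely to be misclassified” finding is, concretely, a comparison of false positive rates across groups: among people who did *not* reoffend, what share was wrongly flagged as high-risk? The sketch below uses invented numbers chosen only to mirror that qualitative 2:1 disparity — these are not the actual COMPAS figures.

```python
def false_positive_rate(records, group):
    """Share of non-reoffenders in `group` who were wrongly flagged high-risk.
    records: dicts with keys 'group', 'predicted_high_risk', 'reoffended'."""
    negatives = [r for r in records if r["group"] == group and not r["reoffended"]]
    flagged = [r for r in negatives if r["predicted_high_risk"]]
    return len(flagged) / len(negatives)

# Invented data: 100 non-reoffenders per group, flagged at different rates.
records = (
    [{"group": "A", "predicted_high_risk": True,  "reoffended": False}] * 44
  + [{"group": "A", "predicted_high_risk": False, "reoffended": False}] * 56
  + [{"group": "B", "predicted_high_risk": True,  "reoffended": False}] * 22
  + [{"group": "B", "predicted_high_risk": False, "reoffended": False}] * 78
)
print(false_positive_rate(records, "A"))  # → 0.44
print(false_positive_rate(records, "B"))  # → 0.22
```

A model can look accurate overall while its errors fall twice as heavily on one group — which is why auditing error rates per group, not just aggregate accuracy, matters for tools that inform sentencing.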

Is AI Stealing Legal Jobs?

“Our goal for the next couple of years is to automate the entire legal world,” says a spokesperson for LawGeex, a contract review platform. Lawyers are terrified of such a dystopia. Some evidence suggests that law firms are either largely uninterested in implementing legal tech or engage with it only superficially to please clients. But the tide may be shifting. The Florida Bar was the first to mandate technology training as part of its continuing legal education requirements. Will lawyers eventually lose their jobs to powerful robots, turning LawGeex’s hopes into reality?

Instead of substituting for legal jobs, AI and lawyers can make a dream team. Take an example from the health care space. A recent study showed that when AI and radiologists worked together to detect metastatic cancer in lymph node biopsies, detection accuracy reached 99.5%, higher than that of either the physician or the AI working alone. Richard Mabey, co-founder of Juro, a contract management platform, says: “The best lawyers have nothing to fear from technology. Ultimately if we tell smart lawyers we’re going to take work away from them, mostly they’re delighted.”

Improving Access to Justice

Legal tech promises to increase access to justice by allowing members of the public to navigate complex legal issues at lower price points than traditional legal advice. It has been estimated that 80% of the United States population lacks adequate access to legal services. In Canada, between 30% and 80% of defendants are self-represented, depending on the province and the legal matter. AI-based legal tech platforms can partially remedy access to justice issues by providing user-friendly and affordable digital legal services. A young London-based entrepreneur, Joshua Browder, developed a free chatbot called DoNotPay that helps people challenge traffic tickets, make claims against Equifax, and seek recourse from landlords for contract violations. Students at the University of Cambridge created a chatbot that helps crime victims weigh their legal options.

Should the Legal Profession Jump on the AI Bandwagon?

There are clear risks associated with legal tech and AI, such as algorithmic bias, inaccurate results, and outcomes that fail to capture the nuances of critical legal thinking. Taking risks can also come with rewards. Legal tech can provide more efficient legal services at lower costs and more accurate results free of human error, thereby increasing the quality of legal services, and ultimately, improving access to true justice. All of this can be achieved through a fruitful partnership between lawyers and legal tech. Let’s create more jobs for robots.

Interested in AI and its impact on the legal profession? Be sure to attend this year’s Legal.IT conference! Hosted by the Young Bar of Montreal, this annual event is the largest conference in Canada to focus on the impact of IT and its potential in the field of law. Click here for details and to register now.

-------------------
CTI columns are written by one or more members of the Information Technology Committee (Comité Technologies de l'information, CTI) to spark discussion and reflection within the legal community about new technologies and the law. The authors alone are responsible for the content of their articles, and the opinions expressed are those of the authors, not of the JBM. If you would like to write a column, send an email to cti@ajbm.qc.ca.

