A hot topic

By Julian Webb and Judith Bennett


As automated legal advice tools become more complex, so do the issues regarding their regulation.

Snapshot:

  • After years focused on process automation, legal technology is rapidly developing a range of tools automating legal information and advice functions.
  • Automated legal advice potentially disrupts the legal services market by opening up areas of latent demand, and by increasing pressures both on established market incumbents and on regulation itself.
  • Key challenges to regulation arise in setting quality and explainability standards for platforms, technological competencies for the profession and in determining how automation may change patterns of consumer risk.

Automation, and particularly its use of artificial intelligence (AI), is a hot topic in the legal services industry. Automation is increasingly being used to support process efficiency and – deploying a broad range of programming techniques, from human coding to advanced machine learning – has great potential to support or even provide legal advice and information, both via lawyers and directly to consumers.

Using automated legal advice tools raises fundamental questions about the future of legal services, particularly how best to regulate these new technologies. However, the “framework which currently regulates legal services and lawyers is largely based on a model of legal practice that predates the internet”.1 Appropriate regulation will also be “likely to require a fine balancing act between the competing interests of consumers, the legal market, the legal profession and access to justice”.2 How will regulation cope in an age of machine learning and global hyper-connectivity?

A multi-disciplinary research project (RALAT) is currently exploring the role and governance of automated legal advice technologies (ALATs) in Australia. The research team is drawn from the Melbourne Law School, School of Computing and Information Systems, Department of Management and Marketing, and supported by the Networked Society Institute (NSI) at the University of Melbourne, together with the College of Law, ANU.3

The project’s objectives are to explore the emergence and development of ALATs, their impact and technological limits, and how they may transform legal practice in Australia, as well as the challenges they raise for the current regulatory landscape. This article draws on that work.

What are automated legal advice tools?

ALATs can be defined as tools that, via a range of techniques in the ecosystem of AI technologies, automate the giving of legal advice. ALATs draw on a broad range of programming techniques, including human coding, natural language processing and machine learning, to support or provide legal information and advice tailored to the circumstances of the individual.

By emphasising automated advice-giving, the definition includes tools that use legal analysis, legal reasoning and prediction functions to give legal advice on their own; to give advice supervised or reviewed by a lawyer; to assist or augment legal advice given by a lawyer; and to offer limited or partial legal advice by unbundling transactions into smaller discrete tasks.

Examples and categories

The scope and scale of ALATs is developing rapidly. While some expert legal systems were developed in the 1980s,4 the 2010s are seeing the emergence of a vigorous and qualitatively different “fourth wave” of intelligent automation research. As a result, most ALATs have entered the market since 2014.

Current tools span a wide spectrum of sophistication, from “simple” task-specific apps like “DoNotPay”,5 which assists users in challenging parking ticket penalties, through to more flexible automated assistant programs and adaptive AI-based tools such as Ailira’s tax advisor, which answers questions in natural language.6

How should ALATs be classified? One suggestion is to classify them both by function and by “intelligence” capability.

Functionally, five subsets of technology exist in the market:

  • specialised standalone technologies, such as legal chatbots, apps and virtual assistants
  • enablers of legal advice such as legal automated drafting, legal document review and legal algorithms
  • further enablers of legal advice such as legal data analytics and predictors, and legal artificial intelligence
  • automation of legal advice with truly smart contracts
  • sets of ALAT technologies and platforms enabling NewLaw business models and legal technology companies.

As to capability, technologies range from “simpler” tools relying on human-coded decision trees through to “smarter”, more sophisticated technologies that use deep learning and can parse text, learn correlations and causal patterns from data, and reason about these to make predictions.
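The “simpler” end of this spectrum can be illustrated with a minimal sketch. The fragment below is a hypothetical, hand-coded decision tree of the kind a task-specific ALAT (such as a parking-fine chatbot) might embody; the questions, grounds and wording are invented for illustration and are not drawn from any real product.

```python
# Illustrative only: a hypothetical human-coded decision tree of the kind a
# "simpler" ALAT (eg, a parking-fine chatbot) might use. The questions and
# grounds of challenge are invented for this sketch.

def parking_fine_advice(signage_visible: bool, permit_displayed: bool,
                        meter_faulty: bool) -> str:
    """Walk a fixed, human-authored decision tree and return guidance."""
    if not signage_visible:
        return "Possible ground of challenge: restriction not adequately signed."
    if meter_faulty:
        return "Possible ground of challenge: payment device was out of order."
    if permit_displayed:
        return "Possible ground of challenge: a valid permit was displayed."
    return "No obvious ground of challenge identified; consider paying the fine."
```

Because every branch is written by a human, the tool’s entire reasoning process is visible on the page – which is precisely what distinguishes this class of tool from the machine-learnt systems at the other end of the spectrum.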

An assessment based on product descriptions suggests that at present the majority of products are at the lower end of the “intelligence” scale, although it is difficult to assess technology from the “outside” and some products are commercial in confidence. A working typology is set out in the project’s discussion paper (Note 3 above).

Impact on the legal services industry

Objective measures of impact are hard to find. The project sees a segmentation of ALATs across the current legal services market, aligned with patterns of investment in new technologies. Currently the majority of ALATs operate in niche and specialised areas of law. Enabling ALATs appear more concentrated in the large law firm and in-house sectors, while the legal assistance sector has seen more low-tech apps and chatbots. The proprietary, commercial nature of some in-house ALATs is likely to keep investment costs high, limiting the potential for smaller firms to benefit. This may change as and when white label applications are developed.

Longer term, ALATs may be disruptive by enabling broader access to legal information, increasing efficiencies and reducing costs, so opening up latent markets that are currently uneconomic. ALATs may also create new areas and forms of competition challenging market incumbents. For example, US corporates like LegalZoom and RocketLawyer, offering low cost legal document and advice services for smaller business and consumer legal services, show how non-lawyer entities may seek to enter and disrupt traditional markets.

Regulation: more questions than answers

There is growing recognition that automation raises important questions about the regulation of legal services and associated ethics, yet there has been limited in-depth discussion.7

This article explores ways in which ALATs specifically challenge regulation, and how they may extend, or add to, the regulatory issues raised by technological innovation generally.

Legal advice, capacity and risk

A fundamental issue is how legal advice is defined in the Australian legal system. The giving of legal advice is “at or near the very centre of the practice of law”.8 Regulation aims to separate the giving of (unregulated) legal information from the giving of (regulated) legal advice, the rationale being public protection and the limiting of risk. Yet the boundary between legal advice and legal information is grey for regulators and innovators alike. In the context of emerging technologies such as ALATs, this raises questions about whether the distinction remains justifiable.

A related issue is the capacity to give legal advice. The Australian legal system aims to reserve legal advice work to the legal profession. The Legal Profession Uniform Law prohibits “unqualified entities” from engaging in legal practice (s10). However, it then defines “legal practice” non-exhaustively and in a manner that appears somewhat circular.9 To “engage in legal practice” includes to “practise law or provide legal services but does not include engage in policy work” (s6). In turn, legal services are defined as “work done, or business transacted, in the ordinary course of legal practice” (s6 note). In the context of emerging technologies such as ALATs, this raises a further question: how legitimate is it to continue restricting consumers to traditional service providers when ALATs can radically reduce the risks associated with “non-lawyer” advisers? For example, if ALATs can deliver routine and basic transactional legal tasks to a consistent quality every time, lawyers are freed up to think through problems and advise clients, with potential gains in both the quality of advice and the efficiency of its delivery.

Further, does automation itself require us to think more carefully about risk-based regulation, rather than the current “one size fits all” model? These are potentially game-changing issues for the market and the profession.

Quality and legal competence standards

There is growing recognition that automated tools may carry out routine data analysis and pattern-matching tasks with greater reliability than humans. At the other end of the scale, AI may also prove itself better at some forms of predictive analysis. For example, a recent study of predictions of the decision-making behaviour of the US Supreme Court found legal experts achieved 66 per cent accuracy compared with 70 per cent for computers.10 Do these findings raise the quality threshold of competence for legal advice? Could failing to use a given technology itself become a failure of professional responsibility?

Technological competence

There is a distinction between being comfortable with technology as a digital native and being conversant with how technology is to be used in legal advice. What should the duty to deliver legal advice competently now require? Does a lawyer who provides a legal service supported by an ALAT need to understand how that technology works and to what extent? For example, when using an AI algorithm, does a lawyer need to understand the workings of the algorithm and integrity of the data?

The US legal profession has recommended extending the duty of competence to incorporate some technological competence. In 2012, the comments to the American Bar Association’s Model Rules of Professional Conduct were amended to provide that a lawyer’s duty of competence includes staying up-to-date with changes in relevant technologies.11 At least 25 states have adopted this change, with many also mandating technology-specific learning in CPD.12 Should we be following suit?

A black box?

Another regulatory issue for legal advice is created by the “black box” nature of some ALATs. With “simpler” human-coded tools, it is more likely that lawyers can identify the “decision tree” reasoning process enabling the tool to provide an answer. In contrast, ALATs that use self-directed machine learning or complex neural networks may not be transparent to lawyers (or indeed anyone else) as to their assumptions,13 reasons or explanations for outcomes. Added to this, where the ALAT is a commercial service, owners may seek to protect their intellectual property by keeping the logic, source code and data confidential (eg, Compas Core, TrueAllele). Yet the ability to give clear reasons is seen as critical to sophisticated advice-giving by human lawyers. By comparison, decisions made by opaque algorithms are “analogous to evidence offered by an anonymous expert, whom one cannot cross-examine”.14

As automation becomes smarter and more intelligent with increasing use of machine learning on big legal data, the issues become more complex. How can we trace the logic? What legal data has been considered, what was treated as relevant, and how was it sourced? How did learning occur, and were there any biases? And further, deeper issues arise. What values lie within the logic? Why was that data chosen? What conscious or unconscious assumptions have been made that are not made explicit?

Many AI researchers are seeking to find technological solutions to the “explainability” problem.15 However, is it enough to leave the question of transparency to producers and designers? We suggest lawyers and regulators should consider the setting of explainability standards. And if we do need explainability standards, what should these look like and how best do we keep these relevant and principled?
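What an explainability standard might require can be sketched in code. The hypothetical advisor below returns not only a conclusion but a trace of every rule it applied, of the kind a lawyer or regulator could audit. The rules and fact labels are invented for illustration; the point of the “black box” problem is that machine-learnt systems cannot generally produce such a trace.

```python
# Illustrative only: a hypothetical transparent advisor that returns its
# conclusion together with the reasoning trail that produced it. The rules
# and fact labels below are invented for this sketch.

def advise_with_trace(facts: dict) -> tuple:
    """Return (conclusion, trace), where trace records each rule applied."""
    trace = []
    if facts.get("contract_in_writing"):
        trace.append("Rule 1 satisfied: the contract is in writing.")
        if facts.get("signed_by_both_parties"):
            trace.append("Rule 2 satisfied: signed by both parties.")
            return "Formal requirements appear to be met.", trace
        trace.append("Rule 2 not satisfied: signature(s) missing.")
        return "Formal requirements may not be met.", trace
    trace.append("Rule 1 not satisfied: the contract is not in writing.")
    return "Further advice needed on whether writing is required.", trace
```

Because every step is recorded, the “reasons” for the outcome can be inspected after the fact – the property that opaque, machine-learnt ALATs currently lack, and that any explainability standard would need to approximate.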

Next steps

The RALAT project will engage with a range of stakeholders: members of the legal profession, regulators, ALAT developers and producers, policy makers and members of the public, including access to justice groups. Some questions for discussion include:

  • to what extent (if at all) is the legal information/advice distinction a barrier to legal services innovation?
  • conversely, is there a case for bringing legal information substantively into legal services regulation? If yes, how might that best be done?
  • what additional challenges to quality of advice are created by ALAT technologies? Are specific new mechanisms or approaches necessary or adequate to regulate the quality of automated legal advice?
  • should the duty of professional competence be expressly extended to include an obligation to stay up-to-date with relevant technologies? Are there other professional obligations that may need to be revised in the context of increased automation of legal advice?
  • how is automation of legal advice affecting pricing and billing practices in the profession? How might it affect them in the next five years?
  • should explainability standards be developed for ALATs using “black box” automated intelligence? Why/why not?
  • are there critical areas of automation of legal advice that create risks for consumers? One example might be the separation between service provider liabilities under the general law (tort, Australian Consumer Law etc) and professional responsibility under the Uniform Law and equivalent legal profession acts.
  • how should or could regulation overcome the problem that online services may be delivered from outside the regulator’s physical jurisdiction?

Call for input

As an important conversation for the future of the legal profession, we welcome your input. How would you answer our questions? Are there significant issues that we have missed? What questions would you ask?

If you would like to take part in our research, please contact the research team at law-lprn@unimelb.edu.au.

 

Professor Julian Webb is director of the Legal Professions Research Network at Melbourne Law School and a member of the Melbourne NSI. He is project lead on the RALAT project.

Judith Bennett is a manager, lawyer and director of www.business4group.com consulting, providing coaching and business advice to lawyers and law firms. She is undertaking a PhD at the University of Melbourne exploring technology and the business of the legal profession. She is a researcher with the RALAT project.

 

1. Law Institute Journal, “Disruption Innovation and Change: The Future of the Legal Profession”, December 2015 p35 www.liv.asn.au/Flipbooks/Disruption--Innovation-and-Change--The-Future-of-t.aspx.

2. Wallace, C, “Competition, Growth and Consumer Outcomes: Challenges for Policy” in Westminster Legal Policy Forum, 2017, Retrieved from www.westminsterforumprojects.co.uk/publications/westminster_legal_policy_forum.

3. See Bennett, J et al, "Current State of Automated Legal Advice Tools", Discussion Paper, NSI, University of Melbourne, April 2018, 69pp, <https://networkedsociety.unimelb.edu.au/__data/assets/pdf_file/0020/2761013/2018-NSI-CurrentStateofALAT.pdf>.

4. Susskind, R, Expert Systems in Law: A Jurisprudential Inquiry, Oxford University Press, 1987.

5. www.donotpay.com.

6. www.ailira.com.

7. See, eg, Note 1 above, pp34-35; Law Society of New South Wales, "Report of the Commission of Inquiry into the Future of Law and Innovation in the Profession", The Law Society of NSW, 2017, p112 <http://lawsociety.com.au/ForSolictors/Education/ThoughtLeadership/flip/index.htm>.

8. Cornall v Nagle [1995] 2 VR 188.

9. Beames, E, “Technology-based legal document generation services and the regulation of legal practice in Australia”, Alternative Law Journal, 2017, 42(4), pp297-303, doi:10.1177/1037969X17732709 at p298.

10. Katz, D, Bommarito II, M & Blackman, J, “A general approach for predicting the behaviour of the Supreme Court of the United States”, PLoS ONE, 2017, 12(4), e0174698, journals.plos.org/plosone/article?id=10.1371/journal.pone.0174698.

11. ABA, www.americanbar.org/groups/professional_responsibility/publications/model_rules_of_professional_conduct/rule_1_1_competence/comment_on_rule_1_1.html.

12. Note 6 above, at 41.

13. See eg, Spielkamp, M. “Inspecting Algorithms for Bias”, July/August 2017, MIT Technology Review (online), www.technologyreview.com/s/607955/inspecting-algorithms-for-bias/.

14. Brooks, M, “Artificial Ignorance”, New Scientist, 7 October 2017, pp28-33, (quoting Pasquale).

15. See, eg, Doran D, Schulz, S, & Besold, T, “What Does Explainable AI Really Mean? A New Conceptualization of Perspectives”, 2 October 2017, arXiv:1710.00794; Miller, T, Howe P & Sonenberg, L, “Explainable AI: Beware of Inmates Running the Asylum Or: How I Learnt to Stop Worrying and Love the Social and Behavioural Sciences”, 5 December 2017, arXiv: 1712.00547.


