Regulating legal technologies

By Karin Derkley

The rise of automated legal advice technologies (ALATs) is generating a slew of questions as to whether and how they can be regulated, a Melbourne Law School seminar was told last week.

The ‘Current State of Automated Legal Advice Tools’ discussion paper is the first publication from the Regulating Automated Legal Advice Technologies project, an initiative of the Melbourne Networked Society Institute.

Legal technologies could help address the growing access to justice gap, especially for the missing middle that has historically been unviable to serve, said one of the paper’s authors, the director of Melbourne Law School's Legal Professions Research Network, Professor Julian Webb.

But they also raised the question of where to draw the line between what was legal information and what needed to be regulated as legal advice.

"The line between the two is not a bright one,” Professor Webb said. "So that raises the question as to whether legal information could be brought into the regulatory domain." The downside of that proposal, however, was that it could compromise the right of the public to access primary legal information unfettered by regulation, he added.

Then there was the question of what constituted good enough quality when it came to providing legal advice. One study found that a computer was able to predict outcomes in the Supreme Court of the United States with 70 per cent accuracy, compared to 66 per cent for legal experts.

"If the risks are all positive, why do ALATs need to be used within a regulatory legal practice?” Professor Webb asked. “If the technology is appropriately designed and is representing legal outcomes appropriately, do we even need a lawyer?" 

However, the notion of converting law into a binary right/wrong model is not unproblematic, he pointed out. "There is the question of who determines and ensures accuracy. In a way there is perhaps a subtle handover of control and of interpretation and construction of law from lawyers to programmers.”

The problem is that legal technologies that use artificial intelligence and deep learning are not transparent about the reasoning behind their decisions, he said. "So when you're looking at artificial intelligence being used in bail applications, for instance, it might be replicating things we don't want replicated, such as patterns of discriminatory decision-making."

Responding to the discussion paper, Victoria Legal Aid executive director of legal practice Katie Miller said the document helped “corral and classify” what had become a very busy area of the law in Australia in recent years.

In 2015, as president of the LIV, Ms Miller wrote a paper on the future of the legal profession: ‘Disruption, innovation and change’. Much had changed since then, she said. “Back then there was not a lot happening (in terms of legal technologies) in Australia, and I had to go to America. Three years after my visit to the US, so much is happening here that it is difficult to get a grasp on it.”

She agreed that the rise of ALATs raised “so many questions re regulation and education”.

One fear was that “we might have a divergence where those on the technology side go off on their own path assuming they're not providing legal advice, and lawyers sit there too scared to do anything and we lose the benefits of both.”

“What we should be aspiring to is an integration between law and technology to get the strength of both,” she said.

There was also the concern that technology-aided legal decision-making could reverse the efforts of recent years to create a more transparent legal system, she said.

"We've never had 100 per cent transparency in law, and we've come to recognise that what we think of as impartial and fair and equal is actually shaped by societal pressures and discrimination.

"There is a fear that with technological tools that we could lose the momentum we have gained and that we will go right back to the beginning. These are difficult questions and we are going to need to go back to first principles."

The Regulating Automated Legal Advice Technologies project is seeking input from members of the profession, regulators, ALAT developers and producers, and access to justice groups as it moves into its next phase.

You can read the discussion paper here.

