
Practice management: Technology – it’s about people and processes

By Roshan Kumaragamage

It is worth examining the often-forgotten factors that affect any technology implementation, big or small, in a legal practice.

  • This article examines how technology can be introduced into legal practice in a way that mitigates the risk.
  • It considers the factors most often forgotten in technology implementation – people, processes, indirect costs and commercial models.
  • It also offers practical guidance for implementing technology in a firm.

It’s a commonly held belief that introducing technology will immediately make your business more efficient. This is perceived as especially true for legal businesses, which are often considered slow to adapt and ripe for change. The rise of AI has only accelerated the perception that legal businesses need technology-driven solutions to remain competitive. 

While all this may be partly true, simply acquiring technology won’t, on its own, solve a business or legal problem. If implemented poorly, new technology may make existing inefficiencies worse. In the most unfortunate cases, implementing technology poorly may have significant negative impacts on staff, clients or the community. 

Is it a technology problem? 

Technology should be considered an enabler, rather than the solution to business challenges. The point of failure for many technology projects is right at the start, when a team sets out to simply buy the best software they can find. Instead, it’s important to first look at the processes the technology will connect, and the people who will drive those processes. 

Sometimes what appears to be a shortfall in technology may instead be a gap in training. Other times, users may be locked into antiquated record-keeping and filing practices once developed to meet compliance requirements but never since reviewed. It may be tempting to purchase software that helps users process invoices three minutes faster but, if they still spend 20 minutes filing those invoices, the technology has addressed the wrong problem. 

Deploying technology should always start with an internal review which considers the people and process factors, the challenge a technology will address and what constitutes success. 

Is your business ready for change?

Readiness for change will make or break success. In client servicing technology, this extends to readiness among clients and other stakeholders. Readiness is not, however, merely an environmental factor outside your control – it is the result of effective planning, communication and stakeholder engagement. 

With smaller, more targeted technology, it’s tempting to think that change management won’t be a problem. The challenge might be so acute, and the solution so obvious, that it feels certain the technology will be adopted quickly and successfully. Resistance to change is, however, a powerful and often unexpected force. New technology, even when objectively beneficial, may be rejected for myriad reasons, such as the (perceived or actual) need for records in the “old” format, client reporting obligations, the time required to learn new systems or a simple fear of the unknown. 

A complex change management program won’t always be necessary, but it’s vital that some measures are taken to prepare stakeholders for the new technology. This may be as simple as a short tutorial with regular follow-up Q&A sessions to encourage uptake. The time and energy dedicated to change management should be commensurate with the expense (or expected benefit) of the technology itself. 

Commercial benefits

How commercial benefit will be realised from a technology is often overlooked. There is usually a vague expectation of benefit, tied to the belief that technology automatically increases efficiency and benefit is therefore inevitable. Measuring the benefit is, however, an important part of understanding whether you’re getting a return on your investment. Commercial benefits of some technology, such as client service tools which deliver a specific outcome, are tangible and easy to measure. It’s harder to track the benefits of technology that sits behind the scenes, incrementally making users more efficient. Measuring the benefits of technology such as AI, blockchain or robotic process automation may be harder still. 

Commercial benefits come in many forms, including those that aren’t immediately tangible. Will the technology make you more competitive? Will it generate direct revenue? Will it improve staff satisfaction? Will it increase profit margin or perhaps create new business opportunities? Will clients be charged for client-servicing technology, especially if it reduces billable hours? If not, how will you capitalise on the other benefits, such as increased competitiveness, client satisfaction or improved brand to make a return on your technology investment? 

Costs: Upfront, ongoing and hidden

Determining the true cost of a technology is sometimes far more complex than it seems. Costs vary greatly between businesses, but there are common factors across upfront, ongoing and hidden costs. Upfront costs could include direct licensing or software costs and the costs required to get your business ready for the technology. Ongoing costs could include monthly charges, support or maintenance fees. They should account for the growth of data requirements and new licences that could be required if the product succeeds. Hidden costs could include the need to retain a specialist skill set once the system is embedded, increased cybersecurity requirements or insurance costs. 

The total cost of some technology may change dramatically over time, especially if cost is tied to data growth, number of outputs (such as per-document or per-case charges) or number of users. Hidden costs are often overlooked entirely. Model a few different future scenarios (for example, varying numbers of users, volumes of data and numbers of cases) to understand how cost may change. This is particularly useful if you’re comparing products. 
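As a sketch, this scenario modelling can be as simple as a short script or spreadsheet formula. All pricing figures and the cost structure below (per-user licence, per-GB storage, per-case charge, flat support fee) are illustrative assumptions, not real vendor pricing.

```python
# Hypothetical cost model: every rate below is an assumed figure for
# illustration only; substitute your vendor's actual pricing structure.
def annual_cost(users, data_gb, cases,
                per_user=600, per_gb=2.4, per_case=15, support=5000):
    """Estimate total annual cost for one usage scenario."""
    return users * per_user + data_gb * per_gb + cases * per_case + support

# Model a few different future scenarios to see how cost scales
# if the product succeeds.
scenarios = {
    "today":         (10, 200, 150),
    "modest growth": (15, 500, 300),
    "success case":  (30, 2000, 900),
}

for name, (users, data_gb, cases) in scenarios.items():
    print(f"{name}: ${annual_cost(users, data_gb, cases):,.0f} per year")
```

Running the same scenarios against each shortlisted product makes per-unit pricing models directly comparable, which a single headline licence fee does not.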

Picking the right technology

As Albert Einstein reputedly said, if you cannot explain it simply, you don’t understand it well enough. This is especially true of legal technology products. If the functionality, outputs and value of the technology cannot be explained without jargon or buzzwords, there is probably more investigating to do. 

Picking the right technology, and comparing alternative products, is easier if you’ve effectively scoped business challenges and have a clear vision of success. It’s important to be objective and data-driven when reviewing technology. Where possible, review metrics and avoid emotive decisions based solely on product demonstrations. As with all products, technology vendors will emphasise the most marketable features and downplay weaknesses. 

For small or targeted technology, a simple checklist itemising the required features or capabilities will suffice. Prioritising the items in your checklist will help you pick between similar products. 

At the other end of the spectrum, reviewing AI or robotic process technology may feel like swimming in the dark. The benefits of such technologies are significant but it’s important to consider the risks, which may be commercial, legal or even reputational. “Learning” systems, which use AI to change behaviour over time, need to be reviewed for out-of-the-box functionality, effectiveness of learning models and resilience to bias. Issues resulting from machine learning and bias in retail products are routinely in the news; legal-specific products are not immune – seek expert guidance when necessary. 

Cybersecurity and information governance

Unfortunately, cyber-breaches are now commonplace, with law firms considered targets of choice by hackers. The legal profession reported 70 cybersecurity incidents to the Australian Cyber Security Centre in FY20.1 There have been several high-profile cyber-attacks affecting the profession and related services in the year since. Effective cybersecurity controls should be core requirements, not optional extras, for any modern technology. Technology that does not clearly define its cybersecurity posture should automatically be disqualified from further consideration. 

You should also clearly understand how the technology will use the data you input, where it will be stored, and who will have access to it. This is particularly significant for your client’s personal information, but also applies to information about you, your business and staff. 

While cyber-threats and compliance obligations are constantly changing, some common security considerations are set out below. 

  • What type of data will be stored in the system (client information, business information, personal details, financial details etc)?
  • How long will that data be kept?
  • Do you know your compliance obligations around the retention and security of data, including your obligations if that data is stored overseas?
  • If a system is client-service related, what happens to the data when a matter is closed or archived?
  • Can the system be protected by multi-factor authentication?
  • Who will have administration privileges (within your organisation and externally)?
  • Will the new system introduce a new “point of entry” and, if so, has that been protected?

The resources on the Australian Cyber Security Centre website are an excellent starting point for individuals and business.2

Starting small

A Proof of Concept (PoC) and pilot group are useful programs for new deployments. In “experimental” deployments, PoCs will help you better define requirements and success criteria. They should also help you assess value and identify the limits of the technology. There must be an impartial and clear stop/go review at the end of any PoC. It’s important to prevent momentum moving a project forward if there are still questions about business benefits or functionality. 

Pilot programs are useful when implementing technology for complex processes or larger groups. Picking the right pilot group is just as important as the functionality of the technology. The group should be ready for change and accept there will be roadblocks. They need to have the time to properly use the technology and provide structured feedback about benefits and issues. The best pilot groups resist the temptation to revert to the familiar and become champions of the new system. 

Accepting imperfection

Technology is rarely one size fits all and it’s very likely that version one of an implementation will not be perfect. New technology should address most of the requirements it was meant to, but there will be scenarios or specific users for which a solution remains elusive at first deployment. This is a challenging concept for a profession underpinned by accuracy and attention to detail, but it may be best to simply accept the initial imperfection of the system. 

It’s important to balance the benefit of solving the remaining issues against the inevitable cost and delay of deployment. “Business critical” technology should obviously be well tested and highly accurate. In other cases, the projected wait for a software upgrade or fix might make delaying a launch unpalatable. It’s also important to consider your stakeholders – can the imperfections (and subsequent inconveniences) be managed so that stakeholders don’t lose confidence in the new system altogether? 

The processes you’ve mapped, technology requirements you’ve gathered and the change and expectation management you’ve conducted will all be important in identifying the right imperfections to accept at launch. 

Measuring success

Once a technology is deployed, how do you know if it’s meeting expectations? Usage statistics, such as how often people are using the new system and for how long, are an early barometer of success. Studying usage by working group or demographic may even help you identify specific areas for refinement or more systemic operational issues.
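As an illustration, adoption by working group can be derived from nothing more than basic login data. The log format, group structure and names below are hypothetical.

```python
# Hypothetical sketch: compute an adoption rate per working group from
# a set of active users and a staff roster. Both inputs are invented
# examples; real systems would export these from usage logs and HR data.
from collections import defaultdict

def adoption_by_group(active_users, staff):
    """active_users: user ids seen this month; staff: {user id: group}."""
    active = defaultdict(int)
    total = defaultdict(int)
    for user, group in staff.items():
        total[group] += 1
        if user in active_users:
            active[group] += 1
    return {group: active[group] / total[group] for group in total}

staff = {"ana": "litigation", "ben": "litigation",
         "cai": "property", "dee": "property"}
active_users = {"ana", "ben", "cai"}

print(adoption_by_group(active_users, staff))
```

A group showing low adoption is a prompt for follow-up: it may signal a training gap, an unaddressed process difference or a more systemic operational issue.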

Process-mapping after new technology is deployed should demonstrate fewer steps, or less time per step or between steps. The value of user feedback cannot be overestimated, and feedback should be actively encouraged. It is a very good sign when users suggest meaningful improvements or identify further process issues around the technology. 

Success of client-servicing technology is often easier to measure than internal operations technology. In electronic discovery, improvements using technology assisted review can be demonstrated on a simple spreadsheet which tracks document review speed. Reviewing the success of client-servicing technology has another important benefit – an opportunity to celebrate success with your clients. 
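That “simple spreadsheet” calculation can equally be sketched in a few lines. The document counts and reviewer hours below are hypothetical figures for illustration, not benchmarks.

```python
# Hypothetical sketch: compare document review speed before and after
# technology-assisted review (TAR). All figures are invented examples.
def review_rate(documents, hours):
    """Documents reviewed per reviewer-hour."""
    return documents / hours

manual_rate = review_rate(documents=4000, hours=80)  # linear manual review
tar_rate = review_rate(documents=4000, hours=25)     # same set with TAR

print(f"Manual: {manual_rate:.0f} docs/hr, "
      f"TAR: {tar_rate:.0f} docs/hr, "
      f"{tar_rate / manual_rate:.1f}x faster")
```

Tracking the same metric matter after matter turns an anecdotal impression of improvement into a figure you can share when celebrating success with clients.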

Sophisticated deployments are most successful when the technology and internal processes are continuously reviewed and improved. Many tools now have monthly or quarterly development cycles where major improvements are drip-fed over the year, rather than in a single annual update. With such rapidly evolving technology, consider implementing an internal quarterly or six-month health check, where representatives from your user-base consider how product updates will be used in your business. Health checks will help realise the best value from the technology as it evolves. 


Technology is often rejected by users if it is introduced without measures to manage change, even when its benefits seem obvious. A tool may be introduced and widely adopted, only to deliver insufficient improvement or a questionable return on investment. These issues are often the result of a rush to find a technical solution to a non-technical problem. 

A successful technology deployment should be as much a project about people and processes as it is about the technology itself. Taking the time to map processes, consider requirements, identify success criteria, adequately manage expectations and clearly communicate with stakeholders will maximise the chances of success. ■

Roshan Kumaragamage is head of legal technology at MinterEllison. He advises on the use of technology in litigation and investigations, large transactions and operational efficiency using emerging technology. He was a founding member of the NSW Future of Law & Innovation Committee.

  1. ACSC Annual Cyber Threat Report, July 2019 to June 2020.
  2. Australian Cyber Security Centre website.

