AI moves along ‘hype cycle’ to make its mark on legal profession

A few weeks from now, ChatGPT — the chatbot that propelled generative artificial intelligence into the mainstream — will celebrate its second anniversary.

The occasion will also provide a natural moment of reflection for the legal industry, which analysts singled out as having one of the business models most likely to be disrupted by the advent of the large language models that power generative AI tools.

However, as the generative AI “hype cycle” enters its trough of disillusionment, law firms are moving away from grand pronouncements to evaluating precisely how and where AI tools can be of most help. A report by consultancy Deloitte published this summer found that just 6 per cent of corporate clients believed they were benefiting from their law firms’ use of generative AI, yet more than 70 per cent said they expected the tech to result in cost savings for legal work and faster turnaround times.

Advocates of the integration of AI in law say such efficiencies are coming, if they are not already here.

Below are some key themes on which early adopters of the technology are now focusing:

Protecting data

Early on, law firms realised that a “one-size-fits-all” AI system would not work in a global industry that prizes privacy and confidentiality above all.

For a start, regulations mandate that some data must be kept locally, and the underlying models, such as OpenAI's, are increasingly unavailable in certain countries, such as China. Tara Waters, a partner at Ashurst until last month, says the tools required to determine which data can be used by a generative AI model, and where, do not "exist in any of the third-party platforms" — not even market-leading offerings such as Copilot, from Microsoft. For this reason, she described the firm as not "in a place where [it was] comfortable with Copilot".

Ashurst is still in a “holding pattern” with some clients that need reassurance on how data will be sequestered and where it will be located for compliance purposes.

David Wakeling, partner at A&O Shearman, says his firm has 1.5bn documents that large tech companies might have been tempted to feed into an algorithm en masse, but “we need to protect client data”. Hence, his firm is only rolling out the full stack of AI features for certain clients and certain tasks.

Once the AI is used for cross-border transfers, he says, “it gets very complicated”. In that scenario “we have to look at processing only in that jurisdiction in order to meet those requirements . . . that tech stack suddenly gets difficult because you now need cloud processing in the Middle East, or in Switzerland or Singapore”.

Hogan Lovells' bespoke AI tool, Craig, uses "auto-pseudonymisation", auto-encryption and zero data retention — meaning its algorithm rapidly deletes the information from which it has drawn its conclusions. Even so, partner Sebastian Lach says that, to comply with local rules, not all users will have access to a full suite of products.
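As a rough illustration of what pseudonymisation with zero retention can look like in practice, the Python sketch below swaps client names for random tokens before a prompt leaves the firm's systems, then restores them in the answer. It is a minimal sketch under stated assumptions: the entity list, token format and send_to_model stub are hypothetical, not a description of how Craig actually works.

    # Illustrative sketch only: the entity list, token format and send_to_model
    # stub are hypothetical; this is not a description of Craig's internals.
    import re
    import secrets

    def pseudonymise(text, entities):
        # Replace each known client name with a random token; the mapping stays local.
        mapping = {}
        for entity in entities:
            token = f"ENTITY_{secrets.token_hex(4)}"
            mapping[token] = entity
            text = re.sub(re.escape(entity), token, text)
        return text, mapping

    def restore(text, mapping):
        # Swap the tokens back into the model's answer before a lawyer sees it.
        for token, entity in mapping.items():
            text = text.replace(token, entity)
        return text

    def send_to_model(prompt):
        # Stand-in for a call to an external model; simply echoes the prompt back.
        return f"Summary of: {prompt}"

    prompt = "Summarise the indemnity agreed between Acme Holdings and Jane Doe."
    safe_prompt, mapping = pseudonymise(prompt, ["Acme Holdings", "Jane Doe"])
    print(restore(send_to_model(safe_prompt), mapping))
    # "Zero retention" here means the prompt and mapping are never written to disk.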

Reducing drudgery

While he admits his firm may not be using the “most sexy” of AI applications, San Francisco-based Cooley partner Peter Werner says some of the tools not “specific to the delivery of legal services” can deliver immediate time savings. For example, the firm, which has offices in London and Brussels, uses the technology to summarise long strings of emails and allow staff to catch up on missed communications more rapidly after returning from holiday, or before a meeting.

Lach, at Hogan Lovells, who co-leads its tech entity Eltemate, also concedes that functions such as compiling the first draft of a contract or composing an initial briefing document may not seem revolutionary. But he argues that they replace “the work that nobody likes to do” and “free up your brain to take care of the really important stuff . . . to create justice, good results, good law”.

Protecting jobs

Last year, when warning that 300mn jobs could be lost to generative AI, Goldman Sachs analysts identified law as one of the most exposed sectors. Yet there is no sign of mass lay-offs, only lawyers reporting incremental efficiencies. A&O Shearman says that, in some instances, it is seeing a 20 per cent to 30 per cent productivity gain — around seven hours per contract review — thanks to its in-house system ContractMatrix, which is used by 2,000 of the firm's lawyers. Overloaded associates are the heaviest users, at least for the moment, because they are in greater demand than ever.

As Waters predicts, it will be “a good three to five years before anyone in the industry actually has a clear view on how their business model might need to change”.

However, there is another concern beyond the long-term fear of staff cuts driven by AI efficiencies.

Cooley’s Werner worries that associates are “turning off their brain to create a first draft of a brief or the first draft of a contract”, and missing out on the in-depth training needed to think on their feet and advise clients. Thanks to its AI tools, Cooley staff can now form a company within minutes. But that is “just pressing buttons”, says Werner.

“How are we getting them to the point where, 15 years from now, they are a person who can teach a new generation how to be a great lawyer?”

In-house training

No single law firm — not even any of the sector's big beasts — can afford to build its own large language model, or to compete with heavily funded generative AI start-ups. But rather than settle for "out-of-the-box" solutions such as OpenAI's, many have decided on a middle ground: training algorithms on their own data sets and intellectual property. Lach says the key has been "giving the AI good books to read" — or literature that is tailored to its lawyers' and clients' needs.

At Hogan Lovells, the outcome is Craig, which can help users with everything from navigating regulatory updates to drafting a prospectus for an initial public offering. The system is used both internally and externally by many of the firm’s larger clients.

A&O Shearman — which, as Allen & Overy, began rolling out the generative AI tool Harvey in late 2022 — found that lawyers were primarily using it to cure “writer’s block”, says Wakeling, while steering clear of more sophisticated uses for fear of getting inaccurate results.

The firm built its ContractMatrix drafting tool partly to manage so-called hallucinations — the delivery of false or misleading information. The output from the tool contains hyperlinks back to the original, signed-off work that is being referenced, giving lawyers the ability to easily check facts and conclusions at the source.
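That pattern can be sketched simply: each suggested clause is drawn from a store of approved precedents and carries a link back to the signed-off document it came from, so the lawyer can verify the suggestion at source. The following Python sketch is purely illustrative; the Precedent record, the precedent bank and the URLs are assumptions, not ContractMatrix internals.

    # Purely illustrative: the Precedent record, the precedent bank and the URLs
    # are assumptions, not a description of ContractMatrix.
    from dataclasses import dataclass

    @dataclass
    class Precedent:
        doc_id: str
        title: str
        url: str    # location of the approved, signed-off source document
        text: str

    # A tiny in-memory precedent bank; in practice this would be the firm's own library.
    PRECEDENTS = {
        "IND-001": Precedent(
            "IND-001",
            "Standard indemnity clause",
            "https://precedents.example/IND-001",
            "The Supplier shall indemnify the Client against third-party claims...",
        ),
    }

    def draft_clause(doc_id):
        # Return a draft clause together with a hyperlink back to its approved source,
        # so the lawyer can check facts and conclusions against the original document.
        p = PRECEDENTS[doc_id]
        return f"{p.text}\n[Source: {p.title}, {p.url}]"

    print(draft_clause("IND-001"))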
