Lawyers Have a Duty To Understand the Implications of Artificial Intelligence

By Will Hughes

October 6, 2022

Artificial intelligence as a litigation tool, the ethical duties of attorneys using new technologies and a demonstration of a powerful new search engine called Merlin were the subjects of the second part of the New York State Bar Association's webinar, "Intro to Artificial Intelligence."

Sponsored by the NYSBA Committee on Technology and the Legal Profession and the Committee on Continuing Legal Education, the webinar included experts Ron Hedges, John Tredennick and Jerome Greco and was hosted by Emillee Sahli, founder of Sahli Law, whose practice is focused on technology and the law.

“Artificial Intelligence is a very technical field,” said Hedges, who is senior counsel at Dentons. “Lawyers should understand what AI is and how to use it.”

He cited an August 2019 resolution by the American Bar Association House of Delegates urging “courts and lawyers to address the emerging ethical and legal issues related to the usage of artificial intelligence in the practice of law.”

Hedges noted the ABA, aware of the increasing use of AI in litigation, has advised attorneys to be mindful of the ethical considerations of its use, including potential bias of the technologies and the security and competency of vendors.

Hedges offered two distinct definitions of AI:

According to Paul W. Grimm et al., "Artificial Intelligence is the hypothetical ability of a computer to match or exceed a human's performance in tasks requiring cognitive abilities, such as perception, language understanding and synthesis, reasoning, creativity and emotion."

Lauri Donahue's "A Primer on Using Artificial Intelligence in the Legal Profession," meanwhile, defines machine learning as "an application of AI in which computers use algorithms (rules) embodied in software to learn from data and adapt with experience."

Some examples of public use of AI include identification software in the workplace or public spaces like airports; security (cellphones and banking); law enforcement; retail; marketing and human resources, with new applications appearing almost every day.

"As members of the bar, we have a duty to be competent," said Hedges. "That includes keeping current with technology and understanding the risks and benefits of technologies.

"We're going to be dealing with these technologies, your adversaries may be dealing with these technologies, and you have to understand something: learn it yourself or associate yourself with someone who does."

Hedges cautioned that when dealing with AI tools or any other tools, “you have to think about how you communicate, how you store information and produce information in a way that maintains client confidences.”

He noted the traditional standards of competence and confidentiality apply to new and emerging technologies. “You have to be able to supervise the people who provide the technology and understand what someone is doing for us.”

John Tredennick, CEO of Merlin Search Technologies, echoed Hedges, saying, "Competence is essential to our profession. Competency in the tools we use is no different. Every lawyer has to be competent, but we rely on others to assist us when things get beyond our competence.

“It’s not much different when we talk about these advanced uses of technology but rather an extension of what we all learned in law school: There are times when you are going to associate with others to supplement your expertise.” Tredennick used the example of an attorney from one state enlisting the aid of an attorney in a different state to litigate a case in the latter’s jurisdiction.

Jerome Greco, supervising attorney, Digital Forensics Unit, Legal Aid Society, agreed, “The most important thing in terms of competency is knowing when you need to talk to someone else or bring someone else in.”

Greco described three attributes of “trustworthy” AI. According to Greco, trustworthy AI should be:

  • Lawful, complying with all applicable laws and regulations.
  • Ethical, ensuring adherence to ethical principles and values.
  • Robust, both from a technical and social perspective, since, even with good intentions, AI systems can cause unintentional harm.

Greco cited four principles from "Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems" to be considered by attorneys before adopting AI applications. These include effectiveness, competence, accountability and transparency.

Facial recognition is one such application of AI. Using databases of photos like mugshots and driver’s licenses, airport security systems, for example, attempt to identify individuals through biometrics, basically a mathematical model of features like the shape of the face or the distance between the eyes. Computers learn the model and scan huge volumes of video data, searching for matches, usually in milliseconds.
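The panel did not discuss implementation details, but the matching step described above can be sketched in a few lines. The face "templates" and the distance threshold below are invented for illustration; real systems use learned embeddings with hundreds of dimensions rather than three hand-picked measurements.

```python
import math

# Hypothetical biometric templates: each face reduced to a few numeric
# measurements (e.g., distance between the eyes, face-shape ratios).
# The values are invented for this sketch.
database = {
    "person_a": [0.62, 1.45, 0.88],
    "person_b": [0.59, 1.30, 0.95],
}

def euclidean(u, v):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def best_match(probe, db, threshold=0.1):
    """Return the closest enrolled identity, or None if nothing is near enough."""
    name, dist = min(((n, euclidean(probe, t)) for n, t in db.items()),
                     key=lambda pair: pair[1])
    return name if dist <= threshold else None

# A probe vector very close to person_a's template matches; a distant one does not.
print(best_match([0.61, 1.44, 0.87], database))
print(best_match([2.00, 2.00, 2.00], database))
```

Scanning video then amounts to repeating this comparison for every detected face against every enrolled template, which is why such systems can return candidate matches in milliseconds.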

In addition to facial recognition, Sahli identified other forms of AI found with increasing frequency in litigation and criminal proceedings. These include automated transcription of audio and video data; motion tracking and object recognition (like weapons or drugs); and automated redaction (to protect the identity of witnesses, for example).

"We are seeing increasingly noticeable amounts of audio and video from prosecutors and defense teams," said Greco, noting that it is difficult and time-consuming to go through hours of videotape, thousands of emails or hundreds of calls from correctional facilities. "Sometimes you are just looking for a particular segment or certain words."

While these adaptations can save time and reduce client expense, they are not without problems. “Facial recognition, determining who is in a video, has a long way to go,” he said.

Automated redaction shows promise for protecting confidentiality.

"Sometimes you have to suppress certain faces, license plates, computer screens, things in the background, which is problematic," said Greco, adding, "the trade-off is that automated redaction is not perfect. Manual redaction is more accurate but will take a lot longer."

Sahli, who is a criminal defense lawyer, raised the questions: "When do I know I need these tools? When will they be useful?" Attorneys, she said, need to understand what the technology can and cannot do, and what the best practices are for selecting the right tools for a case.

She described several new AI programs for attorneys but noted problems with even the latest technology; problems like poor recording quality or low lighting levels in videos.

"Video recognition, license plates and the like," said Tredennick. "If a human did it, it would be better but take much longer. That is a truthful statement, but our audience should know that it is often a mistake to think of human review of any type as a gold standard. We get tired. We think about sports scores. Studies show we miss things.

“The gold standard of document review was thought to be human eyes but research shows that is only 65% accurate. The human side is far from perfect in almost every instance.”

Tredennick demonstrated his company's Merlin search engine, which works much the way Pandora finds music tailored to an individual's tastes.

Merlin can accurately find specific information in large data sets in a fraction of a second.

“Criminal cases come packed with hundreds of thousands of documents. The volume of digital information is exploding. For 40 years we had only one tool: keyword search.

“Like human review, we thought it was the gold standard but Intel research found it doesn’t really work.”

Keyword search, according to Tredennick, "finds way too much of what you don't want and far too little of what you do want. It leaves you not knowing what you are missing."

Tredennick pointed out that language is imprecise and ambiguous; words carry many meanings. "When you search, you find documents that may have the term but not exactly what you are looking for."

He cited an example of a search of Florida emails having to do with the plight of the manatee. The search was complicated: Florida is not only the home of the endangered creature but also of Manatee County. While documents relating to the large, aquatic animals were relevant to the search, documents about Manatee County were not.

“The matter is further complicated by the many different ways to say the same term. And keyword searches do not find context. There are always issues with spelling, abbreviation, code words, missed documents and the like,” said Tredennick.

While Boolean search improved basic keyword search by incorporating the operators AND, OR and NOT, "it is still very unforgiving. It will run but bring back results that you do not want," said Tredennick.
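Boolean search of the kind Tredennick describes reduces to set operations over the documents each term matches. The toy document set below is invented, but it illustrates both the operators and the unforgiving exact-match behavior, using the manatee example from the webinar:

```python
# Minimal sketch of Boolean keyword search over an invented document set.
docs = {
    1: "manatee habitat protection in florida waters",
    2: "manatee county budget meeting minutes",
    3: "endangered species report florida",
}

def matching(term):
    """IDs of documents containing the exact term."""
    return {doc_id for doc_id, text in docs.items() if term in text.split()}

# "manatee AND florida" -> set intersection: only the animal document
print(matching("manatee") & matching("florida"))   # {1}

# "manatee NOT county" -> set difference: filters out Manatee County
print(matching("manatee") - matching("county"))    # {1}

# Unforgiving: a misspelling silently matches nothing
print(matching("manattee"))                        # set()
```

The last query shows the failure mode Tredennick warns about: the search runs without error, and the user never learns what was missed.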

Merlin is able to learn specifically what the user is searching for and rapidly parse the results. Poor or unsuitable results are eliminated through the use of a thumbs up or thumbs down command that continually refines the results.
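Merlin's internals are proprietary and were not described in the webinar, but the thumbs-up/thumbs-down loop is recognizable as classic relevance feedback (the Rocchio idea): liked documents pull the query's term weights up, disliked documents push them down. The documents and weights below are invented for illustration.

```python
from collections import Counter

def word_counts(text):
    return Counter(text.split())

def refine(query_weights, liked, disliked, alpha=0.5, beta=0.25):
    """Shift term weights toward thumbs-up documents, away from thumbs-down ones."""
    updated = Counter(query_weights)
    for doc in liked:
        for word, count in word_counts(doc).items():
            updated[word] += alpha * count
    for doc in disliked:
        for word, count in word_counts(doc).items():
            updated[word] -= beta * count
    return updated

def score(doc, weights):
    """Rank a document by the summed weights of its words."""
    return sum(weights.get(w, 0) for w in doc.split())

query = Counter({"manatee": 1.0})
liked = ["manatee habitat florida waters"]          # user clicked thumbs up
disliked = ["manatee county budget meeting"]        # user clicked thumbs down
weights = refine(query, liked, disliked)

# After one round of feedback, documents about the animal outrank county paperwork.
print(score("manatee habitat report", weights) >
      score("manatee county minutes", weights))     # True
```

Each round of feedback repeats this update, which is how the results are "continually refined" without the user ever writing a better keyword query.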

“You do not always have to know why it is working if you know how it works and what it can do for you,” said Tredennick.

Following the seminar, Sahli led the participants in a vigorous discussion of the problems, benefits and potential uses of AI by attorneys.

To order the video replay of this event, go here. 
