Don’t Get Fooled by Deepfakes
3.4.2025

The technology used to create deepfakes is becoming increasingly sophisticated. With the click of a button, nearly anyone can falsify evidence – including photographs, video, and audio.
A panel hosted by the New York State Bar Association delved into what attorneys should know about the technology behind deepfakes, including generative artificial intelligence, as well as how legal professionals can protect digital evidence in court.
The speakers were:
- Justice Tanya R. Kennedy, associate justice of the Appellate Division, First Department, New York State Supreme Court and chair of the New York State Bar Association’s Judicial Section.
- Daniel Capra, Philip Reed professor of law at Fordham Law School in New York City.
- Maura Grossman, computer science research professor at the University of Waterloo in Ontario, Canada.
- Jerry Bui, CEO of Right Forensics in Plano, Texas.
Nearly 300 people attended the webinar. Justice Kennedy gave introductory and closing remarks.
Delving Into Deepfakes
Many of the tools used to create deepfakes are inexpensive and easily accessible.
“Generative AI is based on models trained on large amounts of text, images, and audio data,” said Bui. “And it’s the interaction of that prompt interface that really gave it popularity – much like how Google’s single search bar really gave us access to the World Wide Web at large.”
When there are many images, videos, and audio recordings of a person – as with a celebrity or a politician – AI models have more data from which to mimic how that person looks and talks. Bui described a case from last year in which a political consultant made fake robocalls impersonating President Joe Biden to discourage people from voting in the New Hampshire primary.
Some emerging deepfake technologies can create even more detailed false content. Bui showed a deepfaked video of himself – speaking languages he doesn’t speak.
“My likeness is deepfaked, my voice is deepfaked,” he said of the video. “Even the English output that you saw was scripted. But what the generative AI technology is able to do is also translate it. So that gives you an idea of how not only can it clone the voice, but it can also put it into different languages like French, German, and Japanese. So it can imbue a person with capabilities that they never actually had. And you could think about how that could be used fraudulently.”
There is also technology to fake eye contact in a live video, so that someone who is reading from a script and looking off to the side can appear as if they are looking directly at a camera.
Preventing Deepfakes in the Courtroom
One of the challenges of detecting deepfakes is that the detection technology itself can be tricked.
“Part of the problem is that you have to understand how these things are made,” said Grossman. “There are two algorithms that are competing against each other: one that generates content, and the other that discriminates content. And the discriminator gives [the generator] feedback, and as it gives it feedback, the generator gets better. So as your discriminator gets better, your generator gets better, so it’s very hard to build a discriminator that’s going to be better than your generator, because as soon as it gets really good, it’s used to better train the tool.”
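Grossman is describing the training dynamic behind generative adversarial networks. As a loose illustration only – a toy sketch, not a real neural network and not anything presented at the panel – the following Python snippet shows the feedback loop she describes: a “generator” improves whenever its forgeries are caught, and a “discriminator” tightens its standard whenever a forgery slips through. All names and numbers here are illustrative assumptions.

```python
import random

# Toy model of the adversarial loop: the generator tries to produce a
# value close to a "real" target, and the discriminator accepts any
# sample within a tolerance of that target.
TARGET = 10.0  # the "real" signal the generator is trying to imitate


def discriminator(sample: float, tolerance: float) -> bool:
    """Accept a sample as 'real' if it falls within the tolerance."""
    return abs(sample - TARGET) <= tolerance


def train(rounds: int = 200, seed: int = 0) -> tuple[float, float]:
    rng = random.Random(seed)
    guess, tolerance = 0.0, 5.0  # generator starts far off; detector is lax
    for _ in range(rounds):
        forgery = guess + rng.uniform(-1, 1)  # generator's noisy attempt
        if discriminator(forgery, tolerance):
            # Forgery passed: the discriminator tightens its standard.
            tolerance *= 0.95
        else:
            # Forgery caught: the generator moves closer to the target.
            guess += 0.5 * (TARGET - guess)
    return guess, tolerance


final_guess, final_tolerance = train()
```

After a few hundred rounds, the generator's output sits very close to the real target and the discriminator's tolerance has shrunk – each side's improvement has forced the other to improve, which is exactly why a detector rarely stays ahead of the generator it was trained against.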
Human experts can spot subtle details – a flat vocal tone, shadows that do not line up – to determine whether content is fake. But this, too, becomes harder as the technology gets better at imitating life.
The panelists suggested ways to protect digital evidence in the courtroom, including a standard for determining whether AI-generated evidence is valid and reliable by examining how it was made – its standards, process, results, and training data.
“Another thing that judges can do is just to stay informed,” said Justice Kennedy. “Like we’re doing today with respect to these Continuing Legal Education programs, just to stay informed about the technology. And to also work with experts to advise us on these issues – to have consultations with the experts.”
The program was presented by the association’s Judicial Section and the association’s Committee on Technology and the Legal Profession.