
Recognizing the Risks of Facial Recognition Technology


In our increasingly connected world, facial recognition technology (FRT) is becoming ubiquitous. It is being used for everything from scanning passengers at airports to analyzing footage from bodycams worn by police officers. While FRT has many promising and beneficial applications, its use by private entities and law enforcement raises significant concerns: studies have shown it to be inaccurate, particularly when identifying women and people of color. According to a recent report, tests conducted by the ACLU found that a facial recognition software product incorrectly "matched" 27 professional athletes to mugshots in a law enforcement database. In this program, our expert panel will discuss current uses of FRT, the reports that call its accuracy and reliability into question, and the legal issues these findings raise.

The panel will examine the basis for claims that facial recognition technologies are biased and can unlawfully discriminate against legally protected groups, including women and people of color. They will evaluate how judges may respond when parties seek to introduce FRT-derived data into evidence, and how attorneys can be prepared to address, and advise their clients on, eliminating the unlawful bias that may result from the use of these technologies. The panel will also cover the debate over the use of facial recognition technology by law enforcement, by landlords, in schools, and in other settings; recent laws and legislative efforts to ban such use; and the potential impact on privacy and civil liberties.

Program Speakers

Gail L. Gottehrer, Esq. – Law Office of Gail Gottehrer LLC

Ronald J. Hedges – Dentons US LLP

Non-Member Price: $200.00
Published Date:
  • March 18, 2020
Format:
  • Online On-Demand
Product Code:
  • VGR44
Ethics and Professionalism Credit(s):
  • 2.0
Total Credit(s):
  • 2.0
Recording Length:
  • 2 hours