Privacy vs. Security: The Legal Implications of Using Facial Recognition Technology at Entertainment Venues
6.10.2025


Yet as is always true with technology, there is the potential for misuse and abuse by both private companies and the government. In the United States, there is no federal regulation of biometric data technology, which includes facial recognition technology, and only a few state laws.[1] The New York State Legislature introduced a bill this session, the Biometric Privacy Act, that would require private entities to obtain informed consent before collecting, storing, or using biometric information.[2]
The potential uses of facial recognition technology are broad; it could be used to deny people access based on criminal history, where they work, or even their appearance. It has already been used for some of these things, even when doing so was not entirely ethical or legal. There are also increasing privacy concerns related to this technology, including undisclosed surveillance and the use of facial images after scanning.[3]
Background
Facial recognition technology has been around for some time. It was pioneered by Woodrow “Woody” Wilson Bledsoe in the 1960s,[4] who developed a system of measurements that classified faces in photographs and allowed an unknown face to be compared against existing photos.[5] Due to the complex nature and novelty of the technology, it was difficult to gather sufficient high-quality photographs to create the robust set of data needed to broadly recognize new faces.[6] From the beginning, law enforcement had a keen interest in developing and using the technology.[7] Declassified documents from the CIA mention projects at Bledsoe’s company, Panoramic, and there are letters between him and CIA agents.[8] It is widely rumored that funding for his research came from the U.S. government, but nothing has been confirmed.
Biometric data technology has progressed since the 1960s and saw a strong shift in the 2000s. Facial recognition technology became dinner table talk after the 2001 Super Bowl, cleverly nicknamed the “Snooper Bowl.”[9] Law enforcement at the Super Bowl in Tampa was using the technology to identify felons and terrorists in the crowd,[10] after which many called into question the use of biometric data and how it was putting everyone “in a police lineup.”[11]
Now, users of facial recognition technology employ computer-generated filters to transform the image of someone’s face into numerical expressions that are then used to determine similarity.[12] The basis of the technology is that everyone’s face is just a slight deviation from around 128 “standard” faces.[13] Police departments, schools, and businesses have all started using the technology for various functions.[14] It can be used to find suspects or missing persons, increase security measures, and take attendance at school.[15] A recent report found that more than half of adult faces in the United States are in a facial recognition technology database.[16] These photos do not come just from mug shots; more often, they come from various social media sites. Clearview AI has collected over 30 billion images from Facebook and other social media platforms to create its database.[17]
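The comparison step described above can be sketched in code. The sketch below is a simplified illustration, not any vendor’s actual pipeline; it assumes faces have already been reduced to short numerical vectors (“embeddings,” here only four numbers long rather than the roughly 128 dimensions real systems use) and matches a scanned face against a watch list by distance:

```python
import math

def euclidean_distance(a, b):
    """Distance between two face embeddings; smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(probe, gallery, threshold=0.6):
    """Find the watch-list entry closest to the probe embedding.

    Treat it as a match only if the distance falls under the threshold;
    the 0.6 cutoff here is illustrative, not an industry standard.
    """
    best_name, best_dist = None, float("inf")
    for name, embedding in gallery.items():
        d = euclidean_distance(probe, embedding)
        if d < best_dist:
            best_name, best_dist = name, d
    if best_dist < threshold:
        return best_name, best_dist
    return None, best_dist

# Toy 4-dimensional embeddings standing in for a real watch list.
gallery = {"alice": [0.1, 0.9, 0.3, 0.5], "bob": [0.8, 0.2, 0.7, 0.1]}
probe = [0.12, 0.88, 0.31, 0.52]  # a freshly scanned face
name, dist = is_match(probe, gallery)
```

Everything in the sketch, including the threshold and the toy gallery, is hypothetical; real systems tune the match threshold empirically, and that tuning is one place where the accuracy disparities discussed later can enter.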
The facial recognition technology industry was valued at $3.8 billion in 2020 and is only expected to grow within the next 10 years.[18] The technology has entered the health care, government, and entertainment sectors, all presenting their own legal and ethical challenges.
Swifties or Stalkers
In 2019, before Taylor Swift went on tour for her “Reputation” album, Rolling Stone magazine revealed that her security would be implementing facial recognition technology.[19] Although Swift’s security team alleged that its use was purely to prevent known stalkers from attending her shows, many people had concerns with the fact that the scanning was undisclosed: Swift’s attendees were being secretly recorded by a kiosk that showed videos of her rehearsal process to concert-goers.[20] If businesses wish to use facial recognition technology, many people feel that at a bare minimum attendees should be put on notice that the technology is being used at the venue. California’s law regarding biometric data was enacted shortly after Swift’s team used this technology at the Rose Bowl.[21]
As with any facial scan technology used to root out stalkers, there has to be an initial watch list. Because there is no set guideline for who makes the list and who does not, this can lead to bias. That said, Swift’s use of the technology has not led to a decline in attendance at her concerts. Quite the contrary, in fact: Swift earned the highest-grossing concert sales of any female artist, bringing in more than $1 billion.[22] It seems, therefore, that the public may be willing to give up some privacy for their favorite artists.
Safety is and should be a priority for large event spaces, not only for the artists but also for those in attendance. Facial recognition technology is a valuable tool for security teams and it can be used to reduce harm to the artist and fans, but many argue that there should be a balance between safety and privacy.
James Dolan and Madison Square Garden
Madison Square Garden owner James Dolan came under fire more recently for his use of facial recognition at his properties. One woman was denied access to Radio City Music Hall to see the Rockettes with her daughter because she worked at a law firm that was in active litigation against MSG and Dolan.[23] The woman was not the attorney actively working on the case; she just happened to work at the same law firm, and her photograph was included on the firm’s public website, which is likely why she was flagged.[24] Others have been denied access to Rangers and Knicks games because they were tagged as having been outwardly critical of Dolan’s ownership of the teams.[25]
More recently, MSG was sued by a group of attorneys who had been denied access to the venue.[26] The plaintiffs alleged that MSG had violated Civil Rights Law Section 40-b, which states in part:
No person, agency . . . corporation . . . of any place of public entertainment and amusement . . . shall refuse to admit to any public performance held at such place any person over the age of twenty-one years who presents a ticket of admission to the performance a reasonable time before the commencement thereof, or shall eject or demand the departure of any such person from such place during the course of the performance. . . The places of public entertainment and amusement within the meaning of this section shall be legitimate theatres, burlesque theatres, music halls, opera houses, concert halls and circuses.[27]
The appellate court declined to include sporting events in interpreting this statute.[28] The defendant’s motions to dismiss were granted, and the plaintiffs did not appeal.[29] In June 2023, MSG was again sued over its use of the technology by a Billy Joel concert attendee.[30] The plaintiff alleged that MSG violated both New York City and state law by using facial recognition technology without his permission and profiting off of the shared data.[31] The court ultimately denied MSG’s motion for summary judgment on the New York City biometric information privacy law claims in January 2024.[32] Later, in May 2024, the court granted MSG’s motion to dismiss.[33]
Many have criticized Dolan’s use of the technology, calling it “dystopian.” Some have been subsequently banned from his properties.[34] New York State Attorney General Letitia James has written to Dolan requesting documentation to justify his use of the technology.[35] The letter alleged that the technology violated New York City Human Rights laws and demanded that MSG turn over documents to show otherwise.[36]
Dolan continues to use this technology at all his venues, which include MSG, Radio City Music Hall, and The Sphere in Las Vegas. He also continues to actively defend its use publicly.[37] In response to the criticism, MSG sued the New York State Liquor Authority, which was attempting to revoke the liquor licenses of all of Dolan’s properties due to the use of biometric data.[38] The judge dismissed the lawsuit, but that did not stop Dolan. MSG went on to release a statement saying the “decision [will] not impact on our enforcement of our policy, which we will continue to vigorously defend.”[39]
Legality and Ethical Issues
Humans naturally have biases, and because algorithms are programmed by people and trained on data sets made up of people, those algorithms naturally inherit those biases. One of the major issues with facial recognition technology is its racial bias.
Alongside the ethical issues are the legal issues that have arisen over time because technology has rapidly outpaced law. Limited legal oversight allows companies to develop and implement new technology in business. This section will first discuss the algorithmic bias that exists in this technology, then detail different state laws that govern it, and finally look at how the Fourth Amendment could be used to protect against the use of facial recognition technology in law enforcement and by government bodies.
Algorithmic Bias
Much of the research regarding the biases of algorithms focuses on their usage in policing and criminal cases, but it still applies to the algorithms more generally. The bias issue first begins with their programming.
In a study titled The Gender Shades Project, facial recognition algorithms consistently yielded the poorest accuracy for darker-skinned females.[40] This is because most datasets used to train the algorithms are of white males.[41]
Another issue with the programming of the algorithm is that the tests and datasets do not account for real-world conditions.[42] The tests being run to determine the accuracy of facial recognition technology are conducted over a period of months to years.[43] In a real-life scenario, the technology only has a few seconds to scan a person’s face and identify it. Further, if companies are using photos with optimal lighting and a clear background for their datasets, then the algorithm will not be able to accurately identify a face that has a grainy background and poor lighting.[44]
Another problem is that the officers viewing the facial scans do not have the same level of training as an expert face examiner.[45] Only some law enforcement departments have dedicated training sessions for this technology.[46] Accordingly, there are essentially two levels of bias within its use: the algorithm itself and the people using it.[47]
Even though companies claim that the technology has an accuracy rate of about 98%, once those results are broken down by race and ethnicity, the numbers look very different. For example, a study found that the software misclassified Black women almost 35% of the time.[48] Real-world conditions, race, training, and bias all contribute to making this reported accuracy rate even lower. In 2020, Robert Williams, a Black man, spent 30 hours in a Detroit jail because facial recognition technology incorrectly identified him as a suspect.[49] There are a handful of other instances just like his in other states across the country.[50]
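The gap between a headline accuracy figure and per-group results is simple arithmetic. The sketch below uses hypothetical counts, chosen only to mirror the roughly 35% misclassification rate reported for Black women, to show how an aggregate number can mask very different error rates across groups:

```python
def error_rates(results):
    """Map each group's (misclassified, total) counts to an error rate."""
    return {group: wrong / total for group, (wrong, total) in results.items()}

# Hypothetical counts for illustration only; not data from any study.
results = {
    "lighter-skinned men": (1, 100),
    "darker-skinned women": (35, 100),
}
rates = error_rates(results)

# The blended figure hides the 35x disparity between the two groups.
overall = sum(w for w, _ in results.values()) / sum(t for _, t in results.values())
```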
State Law
There is no federal legislation related to facial recognition technology, and the states have been left to fill the gap.[51] Legislation at the state level mainly addresses biometric privacy.[52] In 2008, Illinois enacted the Biometric Information Privacy Act, the first law of its kind.[53] The main purpose of the statute was public safety and welfare in regard to privacy and sensitive information. The law requires notice to the person of the use of their biometric data and requires a written release from them.[54] It further restricts the sale of any person’s biometric data.[55] Illinois’s law is a great start for the regulation of facial recognition technology, something that other states and the federal government could model.
Two of the largest states involved in the entertainment industry, California and New York, have now created state laws that limit the use of biometric data, which may include facial recognition technology, depending on the language in the laws.[56]
California Consumer Privacy Act
In 2018, the California Consumer Privacy Act was enacted. It gives consumers more protection over the information that businesses can collect from them.[57] Under the CCPA, “biometric information includes, but is not limited to imagery of the . . . face.”[58] The plain language of the statute indicates that essentially any use of facial recognition technology would require a company to comply with the law. There are three different categories for businesses that are governed by the act. The first is businesses that have an “annual gross revenue in excess of [$25 million].”[59] Second are businesses that annually “buy[], sell[], or share[] the personal information of 100,000 or more consumers.”[60] Last are businesses that derive “50 percent or more of their annual revenues from selling or sharing consumers’ personal information.”[61] Based on these provisions of the law, the Legislature seems to be targeting larger businesses that inevitably encounter more of the public. Presumably, under the California Consumer Privacy Act, Swift’s security team’s use of facial recognition software would have been illegal, because it did not notify the attendees that their biometric data, in the form of a facial scan, was being collected.
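The three statutory thresholds described above can be restated as a simple either/or test. The sketch below is an illustration of the paragraph’s logic using the figures quoted from the statute; it is not legal advice and omits the statute’s full definitions:

```python
def ccpa_covered(annual_gross_revenue, consumers_data_traded, revenue_share_from_data):
    """Rough sketch of the CCPA's three alternative 'business' thresholds.

    Meeting any one of the three brings a business within the act's scope.
    """
    return (
        annual_gross_revenue > 25_000_000       # gross revenue in excess of $25 million
        or consumers_data_traded >= 100_000     # buys/sells/shares data of 100,000+ consumers
        or revenue_share_from_data >= 0.5       # 50%+ of revenue from selling/sharing data
    )
```

A large venue operator would typically clear the revenue prong alone, which is consistent with the observation that the Legislature appears to target larger businesses that encounter more of the public.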
Unlike Illinois’s law, the CCPA operates on an opt-out basis instead of requiring companies to obtain consent before collecting a faceprint.[62] The consumer can also request that their information be deleted, and the business must comply with that request.[63] This opt-out scheme has been criticized by privacy experts because it places the burden on consumers and is often ineffective.[64]
Businesses and other states have expressed concern over the California law, often criticizing its broad language. Based on the definition of “business,” businesses outside of the state of California can still be held liable if they collect or sell the private information of California residents.[65] Scholars argue that the law violates the dormant commerce clause by placing an undue burden on businesses outside of California.[66] The undue burden placed upon other states could easily be solved by a uniform national standard regarding biometric data privacy.
California’s law is one of the most expansive and cutting-edge laws regarding not just facial recognition technology, but biometric data in general. However, there is always room for more protection and coverage.
New York
The case of two MSG-owned venues illustrates how inconsistent state laws can create confusion and undermine consumer protection. Compare Radio City Music Hall in New York City with The Chicago Theater, an MSG Entertainment-owned property in Illinois. MSG cannot collect data at the Chicago venue the way it does at Radio City Music Hall, because Illinois requires written consent and MSG does not obtain it; no law in New York City or New York State imposes a comparable written-consent requirement. The lack of laws in some states and conflicting laws in others make it substantially more difficult for attendees to understand their rights.
New York State enacted the Stop Hacks and Improve Electronic Data Security Act in 2019.[67] The SHIELD Act requires businesses to safeguard the private information of residents, like their driver’s license or Social Security information.[68] However, SHIELD makes no mention of facial recognition technology.[69] Since 2019, many bills have been introduced that specifically address the technology but they have had little success. Currently, the only state law on the books involves the use of facial recognition in schools.[70]
New York City has taken further steps in protecting attendees at entertainment venues by enacting the New York City Biometric Identifier Information Protection Code.[71] This code requires commercial establishments to disclose the use of any biometric technology and prevents those same establishments from selling or sharing the information.[72] Moreover, the law provides that the chief privacy officer will engage in education and outreach to inform establishments who may be affected by this law and what is required of them.[73] Businesses can still collect biometric data and utilize facial recognition technology; they just cannot sell the information. However, this law does not go far enough, and businesses have taken advantage of just how little is required of them.
Dangers and Recommended Solutions
The central concerns with the use of facial recognition technology are privacy and security. With little to no regulation in place regarding its use at event venues, businesses can make use of it with few consequences. Realistically, how many people would stop going to events just because the venue uses facial recognition technology? Some artists have decided not to perform at venues that utilize it, but this may become unsustainable for artists as the technology becomes more common. Awareness of the potential future misuses of facial recognition technology is essential in the legislative process to ensure comprehensive laws. Businesses can also prevent misuse of facial recognition technology by limiting the situations in which it is used and allowing attendees to have their data deleted.
Potential Harms
Due to anti-discrimination laws, a business cannot deny entry to its venue based on an individual’s race, gender, disability, or other protected class. However, those laws do not protect against the racial bias that is embedded in facial recognition software. Letting businesses have unregulated access to such technology can still lead to discriminatory action taken by the businesses, whether they do it purposefully or not.
In the future, one’s face might be all that’s required for entry to events. In 2018, Ticketmaster stated that it envisioned replacing physical tickets with a facial scan.[74] Although there has been no further word on this plan, it does not seem that remote a possibility. The public has become more comfortable with being monitored constantly and companies are looking to take advantage of that. It is therefore even more important for the government to enact legislation before the technology becomes ubiquitous.
MSG has already provided the public with an example of how many people can be negatively affected by facial recognition technology when it is used to deny entrance to events. Reasonable and innocent people have been banned from attending events solely because of where they work.
Business Solutions
Despite the lack of laws, some companies have voluntarily decided not to allow their data to be used by state or federal governments.[75] Following the murder of George Floyd, IBM, Amazon, and Microsoft all announced that they would pause or end the sale of their technology to police.[76] Furthermore, some event venues have stated they would no longer be using facial recognition technology as a security measure.[77] Unfortunately, larger venues have been less receptive to dropping its use.
On a smaller scale, some artists have even refused to perform at stadiums and venues that utilize facial recognition technology.[78] Their reasons range from privacy concerns to the potential bias.[79] A boycott of venues was led by Fight for the Future, a digital rights advocacy group.[80] Again, however, the “big names” in music have not joined this boycott. As mentioned previously, Swift actively uses the technology at her concerts for security reasons.[81] To create change with businesses, these artists and advocacy groups have to hurt the pockets of the corporations. When the corporations can still guarantee that superstars will perform at their venues and bring in high revenues, there is no incentive for them to stop using the technology.
Legal Solutions
The immense potential and benefits of facial recognition technology cannot be overlooked. However, the negative consequences that we have already seen require an immediate response from lawmakers.
While some states have already acted against the use of facial recognition technology by the government and private businesses, a patchwork of different laws is plainly not as effective or comprehensive as federal regulation would be. At the very least, the government should create an agency that oversees the use of biometric data. The agency should ensure that technology systems are routinely inspected for potential biases based on the data they have collected. Furthermore, routine updates to the database of faces used is essential to decreasing algorithmic bias. Increasing the amount of data in the software will also decrease racial bias because the more faces the algorithm can see, the better it is going to be at accurately identifying one individual face. Regulating the government’s use of facial recognition technology is beneficial, but it is not enough to protect citizens if private businesses are not similarly regulated.[82] There is a “tendency of governments to utilize private surveillance technology networks when a state-sponsored one is unavailable.”[83]
Conclusion
As this technology has evolved, it has raised many legal and ethical issues and will continue to do so in the future. Facial recognition technology is ripe for misuse by businesses, and in the U.S., we are not protected from these companies collecting and using this data without consumers’ knowledge. The federal government has left regulation largely up to the states, which has created a patchwork of laws. The entertainment industry has seen a wide range of ways that the technology can be used, and this usage will continue to grow. As the industry changes and evolves, so will the technology.
Kylie Ruff is a 2025 graduate from St. John’s University School of Law. After law school, Kylie hopes to work in trademark infringement litigation. This article is the Phil Cowan/Judith Bresler Memorial Scholarship Award Winner, a writing award sponsored by the Entertainment, Arts and Sports Law Section. This article appears in an upcoming issue of the Entertainment, Arts and Sports Law Journal. For more information, please visit NYSBA.ORG/EASL.
Endnotes
[1] Sarah Chun, Facial Recognition Technology: A Call for the Creation of a Framework Combining Government Regulation and a Commitment to Corporate Responsibility, 21 N.C. J.L. & Tech. 99, 101 (2020).
[2] S1422. Bill was reported and committed to the Internet and Technology Committee on May 6, 2025. https://www.nysenate.gov/legislation/bills/2025/S1422.
[3] Hafiz Sheikh Adnan Ahmed, Facial Recognition Technology and Privacy Concerns, ISACA (Dec. 21, 2022), https://www.isaca.org/resources/news-and-trends/newsletters/atisaca/2022/volume-51/facial-recognition-technology-and-privacy-concerns#:~:text=The%20most%20significant%20privacy%20implication,that%20are%20not%20lawfully%20constructed.
[4] Shaun Raviv, The Secret History of Facial Recognition, Wired (Jan. 21, 2020), https://www.wired.com/story/secret-history-facial-recognition/.
[5] Id.
[6] Id.
[7] Id.
[8] Id.
[9] Biometrics Used To Detect Criminals at Super Bowl, ABC News (Feb. 13, 2001), https://abcnews.go.com/Technology/story?id=98871&page=1#:~:text=T%20A%20M%20P%20A%2C%20Fla.%2C%20Feb.,a%20new%20kind%20of%20surveillance.
[12] James Andrew Lewis and William Crumpler, How Does Facial Recognition Work?, Center for Strategic and International Studies (June 10, 2021), https://www.csis.org/analysis/how-does-facial-recognition-work.
[13] Susan McCoy, O’Big Brother Where Art Thou?: The Constitutional Use of Facial Recognition Technology, 20 J. Marshall J. Computer & Info. L. 471, 477 (2002).
[14] See Thorin Klosowski, Facial Recognition Is Everywhere. Here’s What We Can Do About It, N.Y. Times (July 15, 2020) https://www.nytimes.com/wirecutter/blog/how-facial-recognition-works/.
[15] See id.
[16] Katie Evans, Half of All American Adults Are in a Police Face Recognition Database, New Report Finds, Georgetown Law (Oct. 18, 2016), https://www.law.georgetown.edu/news/half-of-all-american-adults-are-in-a-police-face-recognition-database-new-report-finds/.
[17] See Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, N.Y. Times (Jan. 18, 2020), https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html.
[18] Max Zahn, Controversy Illuminates Rise of Facial Recognition in Private Sector, ABC News (Jan. 7, 2023) (stating the industry is expected to reach $16.7 billion by 2030), https://abcnews.go.com/Business/controversy-illuminates-rise-facial-recognition-private-sector/story?id=96116545.
[19] Gabrielle Canon, How Taylor Swift Showed Us the Scary Future of Facial Recognition, The Guardian (Feb. 15, 2019), https://www.theguardian.com/technology/2019/feb/15/how-taylor-swift-showed-us-the-scary-future-of-facial-recognition.
[20] Stefan Etienne, Taylor Swift Tracked Stalkers With Facial Recognition Tech at Her Concert, The Verge (Dec. 12, 2018), https://www.theverge.com/2018/12/12/18137984/taylor-swift-facial-recognition-tech-concert-attendees-stalkers.
[21] See Evan Ringel & Amanda Reid, Regulating Facial Recognition Technology: A Taxonomy of Regulatory Schemata and First Amendment Challenges, 28 Comm. L. & Pol’y 3, 3 (2023).
[22] Dustin Nelson, Taylor Swift’s Eras Tour Is First To Ever Make $1 Billion, Entertainment Weekly (Dec. 11, 2023), https://ew.com/taylor-swift-eras-tour-first-to-ever-make-1-billion-8413474#:~:text=Pollstar%20announced%20on%20Friday%20that,Making%20it%20even%20more%20impressive%3F.
[23] Patrick Reilly and Bernadette Hogan, NYC Lawmakers Introduce Bill To Ban Businesses From Using Facial Recognition Tech, New York Post (April 12, 2023), https://nypost.com/2023/04/12/nyc-lawmakers-introduce-bill-to-ban-businesses-from-using-facial-recognition-tech/.
[24] Id.
[25] Id.
[26] Hutcher v. Madison Sq. Garden Entertainment Corp., 2022 N.Y. Misc. LEXIS 8658.
[27] NY CLS Civ R § 40-b.
[28] See Hutcher at 573-74 (clarifying that MSG is a multi-purpose venue and will only fall under Civil Rights Law § 40-b when “it is being used for an enumerated purpose.”).
[29] See id. at 435.
[30] See Class Action Complaint (“Compl.”) ¶ 32, Aaron Gross v. Madison Square Garden Ent. Corp., Index. No. 651533/2023, NYSCEF Doc. No. 1.
[31] Id.
[32] Gross v. Madison Square Garden Ent. Corp., 23-CV-3380, 2024 U.S. Dist. LEXIS 4904, *30 (S.D.N.Y. Jan. 9, 2024).
[33] Gross v. Madison Square Garden Ent. Corp., 23-CV-3380, 2024 U.S. Dist. LEXIS 83102, *4 (S.D.N.Y. May 7, 2024) (“Nothing about “the way” that defendant benefits from the data sharing at issue here places it — in contrast to any other company’s sharing of biometric data — within the sweep of activity proscribed by Section 22-1202(b).”).
[34] Zach Williams, MSG Spins Dolan’s Facial Tech Meeting With NY Liquor Board as Voluntary – After Receiving Subpoena, New York Post (updated Feb. 9, 2023), https://nypost.com/2023/02/08/james-dolan-sits-with-nys-liquor-official-over-msgs-facial-recognition-policy/.
[35] Id.
[36] NYC Administrative Code 8-107.
[37] Fred Katz, James Dolan Doubles Down on Use of Facial Recognition at MSG in Latest Interview, The Athletic (Jan. 27, 2023), https://theathletic.com/4132393/2023/01/27/james-dolan-msg-facial-recognition-wfan/.
[38] See generally Matter of Madison Sq. Garden Entertainment Corp. v. New York State Liq. Auth., 221 A.D.3d 536.
[39] Linda Schmidt, Judge Throws Out MSG’s Lawsuit Against NY State Liquor Authority, Fox 5 New York (April 5, 2023), https://www.fox5ny.com/news/judge-throws-out-msgs-lawsuit-against-ny-state-liquor-authority.
[40] Alex Najibi, Racial Discrimination in Face Recognition Technology, Science in the News Blog, Special Edition: Science Policy and Social Justice (Oct. 24, 2020), https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/.
[41] Id.
[42] Clare Garvie, A Forensic Without the Science: Facial Recognition in U.S. Criminal Investigations, Geo. L. Ctr. on Privacy & Tech. 1, 15-16 (Dec. 6, 2022).
[43] Id. at 26.
[44] Id. at 10.
[45] Id. at 25.
[46] Id.
[47] Id. at 28. (“The presence of cognitive bias should be assumed, particularly in an investigative technique that relies on subjective judgment to the extent that face recognition does, as well as lacks any standardized user controls or protocols.”).
[48] Kade Crockford, How Is Face Recognition Surveillance Technology Racist?, ACLU (June 16, 2020), https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist.
[49] Kashmir Hill, Facial Recognition Led to Wrongful Arrests. So Detroit Is Making Changes., The New York Times (June 29, 2024), https://www.nytimes.com/2024/06/29/technology/detroit-facial-recognition-false-arrests.html#:~:text=Williams%20said.-,Mr.,she%20was%20eight%20months%20pregnant (Mr. Williams, represented by the ACLU, later sued the Detroit Police Department. As part of a settlement, the police department vowed to do better.).
[50] See id. (“The others were in Louisiana, New Jersey, Maryland, and Texas.”).
[51] U.S. Commission on Civil Rights, The Civil Rights Implications of the Federal Use of Facial Recognition Technology (September 2024), https://www.usccr.gov/files/2024-09/civil-rights-implications-of-frt_0.pdf.
[52] Evan Ringel & Amanda Reid, Regulating Facial Recognition Technology: A Taxonomy of Regulatory Schemata and First Amendment Challenges, 28 Comm. L. & Pol’y 3, 3 (2023).
[53] 2007 ILL. ALS 994, 2007 Ill. Laws 994, 2007 ILL. P.A. 994, 2007 ILL. SB 2400.
[54] Id.
[55] Id.
[56] Cal Civ Code § 1798.140; NY CLS STATE TECHNOLOGY LAW § 106-b.
[62] Katja Kukielski, Developments in the Law: The First Amendment and Facial Recognition Technology, 55 Loy. L.A. L. Rev. 231, 275 (2022).
[64] See supra note 52 at 18.
[65] See generally Kiran K. Jeevanjee, Nice Thought, Poor Execution: Why the Dormant Commerce Clause Precludes California’s CCPA From Setting National Privacy Law, 70 Am. U. L. Rev. F. 75 (2020).
[66] See id. at 80 (arguing CCPA “violates the dormant commerce clause because it imposes an undue burden on non-California-based businesses by compelling them to comply with stringent requirements before collecting or processing California citizens’ personally identifiable information.”).
[67] 2019 N.Y. ALS 117, 2019 N.Y. Laws 117, 2019 N.Y. Ch. 117, 2019 N.Y. SB 5575.
[70] NY CLS STATE TECHNOLOGY LAW § 106-b.
[71] NYC Administrative Code 22-1202(b).
[73] NYC Administrative Code 22-1205.
[74] Jacob Kastrenakes, Ticketmaster Could Replace Tickets With Facial Recognition, The Verge (May 7, 2018), https://www.theverge.com/2018/5/7/17329196/ticketmaster-facial-recognition-tickets-investment-blink-identity.
[75] Patrick K. Lin, How To Save Face & The Fourth Amendment: Developing an Algorithmic Auditing and Accountability Industry for Facial Recognition Technology in Law Enforcement, 32 Alb. L.J. Sci. & Tech. 189, 190-91.
[76] Kade Crockford, How Is Face Recognition Surveillance Technology Racist?, ACLU (June 16, 2020), https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist.
[77] Ethan Millman, Tom Morello, Zack de la Rocha, and Boots Riley Boycotting Venues That Use Face-Scanning Technology, Rolling Stone (June 22, 2023), https://www.rollingstone.com/music/music-features/tom-morello-zack-de-la-rocha-facial-recognition-concerts-boycott-1234775909/ (The House of Yes in Brooklyn, NY, the Lyric Hyperion in Los Angeles, CA, and Black Cat in Washington D.C. all announced they will stop using FRT.).
[79] Charlie Sorrel, Why Some Artists Boycott Venues That Use Facial Recognition Tech, Lifewire (June 28, 2023), https://www.lifewire.com/artists-boycott-venues-over-facial-recognition-7554471.
[81] Ethan Millman, Taylor Swift’s Eras Tour Is the Highest-Grossing of All Time and First-Ever To Hit $1 Billion, Rolling Stone (Dec. 8, 2023), https://www.rollingstone.com/music/music-news/taylor-swift-eras-tour-highest-grossing-all-time-1-billion-1234921647/.
[82] See supra note 52 at 37.
[83] Id.