Who (or What) Is Liable for AI Risks?

By Vivian D. Wesson

The Growing Use of Artificial Intelligence Applications

On a clear, bright day in early May four years ago, Joshua Brown cruised down a Florida highway, relaxed behind the wheel of his Tesla Model S set to autopilot mode. As Mr. Brown approached a highway intersection, a white 18-wheel tractor-trailer was making a left-hand turn. Nearing the intersection, Mr. Brown’s Tesla did not reduce speed or attempt to stop; it proceeded at its current speed as if the intersection were clear. The collision resulted in Mr. Brown’s death, the first autopilot fatality in 130 million miles driven by Tesla vehicles.[1] Why? Although this Level 3[2] autonomous vehicle had performed safely under most driving conditions, its sensors failed to distinguish the large white truck from the bright spring sky. The Tesla did not “see” the truck and, therefore, continued its normal operation.

The risk of fatality from operating a motor vehicle is not new. In fact, the National Highway Traffic Safety Administration reported 36,560 motor vehicle deaths in 2018 alone.[3] Autonomous vehicles, though, pose a new risk because of the technology involved. In a driverless vehicle, whose duty is it to anticipate and prevent accidents: the auto manufacturer, the sensor producer, the software engineer, or the driver? Who should bear liability for this risk? Can these new risks be insured, and at what price?

Revolutions in computing power, distributed computing, data storage, and data science have fostered the next generation of artificial intelligence (AI) technologies, including natural language processing, robotics, blockchain, and the Internet of Things. Tesla’s autonomous car is really a platform of numerous experimental AI technologies. The error that caused Mr. Brown’s collision was as much a machine learning data training fault as a car accident.

The question of “Who is liable?” in the digital age is a good one and raises several insurance implications. For a self-driving car, is it the driver, the manufacturer, the provider of the training data, or any of the numerous academics, software engineers, programmers, universities, or open source libraries that contributed to the algorithm that is liable? In evaluating the new risks posed by digital technological marvels such as self-driving cars, insurance underwriters face a daunting challenge. On the surface, the risks seem very familiar, but upon closer inspection, they present themselves differently from technological, physical, social, and legal perspectives.

Autonomous Vehicles

Statistically, autonomous vehicles (AV) are safer than driver-controlled vehicles. Very few fatalities have been publicized to date.[4] According to a 2018 report from Axios, the lion’s share of self-driving vehicle accidents was caused by vehicles with human drivers at the wheel.[5] Notwithstanding the data, AV-related accidents are more sensationalized and perceived as riskier when a human driver is not controlling the vehicle.[6]

Sensationalism aside, the responsibility for risks that AVs impose – personal injury, death, and property damage – must rest with someone (the driver) or some party (the manufacturer). As AVs approach Level 5 autonomy, it has been suggested that human driver responsibility may be completely removed from the risk equation.[7] Could this signal a shift of liability to the AV manufacturer alone? Would the AV manufacturer now require private passenger auto insurance in addition to products liability and commercial general liability? What about the third-party owned sensors and software deployed in the AV or the AV’s satellite access? Does the AV manufacturer’s responsibility extend to those features, which would likely trigger a need for cybersecurity insurance as well?

In analyzing these exposures, insurers will have to consider the likelihood and magnitude of these risks. Contractual allocations of risk among the AV’s component owners will also need to be factored in. As with other new exposures (e.g., terrorism), regulation will become part of the equation.[8] New York’s AV law requires that: (i) a natural person holding a valid driver’s license be present in the AV while it is operated on any public highways; and (ii) any AV utilized in demonstrations and tests on public roads have in place at least $5 million of financial security.[9] “Financial security” as used here generally translates to some form of insurance, whether surety bonds, third-party insurance, or self-insurance.[10]

Robotics

In January 2016, the World Economic Forum predicted that more than five million jobs could be lost to robots in some developed and emerging economies by 2021.[11] A more recent Oxford Economics report projected that many “millions of additional manufacturing jobs are likely to be displaced by robots by 2030.”[12] However, job displacement due to automation is not limited to the blue-collar sector. Automation will impact white-collar occupations as well, including insurance sales representatives.[13]

Although “repetitive, standardized tasks of many current jobs will soon all be done by robots and algorithms,”[14] task automation will generate new jobs, some of which are not currently envisioned.[15] As the insurance industry evolves to overcome Baumol’s disease (the “difficulty of introducing automated methodologies to human-based intellectual activities[, which] has long been recognized as a root cause of stagnant productivity in many industries”),[16] it must also focus on new hazards that the robotic revolution generates. First, insurers will need to define what a “robot” means within an insurance policy – not an easy task when there is no set definition.[17] Second, the question of liability assessment arises – is each contributor (e.g., manufacturer, software designer, or operator) liable for any defective products manufactured? In addition, insurers will need to question whether the employer correctly followed instructions to install and operate the robot – raising the potential for employer’s liability claims for failure to address workplace safety.[18]

Additive Manufacturing & 3D Printing

One of the fastest growing digital technology fields involves 3D printing, also known commercially as “additive manufacturing.” The 3D printing process synthesizes a three-dimensional object by forming successive layers of material with the aid of a computer. This technology has been deployed in various industries, including automotive, aerospace, construction, food, and health care. For most manufactured goods, strict liability applies if the product is defectively designed or manufactured or if the manufacturer failed to warn the consumer about the risks of using the product. Would strict liability also apply to the manufacturer of a 3D-printed good? What about the manufacturer of the 3D printer itself? Is the designer of the CAD[19] file also liable? Should liability extend to the supplier of the material used to create the 3D-printed object?

Insurance risks can be substantial for those involved in additive manufacturing. Compounding the known risks, the foreseeable misuse doctrine can give rise to indefinite layers of product liability. Under the foreseeable misuse doctrine, a manufacturer is liable for the foreseeable uses of its products. With 3D printing, the range of foreseeable uses will undoubtedly evolve over time. Future litigation may even center on products that should not have been manufactured but for the capability of 3D printing.[20]

While 3D printing may democratize goods manufacturing, it may also lead to substantial disruption of retail manufacturing, resulting in extensive job loss for both manufacturers and laborers.[21] For the insurance industry, the increased risks from lack of regulation and product liability may necessitate a change in how these exposures are underwritten. On the flip side, having insureds with 3D printing capability could lead to a more efficient claims process if the policyholder can generate the parts needed for replacement or repair.[22] In any event, those involved in additive manufacturing should review their coverages for business interruption, cybersecurity, intellectual property, and workers’ compensation,[23] in addition to product liability.

Artificial Intelligence

Recent advances in AI include self-driving cars, drones, robotics, legged locomotion, autonomous planning and scheduling, machine translation, speech recognition, recommendations, game playing, image understanding, medicine, and climate science.[24] The umbrella term “artificial intelligence” means different things depending on the user. Academics and technologists may use the term “AI” when referring to machine learning algorithms, deep learning, neural networks, and/or generative adversarial networks.[25] At the 2016 World Economic Forum Annual Meeting in Davos, AI was labeled the “fourth industrial revolution.”[26] AI has indeed revolutionized almost every aspect of our lives – from education and research to travel and entertainment. AI, in its numerous manifestations, has transformed financial services and is beginning to transform insurance company operations, claims handling, underwriting, marketing, distribution, and sales.[27]

The current digital technological revolution and the products and services that rely on AI owe much of their genesis to increased computing power, distributed computing, the ubiquitous use of sensors, the explosion of available data, and a substantial decline in data storage costs – specifically, cloud storage.[28] The ethereal-sounding term “cloud storage” is a misnomer. “Cloud” computing and storage occur within a pool of land-based servers owned by a third party that are accessed through the internet.[29] While cloud storage has a number of benefits,[30] it also involves new risks for the cloud user and cloud provider. What happens if a data breach or denial of service attack occurs at the cloud computing site? Who should be liable if the cloud user is unable to access its data or applications?

These new exposures have challenged the insurance industry, especially when trying to evaluate the new cyber-related risks. The cyber peril is unique and presents an underwriting challenge. Hackers have arisen globally from nation states, cyber militias, criminal cartels, independent organizations, terrorist groups, and talented private individuals. They take advantage of unwary users, disgruntled employees, errors and faults in software code, software obsolescence, technology immaturity, and manufacturers’ inattention to cybersecurity. The hacker community also has access to “grey” or “black” organizations that rent, sell, and support an array of tools, products, and services that make hacking more effective. In addition, cyberattacks are fast, occur in real time, and are difficult to detect. In April 2020, SelfKey reported that at least 8 billion records, including “credit card numbers, home addresses, phone numbers and other highly sensitive information, have been exposed”[31] over the previous 15 months.

Exacerbating the issue, during the transition from the Third Industrial Revolution to the Fourth, insurance policies predicated on a 20th-century industrial and commercial business model did not contemplate the risks associated with a 21st-century digital world or “hacking” as a transnational catastrophic risk. Recently, some courts have held that insurance policies without explicit cyber coverage (or explicit cyber exclusions) cover claims for privacy and data breach – the “silent” or non-affirmative cyber insurance issue.[32]

In London, insurers have convened a panel to study “silent” cyber clauses in insurance contracts.[33] Regulators are also wading into this space, enacting cybersecurity and data privacy laws. The National Association of Insurance Commissioners recently adopted the Insurance Data Security Model Law.[34] In 2017, New York enacted the NYDFS Cyber Security Regulation.[35] Although the New York law does not specify insurance as required in the cybersecurity program of a “Covered Entity,” many entities will rely on insurance to assist in recovery from a cybersecurity event.[36]

Conclusion

“Technology is a threat and an opportunity, a rival and a partner, a foe and a friend.”[37] While self-driving cars may drastically reduce vehicular fatalities, the risk that these cars are subject to cyberattacks remains. Robots may greatly simplify and increase the production of goods but raise questions as to responsibility for potentially harmful products. Additive manufacturing can reduce dependence on imported goods but may spur unregulated weapon production, resulting in increased personal injury or death. Cloud storage may solve a company’s IT capacity constraints but exposes the company to ransomware attacks and potential data loss. As technology continues to develop, new and unforeseen risks will arise. For insurers, such new risks will continue to present novel underwriting challenges.

Vivian D. Wesson serves as Chief Intellectual Property Counsel to Marsh & McLennan Companies, for which she manages and protects the IP assets of its businesses, as well as advises on data strategy. Ms. Wesson is also Chair of the New York State Bar Association’s Committee on Attorney Professionalism and its Technology Subcommittee.


[1] Danny Yadron & Dan Tynan, Tesla driver dies in first fatal crash while using autopilot mode, The Guardian (Jun. 30, 2016).

[2] The Society of Automotive Engineers has created a six-level scale, from zero to five, to classify a vehicle’s autonomy, which scale has been adopted by the National Highway Traffic Safety Administration. See U.S. Dep’t of Transp., Automated Vehicles for Safety, https://www.nhtsa.gov/technology-innovation/automated-vehicles#topic-road-self-driving.

[3] Although this represents a 2.4 percent decline from 2017 fatalities, there have been back-to-back increases in U.S. roadway deaths in recent years. See U.S. Dep’t of Transp., Nat’l High. Traffic Safety Admin., 2018 Fatal Motor Vehicle Crashes: Overview (Oct. 2019).

[4] See Yadron, supra note 1; see also Sara Ashley O’Brien, Uber operator in fatal self-driving vehicle crash was likely streaming ‘The Voice,’ CNN (Jun. 22, 2018, 12:07 PM), https://money.cnn.com/2018/06/22/technology/uber-self-driving-crash-police-report/index.html and Janelle Shane, You Look Like a Thing and I Love You 58-59 (2019).

[5] Kia Kokalitcheva, People cause most California autonomous vehicle accidents, Axios (Aug. 29, 2018), https://www.axios.com/california-people-cause-most-autonomous-vehicle-accidents-dc962265-c9bb-4b00-ae97-50427f6bc936.html.

[6] See Complaint, Oscar Willhelm Nilsson v. General Motors LLC, No. 3:18-cv-00471 (N.D. Cal. 2018) (complaint involved a motorist struck by a Chevy Bolt Cruise Automation car, alleging negligence on the part of the vehicle and the driver); see also Geoffrey Wyatt, Manufacturers Won’t Bear All Liability For Driverless Vehicles, Law360 (Aug. 26, 2019, 3:40 PM), https://www.law360.com/articles/1191712/print?section=cybersecurity-privacy.

[7] See Peter H. Diamandis & Steve Kotler, The Future is Faster Than You Think: How Converging Technologies are Transforming Business, Industries, and Our Lives 184 (2020) (“If you’re riding in an autonomous car as a service, and there is no driver, do you need insurance?”).

[8] To date, twenty-nine states and Washington D.C. have enacted legislation related to AVs (states include Alabama, Arkansas, California, Colorado, Connecticut, Florida, Georgia, Illinois, Indiana, Kentucky, Louisiana, Maine, Michigan, Mississippi, Nebraska, New York, Nevada, North Carolina, North Dakota, Oregon, Pennsylvania, South Carolina, Tennessee, Texas, Utah, Virginia, Vermont, Washington, and Wisconsin). See Nat’l Conf. of State Legis., Autonomous Vehicles: Self-Driving Vehicles Enacted Legislation (Feb. 18, 2020), https://www.ncsl.org/research/transportation/autonomous-vehicles-self-driving-vehicles-enacted-legislation.aspx (last visited Apr. 9, 2020). See also U.S. Dep’t of Transp., Nat’l High. Traffic Safety Admin., Automated Driving Systems 2.0: A Vision for Safety, at ii (Sep. 2017) (guidelines introduced to “support the safe introduction of automation technologies”).

[9] S.B. 2005-C (N.Y. 2017). Note that, in New York, the minimum amount of liability coverage to operate a motor vehicle is: (a) $25,000 for bodily injury and $50,000 for death of one person in an accident; (b) $50,000 for bodily injury and $100,000 for death of two or more people in an accident; and (c) $10,000 for property damage; all considerably less than the $5,000,000 required for AVs.

[10] Id.

[11] James Manyika et al., McKinsey Global Inst., A Future That Works: Automation, Employment, And Productivity 29 (Jan. 2017). But see Oxford Economics, How Robots Change the World: What Automation Really Means for Jobs and Productivity 19 (Jun. 2019), http://resources.oxfordeconomics.com/how-robots-change-the-world?source=recent-releases.pdf (“the value created by robots across the economy more than offsets their disruptive impact on employment”) and Shane, supra note 4, at 220 (“[I]t’s unlikely that AI-powered automation will be the end of human labor as we know it. A far more likely vision for the future, even one with widespread use of advanced AI technology, is one in which AI and humans collaborate to solve problems and speed up repetitive tasks.”).

[12] Oxford Economics, supra note 11, at 21.

[13] Manyika et al., supra note 11, at 32.

[14] Anastassia Lauterbach & Andrea Bonime-Blanc, The Artificial Intelligence Imperative: A Practical Roadmap for Business 225 (2018).

[15] See Terrence J. Sejnowski, The Deep Learning Revolution 22 (2018) (“[A]s jobs that now require human cognitive skills are taken over by automated AI systems, there will be new jobs for those who create and maintain these systems.”).

[16] Richard D’Aveni, The Pan-Industrial Revolution: How New Manufacturing Titans Will Transform the World 91 (2018).

[17] Swiss Re, The robots are here: what that means for insurers (2017), https://www.swissre.com/dam/jcr:d0c55abb-3e1a-4bfd-8fe9-e6bf914a5184/2017_11_TechRobots_trend_spotlight.pdf.

[18] Id.

[19] “CAD” means “computer-aided design.” In 3D printing, the CAD files could be considered the entire product itself. See Tabrez Y. Ebrahim, 3D Printing: Digital Infringement & Digital Regulation, 14 Nw. J. Tech. & Intell. Prop. 37 (2016), https://scholarlycommons.law.northwestern.edu/njtip/vol14/iss1/2.

[20] Andy Crowder & Alex Fenner, What You Need to Know About 3-D Printing & Product Liability (Apr. 19, 2017).

[21] D’Aveni, supra note 16, at 169.

[22] Cindy Donaldson & Rick Morgan, How Will 3D Printing Impact Insurance?, IA Mag. (Aug. 23, 2017), https://www.iamagazine.com/strategies/read/2017/08/23/how-will-3d-printing-impact-insurance.

[23] In its 3D marketing report, insurance carrier Zurich noted that a 3D printing manufacturer could be exposed to workers’ compensation claims, because new materials in the 3D printing process, such as powdered metals like chromium and formaldehyde, present exposures to workers. Further, “high heat sources used in the process and toxic fumes that are being emitted from melting and decomposition of materials could also be sources of worker health issues and claims.” Zurich Am. Ins. Co., The Disruptive Technology of 3D Printing: Could It Disrupt Your Business Risk? 5 (2016), https://www.zurichna.com/-/media/project/zwp/zna/docs/kh/3dprinting/whitepaper_the-disruptive-technology-of-3d-printing.pdf?la=en.

[24] Stuart J. Russell & Peter Norvig, Artificial Intelligence: A Modern Approach 28-30 (4th ed. 2021).

[25] Shane, supra note 4, at 8. See also Sejnowski, supra note 15, at 135-137.

[26] Noa Gafni, Davos 2016: Where will the fourth industrial revolution impact us most?, Think at London Business Sch. (Jan. 27, 2016), https://www.london.edu/think/davos-2016-fourth-industrial-revolution.

[27] Lauterbach & Bonime-Blanc, supra note 14, at 32.

[28] Nick Polson & James Scott, AIQ: How People and Machines are Smarter Together 6 (2018).

[29] Amazon Web Services (AWS) describes its cloud storage offering as follows: “Cloud storage is a cloud computing model that stores data on the Internet through a cloud computing provider who manages and operates data storage as a service. It’s delivered on demand with just-in-time capacity and costs, and eliminates buying and managing your own data storage infrastructure. This gives you agility, global scale and durability, with ‘anytime, anywhere’ data access.” Amazon Web Serv., Inc., Cloud Storage, https://aws.amazon.com/what-is-cloud-storage/ (last visited Apr. 25, 2020).

[30] Id. (AWS notes the following benefits from cloud storage use: (1) lowering the total cost of ownership; (2) accelerating the time of IT solution deployment; and (3) centralizing information management).

[31] SelfKey, All Data Breaches in 2019 & 2020 – An Alarming Timeline, Blog (May 7, 2020), https://selfkey.org/data-breaches-in-2019.

[32] See Scott Godes et al., The Cloud: Selected Benefits, Risks and Insurance Coverage Issues, Cloud Computing 2018: Key Issues & Practical Guidance (Feb. 26, 2018).

[33] See Martin Croucher, Panel Studies ‘Silent’ Cyber Clauses in Insurance Contracts, Law360 (Mar. 16, 2020), https://www.law360.com/articles/1253507/print?section=insurance-uk.

[34] Nat’l Ass’n of Ins. Comm’rs & The Ctr. for Ins. Pol’y & Res., The NAIC Insurance Data Security Model Law, State Legis. Brief (Dec. 2019). To date, only 8 states have adopted this model law: Alabama, Connecticut, Delaware, Michigan, Mississippi, New Hampshire, Ohio, and South Carolina.

[35] See N.Y. Comp. Codes R. & Regs. tit. 23, § 500 (2017) for the New York Department of Financial Services’ Cyber Security Regulation.

[36] Id. at § 500.02.

[37] Daniel Susskind, A World Without Work: Technology, Automation, and How We Should Respond ch. 1 (2020) (ebook).
