Do Robot Cars Dream of Legal Liability?

By: Mason Hudon

In the next decade, driverless cars will likely become commonplace on American streets. Promising reduced fatalities from car crashes, more time to engage in work or play for the occupants, and potentially more streamlined and efficient transit systems, so-called autonomous vehicles (AVs) have captured the minds and hearts of those who dream of a better tomorrow. Inevitably, however, AVs are not without their critics, and in the wake of serious pedestrian and operator casualties in places like Arizona, Taiwan, and Florida, legal professionals are burdened with the difficult question: whose fault is it when a vehicle without a driver causes death or injury and consequently provides grounds for a lawsuit in tort or criminal charges?

The answer, unfortunately, is complicated. For standard non-AV vehicles, liability for car accidents is relatively simple in both criminal and civil actions. The tortfeasor or lawbreaker is easily identifiable and charged; it is the person behind the wheel, actually operating the vehicle and making decisions that have led to serious consequences. In contrast, a driverless car accident may be due to operator error, automated vehicle error, or a combination of both. Because of these complexities, lawmaking concerning the topic has lagged behind tech innovations, and both prosecutors and civil lawyers representing plaintiffs have, up until now, taken the path of least resistance.

For example, in 2018, during routine testing in Maricopa County, Arizona, an Uber-owned autonomous vehicle operated by Uber contractor Rafaela Vasquez hit and killed a pedestrian walking her bike across a roadway at night. “In the six seconds before impact, the self-driving system classified the pedestrian as an unknown object, then as a vehicle, and then as a bicycle,” and ultimately relied on Vasquez to activate emergency brakes. By the time Vasquez, who had apparently been watching The Voice on her cellphone, was informed by the vehicle that a collision was imminent, it was far too late for her to react. Indeed, a National Transportation Safety Board report found that the vehicle warned Vasquez only 1.3 seconds before impact, and only then did the car’s software allow the driver to use emergency braking maneuvers, which had been deactivated for the majority of Vasquez’s trip that day. Vasquez was subsequently charged criminally under a theory of negligence, but many critics decried the fact that she was prosecuted for putting her trust in software that should have been programmed to handle the situation for her, with Jesse Halfon writing for Slate, “the decision to use criminal sanctions against only the backup driver in this case is legally, morally, and politically problematic.”

When asked why Uber was not charged in this situation, University of Washington School of Law Professor Ryan Calo opined that, in the absence of clearer guidelines for charging driverless car crimes, the prosecutors opted for a simple, relatable story that a jury could easily understand. “That’s a simple story, that her negligence was the cause of [the pedestrian’s] death,” Calo said, “[b]ring a case against the company, and you have to tell a more complicated story about how driverless cars work and what Uber did wrong.” Additionally, University of South Carolina Law Professor Bryant Walker Smith commented, “[…]I’m not sure it tells us much about the criminal, much less civil, liability of automated driving developers in future incidents.” The question thus remains: did Uber actually do something wrong here and “escape” criminal liability? It appears so.

According to the National Transportation Safety Board’s final report on the collision, while the vehicle operator was, in fact, the probable cause of the accident, “[c]ontributing to the crash were the Uber Advanced Technologies Group’s (1) inadequate safety risk assessment procedures, (2) ineffective oversight of vehicle operators, and (3) lack of adequate mechanisms for addressing operators’ automation complacency—all a consequence of its inadequate safety culture.” But, as Halfon argued, “[i]nstead of grappling with those nuances, [the prosecutors] appear to have elected to pursue an easy target in the name of hollow accountability.”

Many also question the sentiment, espoused by a Maricopa County prosecuting attorney, that a driver has “a responsibility to control and operate [a] vehicle safely and in a law-abiding manner” when the vehicle in question is, by design, driverless. The entire point of an AV is to allow the driver to disengage and attend to other matters like work, gaming, resting, or engaging with other passengers. Moreover, studies have shown that operators of driverless cars are prone to vigilance decrement, a significant decrease in alertness, after only 21 minutes of autonomous vehicle operation due to boredom and a lack of stimulation. If drivers must still worry about liability for car accidents even when they are not operating the vehicle, then the viability of AVs may be seriously called into question. A partial solution, however, may be found in product liability jurisprudence for tort actions.

As of 2015, “Florida, Nevada, Michigan, and the District of Columbia [had] enacted statutes limiting product liability actions against [Original Equipment Manufacturers (OEMs)] when the action [was] based upon a defect in an autonomous vehicle.” The statutes generally provide that “OEMs are not liable for defects in an autonomous vehicle if the defect was caused when the original vehicle was converted by a third party into an autonomous vehicle or if equipment installed by the autonomous vehicle creator was defective.” In the Maricopa County case, however, Uber was able to settle out of court with the decedent’s family and avoid criminal charges entirely, while its contractor seemingly took the fall for the criminal aspect of the accident, even though she was warned of the imminent collision only 1.3 seconds before the crash occurred.

The emergence of AVs on our roads will require not only greater guidance from legislatures in charging and holding companies like Uber accountable through statutes, but also a reevaluation of existing tort law and criminal liability standards. A 2014 Brookings Institution report focusing on long-run AV development asserted, “federal attention to safety standards for autonomous vehicles will be needed, and those standards will have [to have] liability implications.” Such federal safety standards may take the form of mandatory operator-accessible emergency brakes, a requirement that operators maintain their vehicles to a certain standard in order to use autonomous features, or even a rule permitting AV use only on certain qualified roads with few to no crosswalks. These potential standards, however, will undoubtedly have to be extensively tested before they can be implemented and incorporated into a federal regulatory scheme, likely requiring states to pick up the slack in the interim. Moving forward, states should consider situations like the one that occurred in Maricopa County, Arizona, and ensure that they are holding the right entity accountable, especially when it comes to criminal sanctions, rather than simply taking the easy way out.