Autonomous vehicle design involves an almost incomprehensible combination of engineering tasks, including sensor fusion, path planning, and predictive modeling of human behavior. But despite the best efforts to consider all possible real-world outcomes, things can go awry.

More than two and a half years ago, in Tempe, Arizona, an Uber “self-driving” car crashed into pedestrian Elaine Herzberg, killing her. In mid-September, the safety driver behind the wheel of that car, Rafaela Vasquez, was charged with negligent homicide.

Uber’s test vehicle was traveling at 39 mph when it struck Herzberg. Uber’s sensors detected her six seconds before impact, but the software dismissed the detection as a false positive. Uber’s engineers had tuned the software to be less sensitive to unidentified objects in order to achieve a smoother ride. Uber had also disabled the vehicle’s factory-installed automatic emergency braking system, which likely would have prevented the accident, so that it could accurately test the capability of its own automated driving system.

But Uber is not a defendant in this case. Prosecutors essentially have unchecked discretion over what criminal charges they file and against whom, and an Arizona county prosecutor previously declined to file criminal charges against Uber in Herzberg’s death. So while Vasquez may pursue a defense strategy of “putting Uber on trial,” the company will neither be a party to the criminal case nor play any active role in the court proceedings.

To be clear, Vasquez does not seem to be blameless. In-car video shows her looking down at what appears to be a cellphone prior to the collision. (The Tempe Police Department later confirmed that her phone was streaming an episode of The Voice at the time of the crash.) And while the road visibility conditions are still in dispute, Vasquez did not attempt to brake until after impact.

However, Vasquez claims she was not distracted and stated in an interview with the National Transportation Safety Board that she was monitoring the vehicle’s interface prior to the crash. Regardless, even having her cellphone on while the car was in operation was a violation of Uber’s safety protocols. And the NTSB ultimately found that the probable cause of the accident was the failure of the vehicle operator to monitor the driving environment and the operation of the automated driving system.

Still, the decision to use criminal sanctions against only the backup driver in this case is legally, morally, and politically problematic.

For one thing, the NTSB’s findings are at odds with the considerations that apparently drove the Maricopa County prosecutor’s charging decision. First, “probable cause” as the NTSB uses the term refers to the most immediate and proximate cause of an accident; it is not a statement of moral or criminal culpability. Further, the NTSB cited Uber’s inadequate safety risk assessment procedures, ineffective oversight of vehicle operators, and lack of mechanisms for addressing automation complacency as contributing factors. The NTSB report is clear that Elaine Herzberg’s death was in no small part a consequence of Uber’s inadequate safety culture.

The Arizona prosecutors’ charging decision ignores the complex set of contributing factors documented in the NTSB’s report. Instead of grappling with those nuances, they appear to have elected to pursue an easy target in the name of hollow accountability.

The lives of consumers and workers are increasingly defined by automated systems, from industrial robots to social media algorithms. It is therefore imperative that we critically evaluate who will bear moral and legal responsibility when humans and robots share control of complex systems that will inevitably malfunction. This case is a clear example of what anthropologist Madeleine Clare Elish calls the “moral crumple zone,” the phenomenon in which responsibility for an action is misattributed to a human actor who has limited control over the behavior of an autonomous system.

The decision to clear Uber of criminal liability in Herzberg’s death was no surprise. Local prosecution of corporate entities is exceedingly rare, particularly in situations like this one, where a single criminal act is hard to identify, let alone prove beyond a reasonable doubt. When corporations do face criminal sanctions, they are typically prosecuted by state attorneys general or federal authorities. Those cases often involve financial malfeasance, and the penalties almost always take the form of fines or internal policy changes.

In civil matters, employers are responsible for the acts of their employees under the doctrine of respondeat superior, a Latin phrase meaning “let the master answer.” Holding corporations vicariously liable for their employees’ actions derives from an old English common-law doctrine that held masters responsible for the actions of their servants.

Respondeat superior is based on the logic that employers benefit from the agency relationship and are therefore in a position to exercise control over how their employees conduct business. Perhaps more important in the context of civil lawsuits, corporations with “deep pockets” are able to pay monetary damages, whereas employees often lack the means to satisfy large money judgments, rendering those judgments meaningless. In this case, Uber rushed to negotiate a settlement with Herzberg’s heirs, reaching an agreement within days of her death.

But in criminal law, there is no real counterpart to the doctrine of vicarious liability. Corporations can’t go to jail. And it is often difficult to identify a single culpable actor in these cases, even when a criminal act has been committed. So when prosecutors need a target in the name of public accountability, it is the “servants” who often pay the price.

In justifying the decision to charge Vasquez, Maricopa County Attorney Allister Adel took a moral tone, focusing on the dangers of distracted driving. In a prepared statement, she said, “When a driver gets behind the wheel of a car, they have a responsibility to control and operate that vehicle safely and in a law-abiding manner.” But this rationale is dubious in light of previous actions and comments made by Arizona government officials.

Only days after the crash, Tempe Police Chief Sylvia Moir appeared to prematurely conclude that Uber was not at fault, calling the accident “unavoidable.” Without a full investigation having been completed, Moir instead shifted blame to the victim, stating, “It’s very clear it would have been difficult to avoid this collision in any kind of mode based on how [Herzberg] came from the shadows right into the roadway.”

Especially troubling in this context are the comments Arizona Gov. Doug Ducey made before Uber deployed its autonomous vehicle program in the state. In actively courting Uber to test its vehicles on Arizona’s “wide open roads,” the libertarian-leaning Ducey touted the state’s limited regulatory environment.

Notably, the NTSB determined that the Arizona Department of Transportation’s insufficient oversight of automated vehicle testing was among the causal factors leading to the Uber crash. Uber’s self-driving operation in Arizona has since been shut down, but the industry there is alive and well: Waymo recently announced that it would deploy the first-ever autonomous ride-hailing service in the suburbs of Phoenix.

The case against Vasquez sends a message that when robots and other automated systems fail, the corporations that deploy these technologies from a distance will be absolved while low-level employees who are physically closest to the equipment will suffer the consequences.

Vasquez was hired by Uber as a contractor. And while the world of driverless cars grabs big headlines and big investments, the job of a safety backup driver is not a sexy high-tech gig. Other former Uber backup operators have described the job as exhausting and boring, and said that workers were prone to complacency. Automation complacency is a well-documented phenomenon in which humans supervising a highly automated system naturally become less engaged and less attentive over time.

The fact that this was a first-of-its-kind event is all the more reason for prosecutors to have acted with caution. Legally speaking, the case will present unique issues. Otherwise common terms like “driving,” “operator,” and “control” will be subject to intense debate. In Arizona, criminal negligence requires that a person’s actions constitute “a gross deviation from the standard of care that a reasonable person would observe in the situation.” But the “reasonable person” standard is difficult to assess in a situation as unusual as this one, which few people can fully appreciate.

In various legal cases, individuals are designated to stand in on behalf of businesses or other entities and represent their interests. Now that Uber has settled its civil lawsuit, and the state of Arizona and the NTSB have completed their investigations into Herzberg’s death, Rafaela Vasquez is the last person left to blame. Regardless of the outcome of her criminal case, it’s fair to say that Uber’s self-driving software was not the only thing that failed here; the institutions around it failed as well.

Reprinted From:

https://slate.com/technology/2020/10/uber-self-driving-car-death-arizona-vs-vasquez.html