Tesla wins first US Autopilot trial involving fatal crash


Tesla won the first US trial over allegations that its Autopilot driver assistant feature led to a death, a major victory for the automaker as it faces several other lawsuits and federal investigations related to the same technology.



The verdict is Tesla’s second big courtroom win this year in which a jury declined to find its software defective.

Tesla has been testing and rolling out its Autopilot and more advanced Full Self-Driving (FSD) system, which chief executive Elon Musk has touted as crucial to his company’s future but which has drawn regulatory and legal scrutiny.

The outcome in civil court shows Tesla’s arguments are gaining traction: when something goes wrong on the road, the ultimate responsibility rests with drivers.

The civil lawsuit filed in Riverside County Superior Court alleged the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour (105 km per hour), strike a palm tree and burst into flames, all in the span of seconds.

The 2019 crash killed Lee and seriously injured his two passengers, court documents show.

The trial involved gruesome testimony about the passengers’ injuries, and the plaintiffs asked the jury for US$400 million ($631 million) plus punitive damages.

Tesla denied liability, saying Lee consumed alcohol before getting behind the wheel.

The electric-vehicle maker also argued it was unclear whether Autopilot was engaged at the time of the crash.

The 12-member jury found the vehicle did not have a manufacturing defect, returning its verdict on the fourth day of deliberations by a 9-3 vote.

Jonathan Michaels, an attorney for the plaintiffs, expressed disappointment in the verdict but said in a statement that Tesla was “pushed to its limits” during the trial.

“The jury’s prolonged deliberation suggests that the verdict still casts a shadow of uncertainty,” he said.

Tesla said its cars are well designed and make the roads safer. “The jury’s conclusion was the right one,” the company said in a statement.

Tesla won an earlier trial in Los Angeles in April with a strategy of saying it tells drivers that its technology requires human monitoring, despite the “Autopilot” and “Full Self-Driving” names.

That case was about an accident where a Model S swerved into the curb and injured its driver, and jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and driver distraction was to blame.

Bryant Walker Smith, a University of South Carolina law professor, said the outcomes in both cases show “our juries are still really focused on the idea of a human in the driver’s seat being where the buck stops.”

At the same time, the Riverside case turned on a narrow question about the vehicle’s steering, said Matthew Wansley, a former general counsel of nuTonomy, an automated driving startup, and associate professor at Cardozo School of Law.

In other lawsuits, plaintiffs have alleged Autopilot is defectively designed, leading drivers to misuse the system. The jury in Riverside, however, was only asked to evaluate whether a manufacturing defect impacted the steering.

“If I were a juror, I would find this confusing,” Wansley said.

During the Riverside trial, an attorney for the plaintiffs showed jurors a 2017 internal Tesla safety analysis identifying “incorrect steering command” as a defect, involving an “excessive” steering wheel angle.

A Tesla lawyer said the safety analysis did not identify a defect, but rather was intended to help the company address any issue that could theoretically arise with the vehicle.

The automaker subsequently engineered a system that prevents Autopilot from executing the turn that caused the crash.

On the stand, Tesla engineer Eloy Rubio Blanco rejected a plaintiff lawyer’s suggestion that the company named its driver-assistance feature “Full Self-Driving” because it wanted people to believe the system had more capabilities than it actually did.

“Do I think our drivers think that our vehicles are autonomous? No,” Rubio said, according to a trial transcript seen by Reuters.

Tesla is facing a criminal probe by the US Department of Justice over claims its vehicles can drive themselves.

In addition, the US National Highway Traffic Safety Administration has been investigating the performance of Autopilot after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles.

Guidehouse Insights analyst Sam Abuelsamid said Tesla’s disclaimers give the company powerful defenses in a civil case.

“I think that anyone is going to have a hard time beating Tesla in court on a liability claim,” he said. “This is something that needs to be addressed by regulators.”
