Tesla (TSLA.O) on Tuesday won the first U.S. trial over claims that its Autopilot driver-assistance feature caused a death, a major victory for the automaker as it faces other lawsuits and government investigations over the same technology.
The verdict is Tesla’s second major courtroom win this year, with juries in both cases declining to find that its software was defective. Tesla has been testing and rolling out its Autopilot and more advanced Full Self-Driving (FSD) systems, which CEO Elon Musk has called crucial to the company’s future but which have drawn regulatory and legal scrutiny.
The outcome in civil court suggests Tesla’s core argument is resonating with jurors: drivers, not the technology, are ultimately responsible when something goes wrong on the road.
The civil lawsuit, filed in Riverside County Superior Court, alleged that Micah Lee’s Model 3 suddenly veered off a highway east of Los Angeles while traveling 65 miles per hour (105 kph), struck a palm tree and burst into flames, all in a matter of seconds.
The 2019 crash killed Lee and seriously injured his two passengers, including an 8-year-old boy. The trial featured graphic accounts of the passengers’ injuries, and the plaintiffs asked the jurors for $400 million in damages plus punitive damages.
Tesla denied liability, saying Lee had consumed alcohol before getting behind the wheel. The electric-vehicle maker also argued it was unclear whether Autopilot was engaged at the time of the crash.
The 12-member jury found the vehicle did not have a manufacturing defect, returning a 9-3 verdict after four days of deliberations.
Jonathan Michaels, an attorney for the plaintiffs, said they were disappointed by the verdict but noted in a statement that the trial “pushed Tesla to its limits.”
“The jury’s prolonged deliberation suggests that the verdict still casts a shadow of uncertainty,” he stated.
Tesla said its vehicles are well designed and make roads safer. “The jury reached the right conclusion,” the company said in a statement.
Tesla won an earlier trial in Los Angeles in April by arguing that, despite the names “Autopilot” and “Full Self-Driving,” it tells drivers that its technology requires human monitoring.
That case involved a Model S that swerved into a curb and injured its driver. Jurors told Reuters after the verdict that they believed Tesla had warned drivers about its system, and that driver distraction was to blame.
Bryant Walker Smith, a U.S. law professor, said the outcomes of both cases show that “our juries are still really focused on the idea of a human in the driver’s seat being where the buck stops.”
Matthew Wansley, an associate professor at Cardozo School of Law and a former general counsel of self-driving startup nuTonomy, said the Riverside case turned on claims specific to the car’s steering.
In other lawsuits, plaintiffs have alleged Autopilot is defectively designed in ways that allow drivers to misuse the system. In Riverside, by contrast, the jury was asked only to decide whether a manufacturing defect affected the steering.
Wansley said, “If I were a juror, this would make no sense to me.”
Tesla shares, which had risen more than 2% earlier in the session, closed up 1.76%.
During the Riverside trial, an attorney for the plaintiffs showed the jury an internal 2017 Tesla safety analysis that identified “incorrect steering command” as a defect, referring to an excessive steering-wheel angle.
A Tesla attorney countered that the analysis did not identify an actual defect but was intended to help the company address any problem that might arise with the vehicle. After the accident, Tesla developed a system to prevent Autopilot from executing the turn that caused the crash.
A plaintiffs’ attorney argued that Tesla branded its driver-assistance feature “Full Self-Driving” because it wanted people to believe its systems were more capable than they actually were, an assertion Tesla engineer Eloy Rubio Blanco rejected.
“Do I think our drivers believe that our cars can drive themselves? No,” Rubio said, according to a trial recording reviewed by Reuters.
The U.S. Department of Justice is investigating whether Tesla broke the law by claiming its cars can drive themselves. The National Highway Traffic Safety Administration has also been examining Autopilot’s performance after identifying more than a dozen crashes in which Tesla vehicles struck stopped emergency vehicles.
Sam Abuelsamid, an analyst at Guidehouse Insights, said Tesla’s disclaimers give the company strong defenses in civil cases.
“I think that anyone is going to have a hard time beating Tesla in court on a liability claim,” he said. “Regulators need to address this.”