Tesla’s controversial Autopilot driver-assistance feature has prevailed in a California court. On Tuesday, October 31, a jury found that Autopilot was not at fault in a 2019 crash in Riverside County that killed the driver and seriously injured his wife and son. It is the second time this year that a jury has ruled that Autopilot was not responsible for a serious accident.
The lawsuit was filed by the crash’s two survivors, who alleged that an Autopilot malfunction caused Micah Lee’s Tesla Model 3 to veer off the highway at 65 mph, crash into a tree, and burst into flames. Lee died in the accident; his wife and their then eight-year-old son were seriously injured. The plaintiffs sought $400 million (roughly 9.3 billion Czech crowns) plus punitive damages.
Jury: Tesla is innocent!
The lawsuit accused Tesla of knowingly selling dangerous experimental software to the public and claimed that safety flaws in the system led to the accident. Specifically, the plaintiffs pointed to a steering defect that Tesla allegedly knew about. One of the survivors testified that Autopilot was engaged at the time of the crash.
From the beginning, Tesla denied that Autopilot had malfunctioned. The company responded to the lawsuit by noting that the driver had consumed alcohol before driving, although his blood alcohol level was measured at 0.05%, below California’s legal limit of 0.08%. As a further argument, Tesla maintained that even with Autopilot engaged, the driver remains responsible for operating the vehicle.
After four days of deliberations, nine of the twelve jurors sided with the carmaker. Similarly, in April of this year, a Los Angeles jury ruled that Autopilot was not responsible when Justine Hsu’s Model S went off the road in 2019 and struck a median, leaving her with facial injuries.
That earlier defense relied on the same argument: drivers remain responsible for what happens behind the wheel even when Autopilot or Full Self-Driving is engaged (despite what the latter system’s name suggests). Full Self-Driving was not publicly available at the time of Lee’s accident, although he had purchased it for $6,000 in the expectation that it would be released later.
Tesla can’t quite drive itself
Both of Tesla’s driver-assistance systems are classified as Level 2 on the SAE scale of driving automation, like most other such systems currently on the market. While Autopilot is designed for highway driving, Tesla’s FSD system can be activated in more situations than most competing systems allow. In no situation, however, does the car assume full responsibility for driving; that responsibility always rests on the driver’s shoulders.
Tesla currently faces several lawsuits over injuries and deaths involving its vehicles, many of which blame Autopilot or FSD. In one of them, Tesla went so far as to argue in court that recorded statements by Musk about self-driving “could have been deepfakes.”
The questions raised in these cases are interesting and difficult to answer because they pit legal liability against marketing materials and public perception. In its official materials, such as user manuals and the car’s own software, Tesla makes very clear that drivers remain responsible for their vehicle while using Autopilot.
Drivers must agree to this the first time the system is activated, but that may be the catch: people are accustomed to accepting lengthy agreements whenever they turn on a device or use a new technology, without carefully reading their contents.