
Technology, Government, Civil Litigation

Jun. 1, 2022

Self-driving car expectations on a collision course

While other automakers have had lawyers draw up carefully hedged names that call attention to the fact that the car won’t drive itself, Tesla’s Autopilot seems like a bold declaration by comparison.

Miguel A. Custodio

Partner, Custodio & Dubey LLP

Phone: (213) 593-9095

Email: custodio@cd-lawyers.com

UCLA School of Law; Los Angeles, CA

Miguel has dedicated his legal career to pursuing justice on behalf of people physically or mentally injured by faulty or defective products or by negligence, as well as to family law matters.

We have seen legislators and regulators let technology outpace the law time and again. As deaths and lawsuits related to autonomous vehicle malfunctions and misuse rack up, they seem once again content to hope the system works itself out.

We saw it with the Copyright Act of 1976, a still largely unaltered law under which music rights have become a bizarre and creatively stifling mess. We saw them shrug from the sidelines as internet access, and then the internet itself, were monopolized by a handful of companies.

Damaging as their inaction was in those cases, a hands-off approach to the issue of cars that drive themselves has already cost lives and livelihoods. Left unchecked, it will cost countless more because the technology remains rife with problems and is still a long way from full autonomy.

We are already seeing some devastating consequences. Prosecutors recently charged Los Angeles resident Kevin Riad with two counts of vehicular manslaughter in connection with a deadly 2019 Tesla crash. The car’s Autopilot feature was engaged when the vehicle ran through a red light and killed two people, making him the first person in the United States to be charged for a fatal crash connected to the use of a partially automated driving system.

He likely will not be the last, as he certainly isn’t the first to seemingly overestimate the car’s autonomous capabilities. You can easily find videos and stories about drivers causing accidents while over-relying on Autopilot – many featuring shocked owners who seemed to believe their partially automated driving system was a genuine “auto-pilot.”

While other automakers have had lawyers draw up carefully hedged names that call attention to the fact that the car won’t drive itself, Tesla’s Autopilot seems like a bold declaration by comparison. The company has aggressively pushed to sell its vehicles based on that feature alone.

The National Highway Traffic Safety Administration recently announced a federal investigation into a May 12 crash along Pacific Coast Highway in Newport Beach in which three on-site workers were killed by a Tesla using the Autopilot feature. Investigators have been sent to 34 crashes since 2016 in which autonomous driving systems were suspected of being in use, with over 80% of those reported incidents involving Tesla vehicles.

Tesla has faced a slew of “recalls” in recent months to address serious issues. Many of these recalls stem from software that Tesla can quickly disable or address with another update, meaning in most cases there is no actual physical recall. These “soft recalls” amount to a relative slap on the wrist, and they likely have not done much to dissuade the company from rolling out new software that puts its customers in the role of beta testers.

The case of Riad, who has pleaded not guilty, will likely set groundbreaking precedent on civil and criminal liability in autonomous vehicle crashes. The fact that he was charged at all is interesting in its own right, as similar incidents involving negligent drivers of non-autonomous vehicles often do not produce criminal charges, let alone a manslaughter charge. It is likely the driver will bear the lion’s share of liability in court, absent some proof of a malfunction that prevented him from retaking control of the vehicle in an easy, accessible way.

In California, automaker liability essentially boils down to what the customer was seemingly promised. Tesla may be held liable, for example, if it is found that Riad’s overconfidence in the autonomous driving function was directly attributable to Tesla advertising that overstated the car’s self-driving capabilities.

There is a reason other automakers have opted for more guarded names along the lines of “Enhanced Cruise Control” and “Driver Assist.” Namely, the technology is not there yet, and it is completely misleading to suggest it imminently will be. It is tech’s variation of the 80/20 rule: engineers often say it isn’t too difficult to get 80% of the way toward creating a new technology, but it is the last 20% that is difficult and possibly unattainable.

It is a subtlety the company’s CEO has done little to clarify, even suggesting back in January 2016 that someone could push a button and have their car drive itself across the country when he tweeted: “In ~2 years, summon should work anywhere connected by land & not blocked by borders, e.g. you’re in LA and the car is in NY.”

He’s been tweeting similar statements ever since.

A wise and responsible motorist would take these statements with a grain of salt and chalk them up to a bit of cheerleading from an idealistic and charismatic CEO. One rush hour trip down the 101 freeway should be enough to remind you that you are not surrounded by wise and responsible motorists. It is unfortunate that our roads and highways have irresponsible drivers, but that is the reality, and it falls to regulators and legislators to keep overblown interpretations of the technology in check.

Whether we like it or not, there are drivers who are not reading their car’s warnings and disclosure agreements, and who believe that 80% fully autonomous is “close enough.” If the recent stunt in Echo Park has taught us anything, it is that just because someone has the wealth and means to get behind the wheel of a Tesla does not mean they are responsible enough to drive it.

We likely will not be able to count on the California DMV to resolve this problem, as the department seemingly refuses to move forward on its own investigation of Tesla’s Autopilot feature.

Tesla also recently disbanded its media response department. Around the same time, on May 20, Musk tweeted out aspirations of forming an in-house litigation department for his company that would “directly initiate & execute lawsuits” and report directly to him.

Does that sound like the actions of a car company you would trust with your own life?

Absent some major action by either the NHTSA or state and federal legislators, future news will be littered with cases similar to Riad’s. I think autonomous vehicles are a revolutionary technology with the potential to transform transportation as we know it. But the consequences will be costly if we once again allow our laws to lag significantly behind the technology they are meant to govern.
