News
Last fall, California joined Florida and Nevada in making it expressly legal to test self-driving cars on public roads, and it directed the Department of Motor Vehicles to draft regulations for consumer use of the cars by January 2015. Senate Bill 1298 takes us a step closer to a future in which drivers can read or nap while their cars navigate the road, although most agree that such advanced technologies are at least a decade away. We spoke about the significance of the new law with Stanford University's Bryant Walker Smith, a fellow with both the law school's Center for Internet and Society and the engineering school's Center for Automotive Research.
What is a self-driving, or autonomous, vehicle?
The legislation tries to draw a bright line between what the state deems autonomous technology - technology that can drive a vehicle without active physical control or monitoring by a human - and everything else. In practice, that line could be hard to draw.
What about current "automated" technology?
After looking at the vehicle codes in all 50 states, I concluded that a system that performs its own braking, steering, and accelerating is not necessarily illegal. However, the law will introduce complications for the operation and design of such a system. For example, what is "reasonable" - a word used frequently in motor vehicle codes - when applied to a computer?
What does all this mean for texting and driving?
California hasn't addressed that specific question, but Nevada has said that if a person is operating an autonomous vehicle that is driving autonomously, then they are allowed to text. But all other duties imposed on a driver continue to apply. The classic example is, if you send your car to park itself and then get drunk in a bar while it is doing that, you are committing drunk driving in Nevada.
Who is liable when one of these cars is in an accident?
When it comes to civil liability, the initial answer is that almost everyone is probably a potential defendant - from the operator to the companies that designed any subsystems or provided the important data or algorithms. The question is what roles insurance and data will play, and how doctrines like foreseeable misuse will evolve.
What are some of the other big legal issues?
Privacy and security both have important legal dimensions. For example, it will be important to ensure that humans are performing the roles they're supposed to in autonomous cars. Are they engaged in driving? There are technologies that could potentially monitor that performance. Also, [wirelessly] connected vehicles could communicate with each other about the environment and any dangers. But how do you trust the data that's being received, and how do you protect a system from hacking? More broadly, are all these decisions for governments, legislatures, or agencies? For consumers, for companies, or for judges and juries?
Kari Santos
Daily Journal Staff Writer