Self-driving cars will undoubtedly become a widely accepted new technology in the near future. Proponents have stated that with self-driving cars on the road, “there’s good reason to believe that tens of millions of traffic fatalities will be prevented around the world.”
But, as with any new technology, mistakes can happen. New computers get viruses; pharmaceutical drugs are recalled. We’ve seen this time and again, and these mistakes can be dangerous. We learned just how dangerous self-driving cars can be this May, when Joshua Brown, a former Navy SEAL, lost his life after his Tesla, operating on Autopilot, collided with a tractor trailer in Williston, Florida. Tesla’s public statement on this tragic accident acknowledged that the “car’s autopilot feature failed to notice the white side of a tractor trailer against a brightly lit sky.”
The huge automakers rushing to bring this new hands-free driving technology to market are working to promote regulations favorable to them. One route is “federal preemption”: the invalidation of a state law that conflicts with a federal law. With federal preemption on their side, these automakers would have a federal defense that “would forever trump the right of any individual citizen involved in a self-driving vehicle crash from seeking relief in the courts.”
Should this come to pass, essentially any individual whose autopilot causes an accident would be the only one to blame, no matter the circumstances. Over the past several years, the National Highway Traffic Safety Administration (NHTSA) has been developing guidelines and model legislation for self-driving automobiles. The catch: NHTSA has consulted only with automakers and software developers on these guidelines and legislation, NOT with the general public or consumer safety advocates. To us, that’s simply reckless. Those who profit from these self-driving cars are being given the chance to protect themselves, while those injured by their failures are not.
If these self-driving cars make mistakes and cause serious accidents, the people injured would have nowhere to turn to recover the financial losses from their injuries. They could lose everything they’ve worked for in life, and more. Even if the manufacturer is at fault, there would be nothing the injured could do. Not even families who lose loved ones would be able to fight these giant automakers.
And this scenario could snowball; it could even prevent recalls from taking place. Every automobile crash is different. Determining whether the driver of an autonomous vehicle made a mistake during a crash, or whether the vehicle’s autopilot was in fact defective and at fault, would normally be a job for a civil jury. But this type of analysis won’t be possible if automakers obtain the protection of federal preemption, which would bar individuals from their right to trial by jury. There could easily be a scenario in which the autopilot across an entire model of autonomous vehicles is defective, and the millions of people driving those cars would have no idea. To be frank, this would essentially eliminate the incentive for automakers to ensure they produce the safest possible cars.
This is why the public needs to get involved in the decision-making about self-driving cars, now. Advances in technology should always raise ethical questions, and perhaps even prompt change. The public has the right to know how their world is changing and what it means for their lives. In this case, we believe the right of individuals to hold automakers accountable for their cars is a must.