Driverless Cars: How Safe is Safe Enough?
The march toward completely driverless cars is progressing quickly. But the fundamental question everyone from the auto industry to the government to safety advocates is asking is: How safe is safe enough? Consumer Reports says automakers have been taking a wide range of approaches as they sail into these uncharted waters.
The All-Important Safety Issue
Tesla, for example, has been rolling out autonomous features in its cars incrementally, whenever the company decides they are road-ready. Then there is Waymo, formerly Google’s self-driving car project, which is abiding by a more conservative premise: that driverless cars should not be sold until no human intervention is required at all. Many other companies actively developing these technologies fall somewhere between these two approaches.
These companies – be they automakers or tech companies – face a number of ethical questions. Is it acceptable to use human drivers as test subjects? How do humans react when an autonomous system fails? The answer, of course, is that we would not and should not accept the idea of imperfect, untested machines endangering us on our roadways. While we tolerate human error because we accept that to err is human, we do not extend that same tolerance to robots. We expect them to be perfect, and we should.
The Tesla Crash: A Wakeup Call
A wake-up call in this regard was the fatal accident in Florida involving a Tesla operating in semi-autonomous mode in May 2016. An investigation conducted by the National Highway Traffic Safety Administration (NHTSA) found no safety defect. According to Tesla, neither the driver nor the Autopilot responded to the white tractor-trailer that was making a turn ahead. The lesson here is that drivers must be made aware of their vehicles’ limitations and capabilities. It wasn’t until after this fatal accident that Tesla began emphasizing that Autopilot is not a fully autonomous feature and that drivers should be ready to take over from it at a moment’s notice.
Danger of Mixed Messages
Consumer Reports has maintained since the deadly Tesla crash that automakers like Tesla are sending consumers mixed messages by rolling out these systems in a way that gives drivers a false sense of security. Motorists feel they can take their hands off the wheel, despite warnings to do the opposite.
The auto defect attorneys at Bisnar Chase believe it is unacceptable to use human beings as test subjects for this new technology. We welcome innovation, but it should not come at the expense of human lives. Automakers also need to do a much better job of telling consumers what their autonomous or semi-autonomous vehicles can and cannot do. It’s about time they made safety – not profits – their top priority.