Tesla’s Autopilot Feature Blamed in Lawsuit for Fatal Crash in Japan
Tesla has been sued by the family of a 44-year-old Japanese man who was killed when a Model X operating on Autopilot crashed into a group of people gathered on the side of an expressway near Tokyo. According to a Bloomberg news report, the driver of the Tesla had fallen asleep shortly before the crash. When another vehicle ahead of him changed lanes to avoid the group, the Model X, which was set on Autopilot, accelerated and ran into the group, states the lawsuit filed in federal court in San Jose. Tesla is headquartered in Palo Alto in Northern California.
Allegations of Negligence
The complaint, filed by the widow and daughter of the deceased victim, Yoshihiro Umeda, alleged that the fatal accident was the result of Tesla’s defective Autopilot system, including inadequate monitoring of whether the driver is alert and a lack of safeguards against unforeseen traffic situations. Tesla’s Autopilot system has been involved in other fatal crashes, such as a 2018 incident in Mountain View, California, when a Model X on Autopilot slammed into a concrete barrier.
Umeda’s family stated in the complaint that Tesla will likely portray this accident “as the sole result of a drowsy, inattentive driver in order to distract from the obvious shortcomings of its automated driver assistance technology.” The family sued Tesla for defective design, failing to warn customers, negligence and wrongful death. Umeda was with a group of motorcyclists who were standing behind a van at the far right side of the Tomei Expressway after an earlier traffic collision. According to the complaint, this is Tesla’s first Autopilot-related pedestrian death.
Issues with Tesla’s Autopilot
Problems with Tesla’s Autopilot feature are not new. Earlier this year, the National Transportation Safety Board (NTSB) said the lack of regulation of the Autopilot system was to blame for at least two fatal crashes involving Tesla vehicles in which the Autopilot feature was engaged.
The NTSB said the design of the Autopilot system was dangerous in at least one fatal accident because it allowed the Tesla driver to avoid paying attention. Tesla also failed to limit where the Autopilot feature can be used, allowing motorists to activate it even in areas for which it was not designed, the NTSB report stated.
Mixed Messaging and Safety
In this particular case in Japan, it appears that the driver felt comfortable enough to doze off and let Autopilot take over. Tesla’s messaging that drivers must stay alert even when the vehicle is on Autopilot has clearly not gotten through. Instead, drivers are lulled into a false sense of security by the belief that the Autopilot feature is sophisticated enough to handle traffic situations on its own.
Our auto defect attorneys have consistently maintained that driverless or semi-automated technology should not be made available until it is ready for real-world roadway conditions. In Umeda’s case, Autopilot simply did not do the job. He paid the price with his life, and a family lost a husband and a father.