Fatal Tesla Model 3 Crash Caused by Autopilot Feature
The National Transportation Safety Board (NTSB) has issued a preliminary report confirming that Tesla’s driver-assistance feature, known as Autopilot, was active when a fatal crash occurred March 1 between a Tesla Model 3 and a semi truck in Florida. The report, which included an image of the Tesla with its roof sheared off, also states that the driver’s hands were not detected on the steering wheel for the eight seconds before impact. The NTSB said it will continue the investigation and examine factors such as the driver’s actions and highway conditions before issuing a final determination of probable cause.
Tesla Fatal Crash and Autopilot Investigations
The fatal car-versus-truck accident occurred March 1 in Delray Beach. According to the report, the car’s Autopilot was engaged 10 seconds before the Model 3 crashed into a tractor-trailer crossing State Highway 441, about half an hour before sunrise. Neither the driver, 50-year-old Jeremy Banner, nor the Autopilot system slowed the car or attempted any evasive maneuver, investigators said. The car’s computer did not detect hands on the steering wheel for the eight seconds prior to impact. Banner was killed when his car became wedged under the trailer, which sheared off the Tesla’s roof.
In a 2016 crash involving a Tesla and a large truck, also in Florida, the NTSB determined that the truck driver had failed to yield to the oncoming vehicle and that Autopilot was “imperfect.” However, the board stopped short of deeming Autopilot dangerous. The NTSB is also investigating another fatal Autopilot-related Tesla crash, which occurred in California last year when the car struck a highway median.
Autopilot is Dangerous
It appears from the NTSB’s preliminary report that the driver engaged Autopilot and likely took his eyes off the road. Many Tesla drivers come to rely heavily on the Autopilot system. In this case, the car’s sensors apparently did not detect the tractor-trailer at all. Because the system never registered the truck, it would not have sounded a warning telling the driver to put his hands back on the wheel, leaving him with no idea of what was coming.
Many Tesla drivers clearly lack a proper understanding of the technology’s limitations. Instead, they are lulled into a false sense of security, believing that the Autopilot system works like a driverless car when, in reality, it does not. The name “Autopilot” is itself misleading. If it is determined that the Autopilot system is inadequate, defective, and dangerous, Tesla should be held accountable for the irreparable damage these semi-autonomous vehicles have caused. These incidents should also serve as a red alert to developers of driverless cars to test their technology and iron out its glitches before putting their vehicles on the market.