Like airplanes, Teslas are equipped with "black boxes" that log critical information in the seconds and minutes leading up to a crash. They record details like braking, airbag deployment, and other measurements.
While the black boxes in conventional vehicles can be read with widely available tools, cars with automated driving technologies can only have their data accessed by the manufacturer. This means that any potential investigation hinges on information that must be handed over by the very company being investigated.
And so Bloomberg's Dana Hull asks a good question: should we be worried that the only people with access to this information are the manufacturers themselves?
And we'll ask another question: isn't this even more true when, say, that company perhaps doesn't have the best track record for unbridled honesty?
Sam Abuelsamid, principal analyst for Navigant Research in Detroit, said: “We should not ever have to rely on the manufacturer to translate this sort of data, because they are potentially liable for product defects and they have an inherent conflict of interest.”
“As we deploy more vehicles with these partial automation systems, it should be mandatory to record information about the state of automation and the driver,” he continued.
After one 2016 crash in Florida, the NTSB came out and called for the Transportation Department to define what data should be collected.
In the 2016 crash report, the NTSB commented: “As more manufacturers deploy automated systems on their vehicles, to improve system safety, it will be necessary to develop detailed information about how the active safety systems performed during, and how drivers responded to, a crash sequence. Manufacturers, regulators, and crash investigators all need specific data in the event of a system malfunction or crash.”
Meanwhile, the NHTSA says it is reviewing a crash in California involving a Tesla that took place this weekend. They have not commented on whether or not Autopilot was engaged at the time of the accident.
Raul Arbelaez, vice president of the Insurance Institute for Highway Safety’s Vehicle Research Center, said: “The inability to readily access that data impedes the ability of researchers to understand how automated driver-aids are performing in the field, especially in less-severe crashes that account for the vast majority of traffic collisions.”
He continued: “How do people interact with these technologies? What conditions do they tend to work in? Do they work really poorly at night with snow and rain, or are they excellent under those conditions? I’m sure it is very useful for the auto manufacturers to help them improve their products down the line, but in terms of understanding how the current fleet is performing, we really don’t have access to that in these crashes that are happening very quickly without working with the manufacturers.”
In the case of the 2016 Florida crash, the Tesla didn't have an event data recorder that could be read by widely available tools. Instead, the company had to turn over the information that revealed the driver was using Autopilot.
The last rule governing these devices, issued in 2006, requires 15 parameters that the NTSB says “are inadequate to comprehend even the simplest questions of who/what controlled an automated vehicle at the time of a crash”.
The NTSB continues to investigate several crashes involving Teslas and Autopilot:
The NTSB is investigating other Tesla crashes that occurred while Autopilot was in use, including a March 2018 fatal collision in Mountain View, California. The NHTSA, meanwhile, has launched probes into 13 crashes that it believes may have occurred while drivers were using Autopilot, including a Dec. 7 crash in Connecticut in which a Tesla driver rear-ended a parked police cruiser.
The agency is taking interest in how the new technology is being used, the article claims. However, we can't help but believe the NTSB and the NHTSA have been dragging their feet on taking what we believe to be an extremely dangerous "feature" off the roads.
We have often asked how many more people need to wind up decapitated, like the 2016 driver whose Tesla drove itself under a semi-truck while trying to change lanes, before regulators act.