Caught On Video: How A Tesla Auto-Pilot Almost Killed Its Occupants

Authored by Fred Lambert via Electrek.co,

Tesla and the U.S. National Transportation Safety Board (NTSB) are both investigating last month's fatal accident involving a Model X on Autopilot in Mountain View, but now another Tesla owner has conducted his own little investigation into the accident by recreating a similar scenario on Autopilot, and he nearly crashed on video in the process.

After reviewing the data logs of the vehicle last week, Tesla confirmed that the Model X was on Autopilot and explained the last moments before the impact:

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum.

The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Now another Tesla owner has tried to film his Model S following the same lane change scenario on Autopilot in an almost identical section of road in Chicago, and it might show exactly what happened during the accident:

Electrek’s Take

Let’s start with an obvious disclaimer: we don’t recommend that people film themselves driving on Autopilot while holding a camera.

It slows down your ability to take control when needed, which can be dangerous, as seen in the video. In this case, however, it interestingly illustrates exactly what might have happened in the moments before the tragic fatal crash.

We can see the driver ignoring an alert to ‘hold the steering wheel’ sent out a few seconds before the barrier just like Tesla said in its report based on the logs – though that was likely a time-based alert.

Then it seems like Autopilot’s Autosteer stayed locked on the left lane line even as that line became the right line of the ramp. The system most likely got confused because that line was more clearly marked than the actual left line of the lane.

That led the car directly toward the barrier, and it’s easy to see how a driver who was not paying attention wouldn’t have been able to react in time, since the driver who recreated the scenario was himself barely able to apply the brakes in time.

It’s a tragic event that serves as a reminder that Autopilot is still not perfect and that drivers need to pay attention at all times and be ready to take over.

Comments

Dickweed Wang BurningFuld Mon, 04/02/2018 - 14:16 Permalink

From recent reports there were over 80,000 hours driven using Tesla's auto-pilot on 101 in CA where the recent crash occurred with no prior incidents.  That seems to prove that certain people should not be trusted using high tech equipment when they're responsible for ensuring their own, and other people's, safety.  That kind of thing should be treated just like in the aviation industry, where any pilots found to have failed to continuously monitor the auto-pilot and flight conditions at all times during a flight would lose their pilot's license.

In reply to by BurningFuld

Hans-Zandvliet Occident Mortal Mon, 04/02/2018 - 18:02 Permalink

Exactly.

Reminds me of the difference between how Russians and Americans solved the problem of developing a pen that functions in space (absent gravity to let the ink flow out of the pen point), back in the days of the Space Race.

NASA spent several millions to develop such a space pen. Russians just used a pencil :-D

Same goes, by the way, for the presently reignited arms race. With a military budget of one tenth of the USA's, Russia out-performs America by just being sensible about it.

In reply to by Occident Mortal

ldd Hans-Zandvliet Mon, 04/02/2018 - 20:36 Permalink

some of those pens require abilities many countries do not possess. those fine ball point pens require such precision that until recently korea imported the components from japan, until korea managed to produce them on its own.

the most advanced tech will be disabled by the weakest link. simple and reliable is the way to go. probably why russia still uses low tech in some applications. and considering the attention span of people today...

In reply to by Hans-Zandvliet

IH8OBAMA whatswhat1@yahoo.com Mon, 04/02/2018 - 12:25 Permalink

So, it appears that the Tesla self driving software does not stop for or recognize obstacles in its path.  I guess it would have run over that girl with the bicycle that the Uber car killed, too.

If you see an Uber or Tesla on the road, get in front of it and slam on the brakes and sue (I brake for dogs and squirrels crossing).  he he he he he  These cars shouldn't be allowed on the road until they are proven safe on a confined course.


In reply to by whatswhat1@yahoo.com

IH8OBAMA Theta_Burn Mon, 04/02/2018 - 12:52 Permalink

The cars aren't fine.  And, the monitors sitting behind the wheel can't be expected to be paying attention for hours when they aren't involved in the actual driving.  That would have to be the most boring job in the world up until you hit the un-moveable bridge abutment.

And those semi tractor-trailers are the scariest things on the road.  I don't like being near a semi on the road as it is.  You're right about the big accident coming with one of those.


In reply to by Theta_Burn

IH8OBAMA J S Bach Mon, 04/02/2018 - 15:05 Permalink

Wait until they begin trying total auto-pilots on commercial airlines and fire all of the real pilots.

OVER THE SPEAKER: "Folks, this is your Auto-Pilot.  Due to a strong headwind, we are almost out of fuel but I calculate that we can just make it to our destination without diverting.  Please stay in your seats with the seatbelt buckled just in case anyway.  I have locked the bomb-proof cockpit door during this period of uncertainty to avoid distraction from crew and passengers."

In reply to by J S Bach

tedstr homiegot Mon, 04/02/2018 - 12:25 Permalink

There are probably 100s of thousands of "give a shit" actions performed by people every day that are not part of their job and that they get no extra pay for.  Once society starts to break down and the social equilibrium breaks, the give a s h it work stops and all kinds of things break and people die.  It's called socialism

In reply to by homiegot

hannah JoeTurner Mon, 04/02/2018 - 13:37 Permalink

you cant 'write' AI code.....AI would mean that the computer program learns and adapts on its own like the brain of a child. code doesnt work like that. this tesla AI code is just millions of if/then/else statements which means they would have to code every possibility that exists in the real world. the code would be trillions of lines.


computers cant think yet so this shit isnt possible....

In reply to by JoeTurner

dgc0101 JoeTurner Mon, 04/02/2018 - 14:11 Permalink

Perhaps, but I think that the real message here is that this is a problem with the iPhone. 


Ultimately, we will need better apps to help deal with Tesla's software problem. Unfortunately, NSW Trauma (https://itunes.apple.com/us/app/nsw-trauma/id1030612905?mt=8) just won't cut it if you're unconscious due to massive hemorrhagic shock arising from craniocerebral or other trauma.


So, the real question will be how quickly can machine learning algorithms learn that trauma is not a good thing for car passengers OR how to tactfully notify the next of kin. It's just not clear at this time.

In reply to by JoeTurner

AGuy JoeTurner Mon, 04/02/2018 - 17:34 Permalink

"Does Tesla use H1-B labor to write its software ?"

FWIW: Autopilot systems use deep learning neural nets because it's not possible to program a car autopilot using traditional software development methods: it's just too complex. The neural net is trained using real world conditions.

However, the size of the neural net is limited and doesn't really have the capacity to handle the real world. At current hardware capabilities, you would need the equivalent of a large datacenter inside the car. Perhaps in 5 to 10 years new chips will be developed that have enough capacity to accommodate most real world conditions.

In reply to by JoeTurner

whatswhat1@yahoo.com bunkers Mon, 04/02/2018 - 11:56 Permalink

The video looks very Chicago-ish, where there are way too many idiotic road hazards created by someone's payrolled, dumb brother-in-law.  One of my best friends died when he crashed, head on, into a concrete abutment.  The road was horribly designed in a way that created an optical illusion.  He was one of many who died or were injured at that location.  Although we filed complaints about the trap, it's still there, after decades of carnage.

In reply to by bunkers