"Why Do They Keep Doing It?" Tesla On Autopilot Crashes Into Parked Laguna Beach Cop Car

Yet another Tesla on Autopilot has slammed into an inanimate object. 

One of the biggest controversies surrounding Tesla right now is the company's Autopilot feature, which is included with its vehicles and sold to the public as providing autonomous driving. Critics of the company have been quick to point out that at the Model 3 handover event, Elon Musk basically said that people could "sleep" while in their cars with Autopilot engaged - and as the toll of accidents involving Autopilot continues to mount, it is becoming more and more obvious that this isn't even close to being the case.

The latest incident came this morning, when a Model S sedan traveling with the Autopilot function engaged slammed into a parked Laguna Beach police car. The Laguna Beach Police Department tweeted the news along with photographs:

As The LA Times reported, Laguna Beach police Sgt. Jim Cota said, "thankfully there was not an officer at the time in the police car," adding that "the police car is totaled."

Cota also said that a year ago, in the same area, there was another collision in which a Tesla ran into a semi-truck.

"Why do these vehicles keep doing that?" Cota said.

"We're just lucky that people aren't getting injured."

Critics continue to point to Elon Musk's statements during the Model 3 handover event, where he basically told a crowd that Tesla vehicles are capable of driving themselves, allowing operators to "watch movies, talk to friends [and] go to sleep" while at the wheel.

"In the future, really - the future being now - the cars will be increasingly autonomous. So, you won't really need to look at an instrument panel all that often, you'll be able to do whatever you want. You'll be able to watch movies, talk to friends, go to sleep. Every Tesla being produced now, the Model 3, the Model S, the Model X, has all the hardware necessary for full autonomy."

How could this keep happening? Yesterday, video was posted showing what is allegedly a Tesla in Autopilot mode having a bout of "confusion" at a point where a highway divides at an exit. The video appears to show the car unable to determine which lane to stay in, nearly hitting the center median that separates the roadway from the exit lane.

Today's incident comes after a report we put out just yesterday noting that a driver's Model 3 European tour had been cut short after the vehicle, according to the driver, veered into a median while driving in Greece.

The accident involved driver You You Xue, who in early 2018 famously toured his Model 3 across North America after buying it, in order to show other reservation holders what it looked like. That road trip was documented in all of its glory on Electrek and presented as a feel-good story. You You felt so good that he decided to repeat the trip across Europe to continue to garner free publicity for the Model 3. This trip was less of a resounding success, ending with the car driving itself into a median in Greece.

On You You's Facebook page, he tells the story of the crash:

The road trip is over. I'm sorry.

Vehicle was engaged on Autopilot at 120 km/h. Car suddenly veered right without warning and crashed into the centre median (edit: divider at the exit fork). Both wheels (edit: one wheel) completely shattered, my door wouldn't even open correctly. I'm unharmed.

The driver also posted photos of the fork in the road where the accident took place.

That comes after another recent development just days ago, when an accident in Utah in which a Tesla rammed into a fire truck was also found to have occurred on Autopilot, with the vehicle accelerating before the crash.

Police findings from the recent crash near Salt Lake City were released, indicating not only that the car was in Autopilot mode when it crashed into a stopped firetruck, but also that it sped up in the seconds before impact.

The details of the police report were as follows:

A Tesla Model S that crashed into a parked firetruck on a Utah highway this month while in its Autopilot mode sped up prior to the accident, a police report says.

Data retrieved from the sedan shows that it picked up speed for 3.5 seconds shortly before the collision in South Jordan, according to the Associated Press. The acceleration from 55 mph to 60 mph suggests that the Tesla had been following a slower car that then moved out of the way, allowing the Tesla to resume the higher speed that the Autopilot system had been set at.
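
To make the reported sequence concrete, here is a minimal, hypothetical sketch (in Python) of how a traffic-aware cruise controller can end up accelerating toward a stopped vehicle: it matches a slower lead car, then resumes the driver-set speed once that car moves aside. The function, field names, and thresholds below are illustrative assumptions, not Tesla's actual implementation.

```python
# Illustrative sketch only (not Tesla's code): why a traffic-aware cruise
# controller can speed up toward a stopped vehicle. The controller matches
# a slower tracked lead car, and resumes the driver-set speed once that
# lead car moves away; a stationary object that was hidden behind the lead
# car may not yet be classified as a braking target.

SET_SPEED_MPH = 60.0   # cruise speed selected by the driver
FOLLOW_GAP_S = 2.0     # desired time gap behind a moving lead car (assumed)

def target_speed(lead_vehicle):
    """Speed the controller aims for, given the tracked lead vehicle.

    lead_vehicle: None if no moving lead vehicle is tracked, otherwise a
    dict with 'speed_mph' and 'gap_s' (time gap to the lead in seconds).
    """
    if lead_vehicle is None:
        # Nothing tracked ahead: resume the driver-set speed.
        return SET_SPEED_MPH
    if lead_vehicle["gap_s"] < FOLLOW_GAP_S:
        # Too close to a slower car: match its speed instead.
        return min(lead_vehicle["speed_mph"], SET_SPEED_MPH)
    return SET_SPEED_MPH

# Scenario described in the report: following a slower car at 55 mph...
print(target_speed({"speed_mph": 55.0, "gap_s": 1.8}))  # -> 55.0
# ...the slower car changes lanes, nothing is tracked ahead any more, and
# the controller commands the 60 mph set speed even though a stopped
# firetruck ahead has not been registered as a braking target.
print(target_speed(None))  # -> 60.0
```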

Tesla's Autopilot has been the subject of previous scrutiny following other crashes involving its vehicles. In March, a driver was killed when a Model X with Autopilot engaged hit a barrier while traveling at "freeway speed" in California. The National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board are also investigating that case.

Earlier this week, Tesla claimed its Autopilot was not engaged when a Model S veered off a road and plunged into a pond outside San Francisco, killing the driver. The NTSB has yet to confirm Tesla's version of events.

Earlier in May, the NTSB opened a probe into an accident in which a Model S caught fire after crashing into a wall at a high speed in Florida. Two 18-year-olds were trapped and died in the blaze. The agency has said it does not expect Autopilot to be a focus in that investigation.

As for the latest case, we have seen this playbook before: it probably won't be long before Tesla releases a statement blaming the driver.

Comments

J S Bach  Tue, 05/29/2018 - 17:35

In a selfish way, I'm glad this is happening.  The idea of "automated driving" is anathema to me.  What the modern-day spoiled-brat American Prince/Princess needs is more HANDS-ON activity... not less.  Even turning a light on and off is too much work for some people these days.

I'll drive my own car until I'm in my 90s with my fedora hat on, white-knuckled 10 o'clock-2 o'clock death grip on the steering wheel, tongue hanging out, squinting eyes under coke-bottle glasses, flipping off 🖕🏻 any young whipper snappers honking at my cautious 45 mph pace... before I trust any f***ing computer to carry my sorry ass anywhere. 🧟‍♂️🚙

In reply to by NugginFuts

DeadFred  Wed, 05/30/2018 - 00:49

It’s a bigger question than you think. Elon is Deep State friendly so the Clowns In America is probably not responsible. The clowns are at war with NSA and military intelligence so maybe they’re doing it. There are lots of third party candidates. It could be as simple as someone who is shorting Tesla stock.

In reply to by MasterPo

inhibi  Tue, 05/29/2018 - 18:09

Elon needs a wake up call. He needs to stop doing all of those drugs with that drugged out Gaga-wannabe girlfriend of his.

Seriously, he's dating an (ugly) crazy girl that made her debut by doing Molly for a week straight and recording unintelligible verses over shitty electronic music. She literally said those words, this isn't a theory.

And now she says she's "Off drugs because I have <friends> that have died <from it>". Right. Nothing like a musician (if you can call her that) saying they are off drugs (because all their friends are on them).

Elon should focus more on his failing businesses rather than his apparent teenage angst.

I swear, these new age billionaires get more drugged out and more immature as they get older. I doubt Elon has a single clue as to how the autopilot system even works.

In reply to by Team_Huli

Bud Dry  Tue, 05/29/2018 - 18:25

Tech is really fucking people up though...  A lady I know is almost twice my age so she grew up using maps to travel.  Well...  she is going on a trip and is freaking out because she can't get her maps app to talk to her, it just shows directions and that's not enough.  I even told her, we all used to have paper maps and no cell phones and we were fine.  She kind of accepted that but was still so worried.  If an EMP goes off I can definitely see a large amount of people die quickly.

In reply to by BurningFuld

Duc888  Tue, 05/29/2018 - 18:42

This is by design.  Once they arrive at an "acceptable" level of fatalities (same as airlines do, with scheduled maintenance timed/based upon number of fatalities per flying mile), YOU, the individual driver, will be monetarily penalized through your insurance company for wanting to pilot your own vehicle. You'll be the risk, see?

Pretty slick.

In reply to by J S Bach

Kylescotch  Tue, 05/29/2018 - 18:35

With the number of emergency vehicles being hit, I'm surprised the police and fire unions aren't taking action.  The inability of a Tesla to react properly to an emergency vehicle is a serious problem, and will get worse in proportion to the number of Teslas on the road.  If Teslas were more common, they could easily become the #1 killer of public safety workers.  And every time it hits a fire truck, it will cost a city tens to hundreds of thousands of dollars.  So Fire Chiefs are gonna blow a gasket sooner or later.

In reply to by Chupacabra-322