Ford's Self-Driving Test Car Severely Damaged In Crash

It’s never good news when an autonomous automobile is involved in an accident. Recently, an autonomous shuttle was involved in a crash in Las Vegas, and Uber managed to flip a self-driving Volvo in Arizona.

In the latest installment of autonomous car accidents across America, a self-driving test car from the Ford-backed startup Argo AI was severely damaged Wednesday in a collision that sent two people to the hospital.

According to the Pittsburgh Post-Gazette, a box truck ran a red light about 10 a.m. at the intersection of 16th and Progress streets in Pittsburgh’s North Side and smashed into an Argo AI self-driving car with four people inside. Two of the four passengers in the Argo AI car were injured and taken to the hospital in stable condition.

Alan Hall, a communications manager for Ford who handles public relations on behalf of Argo AI, stated: “We’re aware that an Argo AI test vehicle was involved in an accident. We’re gathering all the information. Our initial focus is on making sure that everyone involved is safe.”

Hall offered limited information on whether the car was in self-driving mode during the accident, or whether the Argo AI fleet has been suspended.

This is the second autonomous car crash in Pittsburgh in recent months. In September, Uber grounded its fleet of self-driving cars for half a day after one of its autonomous cars crashed. After an investigation, the company determined the car’s autonomous systems were not at fault in the accident.

In early 2017, Ford invested $1 billion in Argo AI, an artificial intelligence startup Ford has tasked with building the brains of the company’s next generation of self-driving vehicles. The startup anticipates deploying a fully driverless car, without a steering wheel or pedals, by 2021.

Back in November, we noted that just because it’s legal to test autonomous cars on public streets doesn’t necessarily mean they’ve been optimized for safety.

Waymo published a report for California’s Department of Motor Vehicles on how frequently its driverless cars “disengaged” because of a system failure or safety risk, forcing a human driver to take over. In the report, Waymo said this happened once every 5,000 miles driven in 2016, compared with once every 1,250 miles in 2015. While that’s certainly an improvement, these types of incidents are hardly rare.



Shhh Whoa Dammit Sat, 01/13/2018 - 21:35 Permalink

No evidence of malfunction of the AI vehicle. A box truck is huge compared to a person or even a large car. The box truck was a large object easily recognized by sensors, but it ran the light. The software should recognize that the box truck was not stopping, but the sedan could not accelerate out of the way. Fact is, the safety record of AI vehicles will exceed regular vehicles. AI vehicles will be sold with their own liability and collision insurance. NOTE to Ford: the AI vehicle should be white, not dark, since light colors are recognized more easily and have a lower accident rate.

In reply to by Whoa Dammit

RAT005 Shhh Sat, 01/13/2018 - 21:50 Permalink

I would love for my car to drive me everywhere, but it's not going to happen anytime soon.  The problem is how to get past the other cars that are driving with other (human) algorithms.  AI can't be safe when subjected to human chaos, and until AI cars are safe they can't be the vast majority of vehicles.  The unpredictability of humans puts a glass ceiling on self-driving cars.  Must drive the development teams crazy :-(

In reply to by Shhh

Oliver Klozoff IH8OBAMA Sat, 01/13/2018 - 23:15 Permalink


I'm concerned that this is the kind of incident that the autonomutts won't mind broadcasting. AI good, human bad.

Since the courts have determined driving to be a "privilege", it is not so great a leap to see a petition in the not too distant future to no longer extend that privilege to humans as they are unfit to drive.


In reply to by IH8OBAMA

Nobody For President IH8OBAMA Sun, 01/14/2018 - 05:50 Permalink

Not necessarily - there have been plenty of accidents when someone runs a red light while semi-hidden behind a car waiting to turn in the intersection, and you start through on the green and are totally blindsided. The truck may have been speeding - we don't know any of the particulars of this accident, and it is presumptuous to assume a human driver would have avoided it. I worked ambulance for three years, and a lot of dumb shit happens at intersections...and in my day it was all human.

In reply to by IH8OBAMA

NoDebt Shhh Sat, 01/13/2018 - 21:52 Permalink

"a box truck ran a red light about 10 a.m. at the 16th and Progress streets in Pittsburgh’s North Side and smashed into an Argo AI self-driving car with four people inside."

You are being fed a line of shit.  Articles like this will be the justification for why self-driving cars will be MANDATED, not just ALLOWED.  Can't leave anything in the hands of fallible humans.  Only computers can be trusted.  And guess who programs those computers.

When it starts, and it will soon, the proper questions to ask are:

1.  Why should I pay for liability insurance if I'm not driving the damned thing?  Bill Ford.  They were driving the fucking thing.

2.  Since I'm not driving I can be drunk or high out of my mind, right?  No such thing as DUI when there's no D.


In reply to by Shhh

PhilofOz Shhh Sat, 01/13/2018 - 22:03 Permalink

Why are half the cars on the road manufactured and sold in bitumen (asphalt) grey? Probably because manufacturers are fully aware that vehicles this colour are harder to see on the road, especially at dawn/dusk or in wet weather, so a lot more accidents occur with them. It's bloody good for sales! Not so for the road toll!

In reply to by Shhh

slyder wood Shhh Sat, 01/13/2018 - 23:42 Permalink

Humans have been driving cars for ~100yrs. Manufacturers have sold flawed cars for the same amount of time. Current computer controlled vehicles are plagued by driveability problems. Just look at the number of TSBs for any given make and model. PCs have been around 30yrs, also flawed. Remember when Jobs said buying a computer was gonna be like buying a toaster, IOW, an appliance/black box. Siri, and other “ai” apps are not yet the bridge of the Enterprise. I don’t know how many promises SW people made that didn’t pan out. But then, they could always blame HW.

In reply to by Shhh

OverTheHedge slyder wood Sun, 01/14/2018 - 08:35 Permalink

I do remember Bill Gates saying, 10 or 15 years ago, that soon no one would need to own a computer, as all the computing would be done in the cloud - he was right, as everyone now uses phones instead.

Edge computing is where it's at - no bandwidth!! So we have to get our computing power back out to where it's needed, and keep the cloud for managing just the long-term data. Put another way - cloud computing was a massive marketing con that allowed the mega-IT-corps to suck up ALL the commercial and private data, but not provide much in the way of a service. Get that PC back under your desk - you're going to need it.

In reply to by slyder wood

fingulas Shhh Sun, 01/14/2018 - 07:40 Permalink

AI vehicles don't have to be perfect, or even near so.  All an AI vehicle has to do is be marginally better than the average driver and the math wins out in its favor.  If an AI vehicle has 50% or fewer accidents than the average driver, the math dominates the equation and AI will rapidly replace human drivers; cost reduction alone will drive businesses into the AI model.


Face it folks, AI is going to be driving our cars and trucks...because most humans suck at it.

In reply to by Shhh

OverTheHedge fingulas Sun, 01/14/2018 - 08:41 Permalink

I am a truly excellent driver, but all the other drivers are mad idiots.

Unfortunately, all the mad idiots will also claim to be truly excellent drivers. Does this mean that I am NOT an excellent driver?

If people were any good at driving, this wouldn't happen.

(I actually know that I am not a particularly good driver - I've been doing it so long, I get bored, look at the scenery, don't pay attention. I drive accordingly: - s-l-o-w-l-y-)

In reply to by fingulas