"Far Less Competent Than A Human": Consumer Reports Slams Tesla's Autopilot

At a time when the safety of Tesla's Autopilot driver-assistance technology is under growing scrutiny, following a former NHTSA head calling for a recall and several fatal accidents associated with Autopilot, the influential Consumer Reports has weighed in with its take on the company's Navigate feature, a lane-changing assistance feature included with the latest Autopilot update. What Consumer Reports found was that Tesla's automated software was "far less competent than a human driver".

Tesla's recent Autopilot update was supposed to allow certain cars to automatically change lanes in an attempt to make driving "more seamless". But Consumer Reports disputed that narrative, instead saying that the feature simply "doesn’t work very well and can create potential safety risks for drivers."

The feature was added last month as part of a promised upgrade to a package of driver assist features. To use it, the driver essentially gives the car permission to make its own lane changes and can cancel an automated lane change at any time by using the turn-signal stalk, braking, or holding the steering wheel in place.

The independent test found that the Navigate feature "lagged far behind a human driver’s skill set."

The report found that the feature cut off cars without leaving space and even passed other cars in ways that violate state laws. As a result, the driver often had to intervene and prevent the system from making poor decisions.

Jake Fisher, Consumer Reports’ senior director of auto testing, slammed the technology:

“The system’s role should be to help the driver, but the way this technology is deployed, it’s the other way around. It’s incredibly nearsighted. It doesn’t appear to react to brake lights or turn signals, it can’t anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it."

CR said that multiple testers reported that the Tesla "often changed lanes in ways that a safe human driver would not—cutting too closely in front of other cars, and passing on the right." The article also expressed concern about Tesla's claims that three rearward-facing cameras could detect fast-approaching objects.

Fisher continued: "The system has trouble responding to vehicles that approach quickly from behind. Because of this, the system will often cut off a vehicle that is going a much faster speed since it doesn’t seem to sense the oncoming car until it’s relatively close."

Fisher also said that merging is an issue for the software: “It is reluctant to merge in heavy traffic, but when it does, it often immediately applies the brakes to create space behind the follow car—this can be a rude surprise to the vehicle you cut off.”

"In essence, the system does the easy stuff, but the human needs to intervene when things get more complicated," Fisher continued.

From a legal standpoint, the testers also had concerns. Several testers saw Navigate initiate a pass on the right on a two-lane divided highway, a move that could result in a ticket for an "improper pass" in some states. Several times it also failed to return to the right-hand travel lane after making a pass, testers said.

Dorothy Glancy, a law professor at Santa Clara University School of Law in California who focuses on transportation and automation, stressed that automated driving software must be programmed to follow local laws.

She said: “One of the issues we lawyers are looking at is the obligation of autonomous vehicles to obey all traffic laws where the vehicle is being used. That can get tricky when there are variations from area to area, even within a state—for example, municipal speed limits.”

Shiv Patel, an automotive analyst at market research firm ABI Research, said he believes that Tesla's hardware is operating near full capacity: “I would say that Navigate on Autopilot would be pushing the upper limits of what could be achieved with the current computer hardware in the vehicle.”

The review then goes on to state the obvious: despite Tesla promising it will have full self-driving next year, its experience with Navigate suggests it will take longer.

Aside from the massive safety risks associated with Autopilot that we have been sounding the alarm on for years, could it also be that two more Elon Musk promises - self-driving next year and millions of robotaxis on the road - may (gasp) wind up not materializing? While we won't be surprised, Tesla investors who still believe in Musk very well may be.