Tesla Recall Won’t Fix Autopilot Problems, Critics Say – CleanTechnica



Less than a week ago, Tesla announced it was recalling nearly 2 million cars sold in the US to address concerns about its Autopilot software. Now let’s be clear. While this is technically a recall, it will all be performed via an over-the-air (OTA) update. Elon Musk hates that OTA updates are classified as recalls, but rules are rules. In any event, the good news for Tesla drivers is that they won’t need to visit a Tesla service center and wait for the update to be completed, so maybe Elon has a point. The nub of the issue is that Tesla says:

“Basic Autopilot is a package that includes SAE Level 2 advanced driver assistance features, including Autosteer and Traffic-Aware Cruise Control (TACC), that drivers may choose to engage subject to certain defined operating limitations. Autosteer is an SAE Level 2 advanced driver-assistance feature that, in coordination with the TACC feature, can provide steering, braking and acceleration support to the driver subject to certain limited operating conditions.

“Autosteer is designed and intended for use on controlled-access highways when the feature is not operating in conjunction with the Autosteer on City Streets feature. When Autosteer is engaged, as with all SAE Level 2 advanced driver-assistance features and systems, the driver is the operator of the vehicle. As the vehicle operator, the driver is responsible for the vehicle’s movement with their hands on the steering wheel at all times, remaining attentive to surrounding road conditions, and intervening (e.g., steer, brake, accelerate or apply the stalk) as needed to maintain safe operation.

“When Autosteer is engaged, it uses several controls to monitor that the driver is engaged in continuous and sustained responsibility for the vehicle’s operation as required. If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage.”

Critics Say Autopilot Update Is Not Enough

The problem is that a Washington Post investigation found Autopilot can be activated in situations where Tesla says it shouldn’t be used. The OTA update will add more bells, whistles, flashing lights, and waving flags to warn drivers that they may be using Autopilot inappropriately, but critics point out that nothing has been done to actually prevent misuse of the system.

“What a missed opportunity,” Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies, told the Washington Post after details about the update were revealed. “I have yet to see Tesla, or anyone defending Tesla, come up with an argument for why we should be letting people use [Autopilot] on roads that could have cross traffic. That’s how a lot of these crashes are happening.”

“It’s far from sufficient,” said Senator Richard Blumenthal of Connecticut, who has been a frequent Tesla critic. He said regulators should have required more significant changes to the software, given its history of crashes. “Relying on self-enforcement is really problematic given the company’s statements about how seriously they take the whole recall system, the comments by Elon Musk. They regard recalls as more of entertainment than enforcement,” Blumenthal said. “When a car is going to hit an obstacle or another car or go off the road or hit a barrier, there ought to be more than just voluntary compliance.”

Did NHTSA Go Easy On Tesla?

Officials and lawmakers expressed concern that NHTSA may have been reluctant to come down harder on the automaker, which has enormous influence over the country’s transition to electric vehicles, a priority for the Biden administration. However, NHTSA said its investigation into Autopilot remains open, and some Tesla critics held out hope that the recall may not be the agency’s final action.

In a statement, NHTSA spokeswoman Veronica Morales said, “It is now Tesla’s responsibility under the law to provide a remedy, free of charge to consumers, that fully addresses the safety defect.” Last week, Tesla said it has a “moral obligation” to continue improving its safety systems and also said that it is “morally indefensible” to not make these features available to a wider set of consumers.

As part of the recall, Tesla agreed to issue a software update that contained new “controls and alerts,” such as “additional checks” when drivers are activating the features outside controlled-access highways. The update also will suspend drivers’ ability to use Autosteer if they repeatedly fail to stay engaged while using it, with eyes on the road and hands on the wheel.

The Operational Design Domain

Nowhere in the recall language, however, does the company say it will restrict the technology to its so-called Operational Design Domain (ODD), the industry term for the specific locations and set of circumstances for which Autopilot is designed. That means consumers will still be able to engage the feature outside the ODD and will simply experience more alerts and precautions when they do, the Washington Post notes. In a statement to The Post last week, NHTSA said it would be too complex and resource intensive to verify that systems such as Tesla Autopilot are used within the ODD. It also expressed doubt that doing so would fix the problem.

Tesla critic Dan O’Dowd, who has pushed for the company’s software to be banned through his advocacy group the Dawn Project, said the recall fell short. “The correct solution is to ban Tesla’s defective software, not to force people to watch it more closely,” he said in a statement. “NHTSA’s recall misses the point that Tesla must address and fix the underlying safety defects in its self-driving software to prevent further deaths.”

Jennifer Homendy, the chair of the National Transportation Safety Board, has been critical of the approach taken by regulators at NHTSA. She said she was pleased to see the agency take action, though it comes seven years after the first known Autopilot fatality. “They need to verify that the change being made is being made. And then with a voluntary recall, how do you verify?” NHTSA’s Morales said the agency will test several Teslas at a vehicle center in Ohio to “evaluate the adequacy of remedies.”

Gene Munster, a managing partner at Deepwater Asset Management, said he doesn’t expect this recall to deter Tesla from aggressively charging ahead on Musk’s vision of fully autonomous driving. “People will still use [Autopilot],” he said. “I don’t think that NHTSA has made the roads measurably safer by having these notifications, and I don’t think that Tesla is going to slow its pursuit because of this.”

The Takeaway

This is a hugely contentious issue. Our story last week got almost 200 comments, some of which proposed Elon Musk for sainthood while others suggested quite the opposite. Around the CleanTechnica smorgasbord, the consensus is that if the software knows it is not on a controlled-access highway, then it should not activate. Period. Full stop. We agree Tesla has a “moral obligation” to continue improving its safety systems. Tesla knows how to fix the issue. It should just get on with it.

