Tesla Phantom Braking & Pedestrian Safety Concerns Continue


Tesla says it will begin robotaxi service in Austin, Texas, within the next few days. Maybe so. The start of the robotaxi service has already been delayed once. When it does begin, it will use a small fleet of Tesla Model Ys equipped with what the company calls its Full Self-Driving (Unsupervised) software. How that differs from the Full Self-Driving (Supervised) software currently available to Tesla owners is unclear. Tesla has gone to great lengths to shroud its robotaxi program in secrecy.

That alone is giving some people pause. The company claims the secrecy is to protect its proprietary business interests, but skeptics suspect it has more to do with keeping incidents in which the cars fail to perform as intended from becoming public knowledge. No one would step into an elevator unless they were sure it would ascend or descend without incident. But Tesla has routinely brushed aside concerns about safety, claiming its systems are safer than human drivers, and so if there are some unfortunate incidents along the way, well, that’s unfortunate, but you can’t make an omelet without breaking a few eggs.

Some groups are dedicated anti-Tesla organizations, among them the Dawn Project and Tesla Takedown. Recently, they conducted an experiment with a Model Y equipped with the latest version of Full Self-Driving (Supervised). The resulting video shows the Tesla approaching a stopped school bus with its flashing warning lights on. The Tesla ignores the bus and continues on.

Then a cardboard cutout about the size of a small child dashes out from behind a parked car, the way a young child might when running to catch a school bus. The Tesla brakes too late to avoid slamming into the cutout, stops, then continues on as if nothing happened. The experiment was repeated eight times, and the result was the same each time: the Tesla consistently failed to notice the stopped school bus and struck the child-sized cutout. The obvious question is, are these cars safe enough to be operated on public streets? Here’s the video. Judge for yourself.

Tesla Has A School Bus Problem

Now, we know this protocol was set up by those with an ax to grind. The Dawn Project was founded by Dan O’Dowd, who is the CEO of a company that offers competing automated driving system software. He has previously taken out ads warning that Full Self-Driving (Supervised) would fail to react appropriately when it encounters stopped school buses.

We also know that the Tesla robotaxis in Austin will be supervised by people in a remote command center, but we are all keenly aware of the horrific accident that took place in Arizona several years ago, when an Uber test vehicle struck and killed a pedestrian crossing a highway at night.

Elon may see such things as abstractions that detract from his primary obsession, which is self-driving cars. He posted on his personal anti-social media channel recently that the “first Tesla that drives itself from factory end of line all the way to a customer house is June 28.” Presumably, the customer lives fairly close to the factory.

InsideEVs tried to take a fairly balanced approach to the controversy created by the video. “For what it’s worth, the Model Y, which was traveling at around 20 mph, did apply the brakes and did come to a full stop. It just wasn’t enough to avoid hitting the kid-sized mannequins that were crossing the street. Some commenters on X pointed out that not even humans would have been able to stop in time. Others said that the car could be smart enough to differentiate between a dummy and a real child, with at least one person posting a video where a Tesla driving with FSD enabled stopped just fine when a child was about to cross the street.”

Is this a tempest in a teapot? Some may think so. Darting-child cases are a staple of law school personal injury discussions. It is unlikely even a human driver would react in time to avoid striking a child who darts into the street. But that still does not explain the failure to notice a stopped school bus, which is disturbingly similar to the tendency Teslas have shown to run into the back of stopped emergency vehicles.
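For perspective, here is a back-of-the-envelope stopping-distance calculation. The numbers are assumptions, not measurements from the video: a 1.5-second reaction time and 0.7 g of braking are common rule-of-thumb figures for an alert driver on dry pavement.

```python
# Rough stopping-distance check for the darting-child scenario.
# Assumed values (not from the video): 1.5 s reaction time, 0.7 g braking.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second
G = 9.81             # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph: float,
                        reaction_s: float = 1.5,
                        braking_g: float = 0.7) -> float:
    """Reaction distance plus kinematic braking distance v^2 / (2a)."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * braking_g * G)

if __name__ == "__main__":
    d = stopping_distance_m(20)
    print(f"~{d:.0f} m (~{d * 3.281:.0f} ft) needed to stop from 20 mph")
    # ~19 m (~63 ft): a child emerging any closer than that is likely
    # to be struck no matter who, or what, is driving.
```

Under those assumptions, anything emerging within about 19 meters of a car doing 20 mph gets hit, which is the commenters’ point. It says nothing, however, about ignoring the school bus in the first place.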

During a break for watercress sandwiches (with the crusts cut off, naturally), the gang at CleanTechnica world headquarters did question whether the software — which is 100% based on camera input — is sophisticated enough to distinguish between a cardboard cutout that looks like a child and an actual child. We also wonder if there are any parents who would be willing to volunteer their kids to be human guinea pigs to test that hypothesis.

The Tesla Phantom Braking Phenomenon

There is an ancillary issue involving Tesla, this one having more to do with its so-called Autopilot technology. Autopilot is not FSD. It is more of an advanced cruise control that adds lane centering and the ability to negotiate gentle turns on the highway. But it also incorporates something known as automatic emergency braking.

Many automakers have automatic emergency braking systems, which are required by safety regulators in many countries, but those systems typically operate only at relatively low speeds. Tesla’s emergency braking is operational at speeds up to 124 mph. Sometimes the system throws out the anchor for no apparent reason, a situation known as “phantom braking.” It doesn’t take a genius to understand that if the car in front of you slows suddenly, the chances of a rear-end collision are greatly increased, with potentially serious consequences.
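A toy calculation shows why. Everything in the sketch below is an assumption for illustration, not data from any Tesla incident: two cars at 70 mph, a two-second following gap, a 1.5-second reaction time for the trailing driver, and 0.8 g of braking for both cars.

```python
# Toy rear-end simulation (not Tesla's software). The lead car
# "phantom brakes" at full force; the follower reacts after a delay.
# All numbers are assumptions for illustration.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second
G = 9.81             # gravitational acceleration, m/s^2

def closest_approach_m(speed_mph: float = 70.0, gap_s: float = 2.0,
                       reaction_s: float = 1.5, braking_g: float = 0.8,
                       dt: float = 0.01) -> float:
    """Return the smallest bumper-to-bumper gap (negative = collision)."""
    v0 = speed_mph * MPH_TO_MS
    a = braking_g * G
    x_lead, x_follow = v0 * gap_s, 0.0   # initial spacing from the time gap
    v_lead, v_follow, t = v0, v0, 0.0
    min_gap = x_lead - x_follow
    while v_lead > 0 or v_follow > 0:
        v_lead = max(0.0, v_lead - a * dt)
        if t >= reaction_s:              # follower brakes only after reacting
            v_follow = max(0.0, v_follow - a * dt)
        x_lead += v_lead * dt
        x_follow += v_follow * dt
        min_gap = min(min_gap, x_lead - x_follow)
        t += dt
    return min_gap

if __name__ == "__main__":
    print(f"closest approach at a 2 s gap: {closest_approach_m():.1f} m")
    print(f"closest approach at a 1 s gap: {closest_approach_m(gap_s=1.0):.1f} m")
    # With these assumptions a 2-second gap shrinks from ~63 m to ~16 m;
    # at a 1-second gap the follower plows into the lead car.
```

Even a driver keeping the textbook two-second gap loses most of the cushion; anyone following closer has essentially no chance.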

It is none too pleasant for the driver and any passengers in the Tesla, either. Think of it as getting on an elevator on the first floor and pressing the button for the 19th floor. The doors shut and the elevator begins its ascent, when suddenly it reverses direction and plunges into the third sub-basement. Few would find that a pleasant experience.

According to Autoblog, a federal lawsuit in the US claims Tesla “knew about phantom braking complaints as early as 2015 but kept drivers in the dark—possibly to avoid recalls or regulatory scrutiny.” A central question in the litigation is whether Tesla promoted a feature it knew could fail when drivers needed it most.

In Australia, some 10,000 Tesla owners are also suing the company, claiming it misled them about the safety and reliability of the Autopilot system. Plaintiffs allege that even with hands on the wheel, their cars decelerate without cause, creating “rear-ender nightmares.” One driver recounted a “scary ride” after nearly being hit by a truck when his Tesla braked unexpectedly.

In 2022, the National Highway Traffic Safety Administration received 758 complaints about Tesla vehicles decelerating for no apparent reason (NHTSA ODI, 2022). Drivers describe their vehicles “slamming the brakes” unexpectedly.

Elon Musk is known for talking big and making grand claims, but he also hides behind a wall of secrecy, which many find disturbing. What is it you are trying to hide, Elon? Remember, he is the person who ordered Tesla employees to fake a video, featured prominently on the Tesla website for years, that purported to show a Tesla driving itself. He is also the person who blamed the cold-blooded shooting of four people in Minnesota on “far left” radicals.

I own a Tesla Model Y. It is a fine car but not a great car. I will not use Autopilot because supervising it requires more of my attention than simply driving the car myself. When I am driving across Florida on a two-lane highway with an 18-wheeler on my tail, I am uncomfortable knowing the car could brake suddenly. That concern far outweighs any lane centering or speed maintaining benefits. Would I step into a Tesla robotaxi and let it drive me anywhere? No, not unless I had taken Prozac first — or possibly a dose of ketamine.

Tesla’s effort to hide its robotaxi service under a cone of silence, and Musk’s use of his influence with the president to roll back federal reporting requirements for semi-autonomous cars, make me hesitant to be part of his beta testing protocols. Find someone else, Elon. I want as little to do with you as possible.

