- A new video experiment shows Tesla’s FSD software failing to stop in time for a child crossing the street.
- The latest version of Full Self-Driving (Supervised) was tested in a new Model Y crossover.
- The experiment was carried out by The Dawn Project, an organization that’s openly against Tesla’s FSD.
The videos embedded below show a brand-new Tesla Model Y driving with the automaker's so-called Full Self-Driving (Supervised) advanced driver assistance system engaged. The car passes a stopped school bus with its stop sign deployed and strikes a child-sized dummy crossing the street.
The test was repeated eight times, and every time, the EV failed to stop in time. The videos posted on social media show that the driver was not touching the pedals and that the driver assistance system did not disengage. In fact, after running over one of the mannequins, the car continued on its set route without driver input.
The test was meant to show that Tesla's FSD software, the same system that will underpin the company's robotaxi efforts in Austin, Texas, is incapable of providing a basic safety net, especially around children. It's an emotional demonstration, but it's also a controversial one.
The videos were posted by The Dawn Project, an organization backed by Dan O'Dowd, who is known among Tesla fans for his relentless campaign to discredit FSD. O'Dowd is the CEO of Green Hills Software, a company that supplies software to automakers and other clients. Two years ago, The Dawn Project even bought a Super Bowl ad attacking Tesla's software.
For what it's worth, the Model Y, which was traveling at around 20 miles per hour, did apply the brakes and come to a full stop. It just didn't brake soon enough to avoid hitting the child-sized mannequins crossing the street.
Some commenters on X pointed out that even a human driver might not have been able to stop in time. Others suggested that the car may be smart enough to differentiate between a dummy and a real child, with at least one person posting a video in which a Tesla running FSD stopped just fine when a child was about to cross the street.
The National Highway Traffic Safety Administration (NHTSA) has opened several investigations into Tesla’s driver assistance systems. One of them is looking into 2.4 million Teslas equipped with FSD after four reported collisions, including one that was fatal.
On June 22, Tesla will supposedly launch its Robotaxi service at a reduced scale in Austin, Texas. According to the company’s CEO, Elon Musk, Tesla is being “super paranoid about safety,” which is why the launch date could be postponed. Per Musk, the self-driving software powering the driverless Model Ys in Austin is the same as in any new Model Y on the street, which means the owners of these cars could potentially benefit from the same features as the robotaxis.
That has not happened yet, however, and it's also worth noting that Tesla's ride-hailing service relies on several teleoperators who monitor the vehicles remotely and can step in if something goes wrong. Tesla owners don't have this luxury and likely never will.