Tesla Cybertruck with FSD hits a fake child without hesitation

An independent safety test by Dirty Tesla showed that a Tesla Cybertruck running Full Self-Driving (FSD) continued driving several times when a doll or small objects were placed in its path. The result is a wake-up call for anyone who assumes FSD will reliably recognize and avoid obstacles and people in time.

The tests were conducted in daylight on a dirt road. Objects such as a white bucket, a small box, and an exercise ball were placed in the Cybertruck’s path. You can guess the result: the truck, which weighs almost 3,000 kilograms, drove over these objects without stopping or even detecting them.

However, larger objects, such as a children’s bicycle, were detected thanks to the Autonomous Emergency Braking (AEB) system, which intervened to avoid a collision.

Doll of an eight-year-old child was not recognized

The real challenge, however, came with a doll resembling an eight-year-old child. In that case, the system displayed only a vague, ghostly image on the screen and failed to recognize the ‘child’. Even with its arms raised, the doll remained invisible to FSD.

In other tests, where the researcher stood in front of the car, FSD also responded unpredictably. In one case, the Cybertruck maneuvered just past him, while in another attempt the car stopped in time thanks to the built-in AEB system.

Tesla’s system is not fully autonomous

Although Tesla promotes FSD as an advanced driving system, it is officially classified as a Level 2 driver-assistance system. That means it is not fully autonomous and requires constant supervision by the driver. Despite these limitations, Tesla continues to promote FSD actively, for example with the Tesla Robotaxi.

This project, which Elon Musk unveiled at an event in Los Angeles, is intended to enable a self-driving taxi network. However, safety experts and scientists have long warned about FSD’s inconsistencies, especially given the dozens of fatalities and hundreds of crashes involving the system.

Other priorities

If a similar situation occurred at another car manufacturer, the outcry would be enormous. At Tesla, however, it seems to be dismissed as ‘collateral damage’. Musk appears to treat incidents as opportunities to learn and collect data rather than as reasons to guarantee safety. There is something to be said for that, but the question remains where experimentation ends and responsibility begins. Tesla could also conduct these tests more thoroughly and on its own initiative, rather than using customers as guinea pigs. Meanwhile, Tesla is exploring ways to ease regulations surrounding FSD through the Department of Government Efficiency (DOGE), a new initiative led by Musk.

Source: www.bright.nl