The new driverless parking feature has already caused dozens of accidents

Elon Musk wants to turn Tesla into a self-driving car company, which he considers the business of the future. But its autonomous driving technology remains under scrutiny from the NHTSA (the National Highway Traffic Safety Administration). The safety agency has opened yet another investigation into this system, this time over the Smart Summon feature, which lets owners summon the car so it can leave its parking spot.

In total, more than 2.6 million Teslas fall under the scope of this new investigation. Its findings will determine whether another recall is needed to update this feature of Autopilot FSD. Tesla already recalled more than 2 million of its cars a year ago as a proactive response to a previous NHTSA investigation.

All Model 3 and Model Y cars equipped with Autopilot FSD are under investigation

In September, Tesla launched Actually Smart Summon, the definitive version of Smart Summon, which had been available in beta since 2019 on Tesla cars equipped with Autopilot FSD. As usual, Tesla used its owners as beta testers, and even in its embryonic phase the feature drew plenty of criticism and complaints. “The car was wandering like a drunk driver,” one owner commented at the time.

Smart Summon lets owners call the car from up to 65 meters away via the Tesla app (a range that has been extended in the definitive version). The brand states that it should only be activated in safe environments, such as parking lots and private driveways, and not on public roads. That is, however, merely a recommendation.

Tesla Model Y with Smart Summon activated

Determining whether Smart Summon is too permissive. The NHTSA details that users have reported multiple accidents with both the definitive Actually Smart Summon and the beta Smart Summon: with the system activated, cars collided with posts or other vehicles they failed to detect, or their owners could not stop them in time.

According to the agency, users explained that they had very little reaction time, either to stop the car from the app or because they did not have an adequate line of sight. In several of the cases, the cars that were hit had an occupant inside. The most recent incident was recorded a few days ago: a Tesla Model Y suddenly turned in a parking lot in Maryland and ended up crashing into a parked car with a woman inside. “There was no one in the car, no one driving, no passengers; it was an empty car that hit my car. And it kept going after the impact.”

The models the NHTSA is investigating are all Tesla Model 3 and Model Y cars sold in the US to date with Autopilot FSD (Model 3, 2017-2025, and Model Y, 2020-2025), as well as the 2016-2025 Model S and Model X. The total is estimated at about 2,585,000 cars.

In this analysis, the NHTSA will evaluate the maximum speed at which Actually Smart Summon operates, as well as how the app works: it will run tests at different distances and lines of sight, examine braking distances, assess the system's ability to respond to unforeseen events, and check whether it can be activated on the road or in environments it was not designed for, among other things.

Given that the feature itself imposes no hard limits, with Tesla leaving it to the user's good judgment by merely recommending where and how it should be used, it is very likely that the NHTSA will end up requiring an update to this function. The agency's previous investigation into Autopilot FSD, which opened preliminarily in May 2024 and became effective in October, sought precisely to determine whether the technology was too permissive with driver inattention, even after the update in which Tesla had already added limitations.

Tesla Cybercab

Tesla, autonomous driving and its battle with the NHTSA. According to The Washington Post, the NHTSA has indicated that Tesla has not properly reported the Smart Summon incidents to the safety agency, even though carmakers are required to do so by law. Since 2021, an NHTSA order has been in effect requiring manufacturers to report any incident that occurs with semi-autonomous driving systems activated.

The aim of this regulation is to bring greater transparency to the behavior of these advanced technologies which, although intended to improve safety, have also been involved in serious accidents, some of them fatal. Tesla has announced that Smart Summon is now also available in Europe, although limited to a range of 6 meters.

The Californian company is the brand this regulation is causing the most headaches for. That is because its cars report incidents in real time with Autopilot activated, unlike those of other companies. Added to this, all of its cars carry this technology in a more or less advanced form, so there are more of them on the road than those of other manufacturers. According to the NHTSA, Tesla Autopilot contributed to at least 467 crashes, 13 of them resulting in fatalities and “many others” in serious injuries.

Donald Trump has overturning this regulation on the table, which would benefit Tesla. And not only to avoid recalls and technical adjustments to Autopilot FSD and its functions, but also to pave the way for its robotaxis, which it intends to begin operating in 2026.

Under current regulations, its Tesla Cybercabs, which lack pedals and a steering wheel, would need NHTSA authorization to provide service. And the agency has never given the green light to this type of driverless car for passenger transport.

Source: www.motorpasion.com