U.S. identifies 12th Tesla driver-assistance crash involving emergency vehicle


U.S. auto safety regulators said on Wednesday they had identified a 12th crash involving an emergency vehicle and a Tesla Inc vehicle that was using advanced driver assistance systems.


The National Highway Traffic Safety Administration (NHTSA) on Aug. 16 said it had opened a formal safety probe into Tesla driver assistance system Autopilot after 11 crashes.

The 12th occurred in Orlando on Saturday, NHTSA said. The agency sent Tesla an 11-page letter with questions, dated Tuesday, as part of its investigation.

Tesla's Autopilot handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods. Tesla says Autopilot enables vehicles to steer, accelerate and brake automatically within their lane.


Tesla did not immediately respond to a request seeking comment.

On Saturday, the Florida Highway Patrol said a Florida trooper who had stopped to assist a disabled motorist on a major highway was struck by a Tesla.

The Florida Highway Patrol said a Tesla in Autopilot mode struck the patrol car. "Trooper was outside of car and extremely lucky to have not been struck," the agency said in a tweet.


NHTSA said earlier it had reports of 17 injuries and one death in the 11 crashes, including the December 2019 crash of a Tesla Model 3 that left a passenger dead after the vehicle collided with a parked fire truck in Indiana.

Tesla in July introduced an option for some customers to subscribe to its advanced driver assistance software, dubbed "Full Self-Driving capability." Tesla says the current features "do not make the vehicle autonomous."


Among the questions NHTSA wants Tesla to answer is the "date and mileage at which the 'Full Self Driving' (FSD) option was enabled" for all vehicles along with all consumer complaints, field reports, crash reports and lawsuits.

NHTSA also wants Tesla to explain its "methods and technologies used to prevent subject system usage outside" the operational design domain.

NHTSA also asked Tesla to explain "testing and validation required prior to the release of the subject system or an in-field update to the subject system, including hardware and software components of such systems."


NHTSA also asked Tesla to disclose any modifications or changes that "may be incorporated into vehicle production or pushed to subject vehicles in the field within the next 120 days."

Tesla must respond to NHTSA's questions by Oct. 22, it said.


Oct 20, 2021
