Tesla's self-driving test software under review after owner complaint

Updated - November 15, 2021 01:36 pm IST

Published - November 13, 2021 10:56 am IST

The car “gave an alert half way through the turn” and the driver tried to assume control “but the car by itself took control and forced itself into the incorrect lane,” the report says.

Tesla logo.

The U.S. National Highway Traffic Safety Administration (NHTSA) said it is looking into a consumer report that a Tesla Model Y was involved in an accident while using the company's Full Self-Driving (FSD) Beta software.

The owner of a 2021 Tesla Model Y reported to the auto safety agency that on Nov. 3 in Brea, California, the vehicle was in FSD Beta mode "and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane."

Also Read | U.S. asks 12 automakers for assistance in Tesla probe

The car "gave an alert half way through the turn" and the driver tried to assume control "but the car by itself took control and forced itself into the incorrect lane," the report says. The car was severely damaged on the driver side, the owner added.

"NHTSA is aware of the consumer complaint in question and is in communication with the manufacturer to gather additional information," an NHTSA spokesperson told

ADVERTISEMENT

Reuters on Friday.

Tesla did not immediately comment.

Earlier this month, Tesla recalled nearly 12,000 U.S. vehicles because of a communication error that could trigger a false collision warning or unexpected automatic emergency braking.

The recall followed a software update to vehicles with FSD Beta. Tesla said that, as of Oct. 29, more than 99.8% of the recalled vehicles had installed a software update to address the issue and that no further action was necessary.

FSD is an advanced driver assistance system that handles some driving tasks, but Tesla says it does not make vehicles completely autonomous. The features "require a fully attentive driver," it says.

Also Read | Musk’s fully autonomous Tesla car claim does not match reality

Last month, NHTSA raised concerns about how FSD was being used. "Despite Tesla's characterisation of FSD as 'beta,' it is capable of and is being used on public roads," NHTSA said.

NHTSA in August opened a formal safety probe into Tesla's Autopilot, a different driver assistance software system, after a dozen crashes involving Tesla models and emergency vehicles.
