Tesla's self-driving test software under review after owner complaint

The car “gave an alert half way through the turn” and the driver tried to assume control “but the car by itself took control and forced itself into the incorrect lane,” the report says.

Updated - November 15, 2021 01:36 pm IST

Published - November 13, 2021 10:56 am IST

Tesla logo.

The U.S. National Highway Traffic Safety Administration (NHTSA) said it is looking into a consumer report that a Tesla Model Y was involved in an accident while using the company's Full Self-Driving (FSD) Beta software.


The owner of a 2021 Tesla Model Y reported to the auto safety agency that on Nov. 3 in Brea, California, the vehicle was in FSD Beta mode "and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane."

Also Read | U.S. asks 12 automakers for assistance in Tesla probe

The car "gave an alert half way through the turn" and the driver tried to assume control "but the car by itself took control and forced itself into the incorrect lane," the report says. The car was severely damaged on the driver side, the owner added.

"NHTSA is aware of the consumer complaint in question and is in communication with the manufacturer to gather additional information," an NHTSA spokesperson told Reuters on Friday.

Tesla did not immediately comment.

Earlier this month, Tesla recalled nearly 12,000 U.S. vehicles because of a communication error that could trigger a false collision warning or unexpected automatic emergency brake.

The recall followed a software update to vehicles with FSD Beta. Tesla said more than 99.8% of the recalled vehicles had installed a software update addressing the issue as of Oct. 29, and that no further action was necessary.

FSD is an advanced driver assistance system that handles some driving tasks but, Tesla says, does not make vehicles completely autonomous. The features "require a fully attentive driver," the company says.

Also Read | Musk’s fully autonomous Tesla car claim does not match reality

Last month, NHTSA raised concerns about how FSD was being used. "Despite Tesla's characterisation of FSD as 'beta,' it is capable of and is being used on public roads," NHTSA said.

NHTSA in August opened a formal safety probe into Tesla's Autopilot, a different driver assistance software system, after a dozen crashes involving Tesla models and emergency vehicles.
