Tesla Accident: US Identifies 12th Autopilot Crash Involving An Emergency Vehicle
If Tesla does not fully answer NHTSA’s questions, it could face civil penalties of up to $115 million (roughly Rs. 840 crores).
The U.S. auto safety regulator said on Wednesday that it has identified a 12th crash in which a Tesla vehicle using advanced driver assistance systems struck or collided near an emergency vehicle, and it has asked the automaker to answer detailed questions about its driver assistance system.
The National Highway Traffic Safety Administration (NHTSA) said on August 16 that it had opened a formal safety investigation into Tesla’s driver assistance system, Autopilot, after 11 such crashes. The investigation covers 765,000 U.S. Tesla vehicles built between 2014 and 2021.
NHTSA said the 12th crash occurred in Orlando on Saturday. On Tuesday, the agency sent Tesla a detailed 11-page letter containing numerous questions the company must answer as part of the investigation.
Tesla’s Autopilot handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods. Tesla says Autopilot enables vehicles to steer, accelerate, and brake automatically within their lane.
Tesla did not respond to a request for comment. The company could face civil penalties of up to $115 million (roughly Rs. 840 crores) if it fails to fully respond to the questions, NHTSA said.
Tesla shares closed down 0.2 percent at $734.09 (roughly Rs. 53,630) on Wednesday.
On Saturday, the Florida Highway Patrol said that a Tesla, which the driver said was in Autopilot mode, struck the vehicle of a Florida trooper who had stopped to assist a disabled motorist on a major highway. According to a police report released on Wednesday, the trooper “narrowly missed being struck as he was outside of his patrol vehicle.”
NHTSA previously said it had reports of 17 injuries and one death across the 11 crashes. A December 2019 crash of a Tesla Model 3 in Indiana left a passenger dead after the vehicle struck a parked fire truck.
NHTSA’s request for information asks Tesla to detail how it detects and responds to emergency vehicles, as well as flashing lights, road flares, cones, and barrels, and to describe the impact of low-light conditions.
NHTSA said previously that most of the 11 incidents took place after dark.
In July, Tesla introduced an option for some customers to subscribe to its advanced driver assistance software, called Full Self-Driving capability. Tesla said the current features “do not make the vehicle autonomous.”
NHTSA is seeking information on the “date and mileage at which the Full Self Driving (FSD) option was enabled” for all vehicles, along with all consumer complaints, field reports, crash reports, and lawsuits.
The agency also wants Tesla to explain how it prevents use of the system outside the areas where it is designed to operate.
Among the detailed questions, NHTSA also asked Tesla to explain the “testing and validation required prior to the release of the subject system or an in-field update to the subject system, including hardware and software components of such systems.”
Tesla must respond to NHTSA’s questions by October 22, the agency said, and it must disclose plans for any changes to Autopilot within the following 120 days.