DETROIT — U.S. safety investigators want to know why Tesla didn’t file recall documents when it updated Autopilot software to better identify parked emergency vehicles, escalating a simmering clash between the automaker and regulators.
In a letter to Tesla, the National Highway Traffic Safety Administration told the electric car maker Tuesday that it must recall vehicles if an over-the-internet update mitigates a safety defect.
“Any manufacturer issuing an over-the-air update that mitigates a defect that poses an unreasonable risk to motor vehicle safety is required to timely file an accompanying recall notice to NHTSA,” the agency said in a letter to Eddie Gates, Tesla’s director of field quality.
The agency also ordered Tesla to provide information about its “Full Self-Driving” software, which is being tested on public roads by some owners.
The latest clash is another sign of escalating tensions between Tesla and the agency that regulates partially automated driving systems.
In August the agency opened an investigation into Tesla’s Autopilot after receiving multiple reports of vehicles crashing into emergency vehicles that were stopped on highways with their warning lights flashing.
The letter was posted on NHTSA’s website early Wednesday. A message seeking comment was left with Tesla, which has disbanded its media relations department.
The investigation covers 765,000 vehicles, almost everything Tesla has sold in the U.S. since the start of the 2014 model year. In the dozen crashes identified as part of the probe, 17 people were injured and one was killed.
According to the agency, Tesla issued an over-the-internet software update in late September intended to improve detection of emergency vehicle lights in low-light conditions. The agency says Tesla is aware that federal law requires automakers to conduct a recall if they learn that vehicles or equipment have safety defects.
The agency asked for information about Tesla’s “Emergency Light Detection Update” that was sent to certain vehicles “with the stated purpose of detecting flashing emergency vehicle lights in low light conditions and then responding to said detection with driver alerts and changes to the vehicle speed while Autopilot is engaged.”
The letter asks for a list of the events that motivated the software update, as well as which vehicles it was sent to and whether the measures extend to Tesla’s entire fleet.
It also asks the Palo Alto, California, company whether it intends to file recall documents. “If not, please furnish Tesla’s technical and/or legal basis for declining to do so,” the agency asks.
When automakers learn of a safety defect, they must notify NHTSA within five working days, and they are required to conduct recalls. NHTSA monitors recalls to make sure they cover all affected vehicles and that automakers make proper efforts to contact all owners.
Tesla has to comply with the request by Nov. 1 or face court action and civil fines of more than $114 million, the agency wrote.
In a separate special order sent to Tesla, NHTSA says that the company may be taking steps to hinder the agency’s access to safety information by requiring drivers who are testing “Full Self-Driving” software to sign non-disclosure agreements.
The order demands that Tesla describe the non-disclosure agreements and how drivers sign them. The company also must say whether it requires owners of vehicles equipped with Autopilot to agree “to any terms that would prevent or discourage vehicle owners from sharing information about or discussing any aspect of Autopilot with any person other than Tesla.”
Responses must be made by a Tesla officer under oath. If Tesla fails to fully comply, the order says the matter could be referred to the Justice Department for court action to force responses. It also threatens more fines of over $114 million.
Tesla has said that vehicles equipped with “Full Self-Driving” or Autopilot cannot drive themselves, and it warns drivers that they must be ready to intervene at all times.
It was unclear how Tesla and mercurial CEO Elon Musk will respond to NHTSA’s demands. The company and Musk have a long history of sparring with federal regulators.
In January, Tesla refused a request from NHTSA to recall about 135,000 vehicles because their touch screens could go dark. The agency said the screens were a safety defect because backup cameras and windshield defroster controls could be disabled.
A month later, after NHTSA started the process of holding a public hearing and taking Tesla to court, the company agreed to the recall. Tesla said it would replace computer processors for the screens, even though it maintained there was no safety threat.
Musk fought with the Securities and Exchange Commission over a 2018 tweet claiming that he had financing to take Tesla private, when that funding was not secured. He and the company agreed to pay $20 million each to settle allegations that he misled investors. Later the SEC sought to hold Musk in contempt of court for tweeting a misleading projection of how many cars Tesla would manufacture. Musk branded the SEC the “shortseller enrichment commission,” distorting the meaning of its acronym. Short sellers bet that a stock price will fall.
The new demands from NHTSA signal a tougher regulatory stance on automated vehicle safety under President Joe Biden than under previous administrations. The agency had appeared reluctant to regulate the new technology for fear of hampering adoption of the potentially life-saving systems.
The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.