The US National Highway Traffic Safety Administration (NHTSA) announced on Thursday that it is opening an engineering analysis into Tesla’s Full Self-Driving (FSD) software.
The probe covers about 3.2 million vehicles across multiple models in the United States that are equipped with the system, produced from 2023 to 2025.
According to the agency, the investigation will focus on potential failures in FSD’s ability to detect when the system is degrading or not performing properly.
The probe was triggered by crashes and near-misses where the system failed to recognize driver inattention.
This is the latest in a series of NHTSA investigations into Tesla over the past year, most of which have centered on its self-driving technology.
Earlier this month, Tesla faced a deadline to submit critical data to the NHTSA after the regulator opened a probe into its Full Self-Driving (Supervised) software over potential traffic violations.
System Failures
The escalation comes a day after CEO Elon Musk wrote on X that a fatal Cybertruck accident was not caused by the company’s assisted driving software Autopilot.
According to the chief executive, vehicle logs show that the driver of a Cybertruck involved in a violent overpass crash in Houston last August disengaged Autopilot four seconds before impact.
The statement directly challenges the framing of a $1 million lawsuit filed by the driver, Justine Saint Amour, whose attorney has argued that the crash was caused by Tesla’s system failing to navigate a standard highway curve — not by anything the driver did wrong.
The Houston case adds to a mounting wave of legal action against Tesla’s driver-assistance systems.
A federal judge recently upheld a $243 million verdict against Tesla in a separate Autopilot case involving the death of a 22-year-old woman.
Prior Investigations
The National Highway Traffic Safety Administration is currently investigating 2.88 million Tesla vehicles equipped with Full Self-Driving after connecting 58 incidents to the system.
Late last year, the NHTSA opened a probe into Tesla over reports of its FSD-equipped cars running red lights and exceeding speed limits under the new ‘Mad Max’ driving mode.
According to the agency, the investigation followed more than 50 reports of traffic-safety violations and several vehicle crashes caused by the driver assistance software.
The company asked for extra time to review the data before submission. At the time, Tesla highlighted the strain of handling multiple NHTSA investigations simultaneously, including separate probes into delayed crash reporting and defective door handles.
FSD Deployment
Tesla’s CEO Elon Musk revealed in late January that Full Self-Driving (FSD) is now “100% unsupervised.”
At the time, the chief executive confirmed that the company had removed the chase vehicles that had accompanied early robotaxi trips operating without safety monitors earlier this year.
He flagged, however, that Tesla was “just being very cautious with the rollout” for customers.
Tesla’s VP of AI Software Ashok Elluswamy wrote on X earlier this month that FSD’s “upcoming tech will make it even more” transformative, commenting on a post that called it the most “impressive tech creation” of the century.