The US National Highway Traffic Safety Administration (NHTSA) has closed its investigation into Tesla's Smart Summon feature without finding a safety-related defect, ending a 15-month preliminary evaluation that covered approximately 2.59 million vehicles.
The agency opened the preliminary evaluation in January 2025 to examine crashes occurring during active Smart Summon sessions — a feature Tesla describes as a short-distance, SAE Level 2 system controlled by the driver from a smartphone and intended for use in parking lots and on private property.
NHTSA said that out of millions of Summon sessions, a fraction of one percent resulted in an incident.
Almost all involved minor property damage — impacts with parking gates, adjacent parked vehicles, and short parking bollards — with no reported injuries, fatalities, vulnerable road user involvement, airbag deployments, or vehicle tow-aways.
The investigation was not escalated to an engineering analysis or a formal recall.
Tesla addressed the identified issues through six over-the-air software updates deployed between January and November 2025.
What NHTSA Found
The agency’s analysis identified a common pattern: incidents typically occurred early in a Summon session, when the system or the person using the app failed to fully detect or respond to the vehicle’s surroundings.
Users often lacked a complete 360-degree view in the app, limiting their situational awareness and their ability to determine whether a collision was imminent during initial manoeuvres such as reversing near an obstacle or a kerb.
NHTSA flagged two specific incidents involving camera blockages, in which vehicles attempted to navigate snowy parking lots with snow partially or fully obstructing the forward-facing cameras.
In both cases, the system did not detect the blockage and the vehicles collided with unoccupied parked cars.
In both instances the app user did not command a stop, despite the obstruction being visible in the app's camera feed.
A separate incident involved a vehicle failing to stop for a gate arm blocking a garage exit lane, with the app user again not commanding a stop.
Tesla’s Software Fixes
Tesla deployed six OTA software updates to address the issues identified during the investigation.
On January 15, 2025, two updates improved camera blockage detection mechanisms.
On January 20 and January 30, two additional updates reduced false negative camera blockage detections caused by snow or condensation.
On February 6, a fifth update improved the vehicle’s reaction to dynamic gates by upgrading perception systems through what NHTSA described as a “high-fidelity occupancy determination network” that uses sensor data to improve reconstruction of objects in the vehicle’s field of view.
On November 20, 2025, a sixth update added object detections from a separate neural network.
All six updates were applied to vehicles in service and to production vehicles.
Scope
The investigation covered approximately 2,585,000 Tesla vehicles across the company’s full passenger vehicle lineup: 2016 to 2025 Model S and Model X, 2017 to 2025 Model 3, and 2020 to 2025 Model Y vehicles equipped with Full Self-Driving capability.
The opening document cited one vehicle owner questionnaire complaint and 15 media and other reported incidents, all involving low-speed impacts with posts or parked vehicles.
Standard Closure Language
NHTSA’s closing statement included its standard reservation of rights.
"The closing of this investigation does not constitute a finding that a safety-related defect does not exist," the agency said. "The agency reserves the right to take additional action if warranted by future circumstances."
The investigation is one of several NHTSA has conducted into Tesla's driver-assistance and automated driving features.
A separate, broader investigation into Tesla's Autopilot system and its involvement in crashes with emergency vehicles led to a recall of approximately two million vehicles in December 2023.
Broader FSD Scrutiny
The Smart Summon closure comes three weeks after an escalation of NHTSA's scrutiny of Tesla's Full Self-Driving software on other fronts.
In mid-March, the agency opened an engineering analysis — a more advanced stage of investigation than a preliminary evaluation — into FSD’s ability to detect when the system is degrading or not performing properly.
That probe covers approximately 3.2 million vehicles produced from 2023 to 2025 and was triggered by crashes and near-misses where the system failed to recognise driver inattention.
The agency is also separately investigating FSD over reports of traffic-safety violations — including running red lights and ignoring speed limits — under Tesla's 'Mad Max' driving mode, after connecting more than 50 reports and several crashes to the software.
Tesla has requested additional time to submit data for that probe, citing the strain of responding to multiple concurrent NHTSA investigations.