The National Highway Traffic Safety Administration (NHTSA) said on Tuesday that Waymo is recalling 3,791 robotaxis in the United States over a self-driving software defect that could cause the vehicles to drive onto flooded roads.
The recall covers certain fifth- and sixth-generation Automated Driving Systems (ADS) deployed across the Alphabet subsidiary’s commercial fleet.
According to the NHTSA, the software may fail to adequately detect or respond to flooded road conditions — creating a potential safety risk for passengers and other road users.
Waymo is currently developing a full software remedy, the agency said.
As an interim measure, the Alphabet subsidiary has tightened its operational parameters by increasing weather-related constraints and updating its map data to restrict vehicle access to roads that may be affected by flooding.
Waymos on Flooded Roads
According to the NHTSA recall notice, on April 20 an unoccupied Waymo vehicle “encountered an untraversable flooded section of a roadway” with a 40 mph speed limit.
The vehicle detected the floodwater but continued forward at reduced speed rather than stopping or rerouting.
Following the incident, Waymo implemented additional operational restrictions “at times and in locations where there is an elevated risk of encountering a flooded, higher-speed roadway,” the filing said.
Four days later, Waymo’s Safety Board reviewed the issue and decided to conduct a recall.
The April 20 incident does not appear to be an isolated case.
Users on social media have shared images and videos of Waymo robotaxis entering flooded areas or becoming trapped in standing water.
The incidents include at least one case in Phoenix, where a vehicle drove onto a flooded road despite a posted sign reading “do not enter when flooded,” then backed out in reverse.
Recall Covers Nearly Entire Fleet
While Waymo does not routinely disclose its exact fleet size, available data suggests the recall could affect nearly the company’s entire operational fleet.
The Robotaxi Tracker platform shows that, as of December 11, 2025, the company had 3,067 official vehicles in its fleet, including both active and inactive units.
TechCrunch also referenced a “steady 3,000-fleet figure” as recently as late March.
The fact that the recall covers 3,791 ADS units — more than the roughly 3,000 commercial vehicles believed to be on the road — suggests the recall extends beyond the active ride-hailing fleet to include vehicles in testing, integration, or storage that are equipped with the affected fifth- and sixth-generation software.
Fourth Recall in Less Than Two Years
The flooded-road recall is the fourth software recall Waymo has issued since February 2024.
Late last year, the company recalled 3,067 robotaxis equipped with its fifth-generation ADS after its vehicles were repeatedly caught on camera illegally passing stopped school buses in Texas and Georgia.
Waymo said at the time that it had identified a software issue that contributed to the incidents and deployed an update across its fleet by November 2025, though the fix did not fully resolve the problem.
As of April, the Austin Independent School District had documented at least 24 instances of Waymo vehicles illegally passing its school buses during the 2025–2026 school year, including incidents that occurred after the software update was deployed.
Prior to that, Waymo recalled 1,212 robotaxis in May 2025 over collisions with stationary roadway barriers such as chains and gates.
The NHTSA had opened a preliminary evaluation into those collisions in May 2024, citing at least seven incidents between December 2022 and April 2024.
The company also issued its first ADS software recall in February 2024, after two of its robotaxis struck the same improperly towed pickup truck in Phoenix.
Mounting Federal Scrutiny
The latest recall adds to a growing list of federal investigations and safety actions involving Waymo’s autonomous vehicle technology.
The NHTSA has an open investigation into the company after one of its robotaxis struck a child near an elementary school in Santa Monica, California, on January 23.
The child was running across the street toward the school during morning drop-off when the Waymo vehicle made contact.
Waymo said at the time that the vehicle braked hard, reducing its speed from approximately 17 mph to under 6 mph before contact was made.
The company argued that its peer-reviewed safety model showed a human driver in the same scenario would have struck the child at approximately 14 mph.
Separately, the National Transportation Safety Board (NTSB) said last month that it was investigating an incident in January in which Waymo self-driving vehicles passed a stopped school bus with its lights activated in violation of Texas state law.
The NTSB has since expanded the scope of that investigation to include a March 25 incident involving another Austin ISD school bus.
The Austin school district has asked Waymo to cease operations during morning and afternoon hours when students are loading and unloading from school buses.
Scaling Amid Safety Questions
The steady pace of recalls and investigations contrasts with Waymo’s rapid commercial expansion across the United States.
Waymo co-CEO Dmitri Dolgov said in March that the company had reached 500,000 weekly paid rides and was accumulating over four million autonomous miles per week.
The Alphabet subsidiary has also signaled ambitions to expand internationally.
Waymo has been testing vehicles in Tokyo since late 2025 and executives said in March that a commercial launch in the Japanese capital could be ready within months.
The company has also confirmed plans to deploy in London in 2026.
However, the growing number of safety actions raises questions about the readiness of autonomous driving software to operate in the full range of real-world conditions — from adverse weather and flooded roads to school zones and complex urban environments — that human drivers navigate daily.
Waymo has about 35 remote operators based in the Philippines who assist its robotaxi fleet, helping vehicles navigate situations that its ADS cannot handle autonomously.
The company has maintained that its vehicles are significantly safer than human drivers, citing internal data showing a 91% reduction in serious-injury-or-worse crashes compared to the human baseline.
Waymo has also argued that it files voluntary recalls proactively as part of its safety framework, even after fixes have already been deployed across the fleet.
Co-CEO Tekedra Mawakana has called for federal regulators to establish national standards requiring autonomous vehicle developers to demonstrate safety before deploying on public roads.