Tesla CEO Elon Musk said on Wednesday that vehicle logs show the driver of a Cybertruck involved in a violent overpass crash in Houston last August disengaged Autopilot four seconds before impact.
The statement directly challenges the framing of a $1 million lawsuit filed by the driver, Justine Saint Amour, whose attorney has argued that the crash was caused by Tesla’s system failing to navigate a standard highway curve, not by anything the driver did wrong.
“Logs show driver disengaged Autopilot four seconds before crashing,” Musk wrote in a reply on X.
Saint Amour’s attorney, Bob Hilliard of Hilliard Law, has acknowledged that his client disengaged the driver-assistance feature before impact, but said she did so because the system was failing and “it was too late” to correct the vehicle’s trajectory.
The four-second window is now the central contested fact in the case.
At highway speeds, four seconds represents roughly 100 metres of travel — potentially enough time to brake but likely insufficient to steer a heavy vehicle away from a concrete barrier at the edge of an overpass.
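As a rough check of that figure — and assuming a speed of about 56 mph (roughly 90 km/h), since the article does not state how fast the Cybertruck was actually travelling:

$$25\ \text{m/s} \times 4\ \text{s} = 100\ \text{m}$$

At a typical US highway speed of 65 mph (about 29 m/s), the same four seconds would cover closer to 116 metres.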
What the Dashcam Shows
The incident occurred on August 18, 2025, on the Eastex Freeway (U.S. 69) in Houston. Saint Amour was driving her Cybertruck with Autopilot engaged as the vehicle approached a Y-shaped overpass split near the 256 Eastex Park and Ride interchange.
Dashcam footage shows that the Cybertruck’s lane curved to the right at the split.
Instead of following the curve, the vehicle continued straight ahead, barrelled through the traffic cones separating the lanes, and slammed into the concrete barrier at the edge of the overpass. Parts of the vehicle can be seen flying off on impact.
Had the Cybertruck not struck a lighting pole that redirected it, it would likely have plunged more than 30 feet onto the freeway below.
Saint Amour’s infant was in the vehicle at the time of the crash. The child was not injured, according to attorney Hilliard.
According to Hilliard, Saint Amour suffered serious injuries to her right shoulder, neck, and back, including two herniated discs in her lower back and another in her neck, as well as sprained tendons in her wrist and neuropathy that causes numbness, tingling, and weakness in her right hand.
The lawsuit filing itself does not detail specific injuries, nor does it specify the speed at impact or whether airbags deployed.
The Lawsuit’s Claims
The lawsuit, filed in Harris County District Court on February 23, seeks more than $1 million in damages and alleges negligence and gross negligence by Tesla.
The filing uses the term “autopilot” throughout but does not specify whether the active system at the time of the crash was basic Autopilot or the Full Self-Driving package that came with the vehicle — a distinction that may matter in court, as Tesla sells them as separate products with different capability levels.
According to the court filing, the Cybertruck “attempted to drive straight ahead into the concrete barrier and the freeway below, which caused Plaintiff to disengage the self-driving mode and take control of the wheel but it was too late and crashed into the barrier.”
The lawsuit also argues it is unrealistic to expect any driver to “transition from not driving to driving” to take control of the vehicle on short notice — a direct challenge to Tesla’s position that its systems require constant driver supervision.
“Tesla’s decisions made Justine’s accident inevitable,” Hilliard said in a statement to FOX Business. “This company wants drivers to believe and trust their life on a lie: that the vehicle can self-drive and that it can do so safely. It can’t, and it doesn’t.”
The filing directly challenges Tesla’s camera-only approach to autonomous driving, alleging that the company’s own engineers had recommended incorporating LiDAR.
“While engineers at Tesla recommended the super-human vision of LiDAR be included for self-driving vehicles, and competitors like Waymo and Cruise relied heavily on LiDAR, Musk chose instead to rely only upon cheap video cameras,” the lawsuit states.
It also alleges Tesla failed to activate automatic emergency braking and lacked an adequate driver alert system.
The filing argues that “the safer design alternatives were technologically and economically feasible” and would have “prevented or significantly reduced the risk of injury.”
The lawsuit names Musk personally, describing the CEO as “an aggressive and irresponsible salesman, who has a long history of making dangerous design choices, and over-promising the features of his products.”
The filing goes further, alleging Tesla was negligent in hiring and retaining Musk as CEO and “allowing him to participate in product design decisions.”
The court document lists six specific failures by Tesla: failing to provide effective automatic emergency braking, failing to incorporate LiDAR, failing to properly design warnings for the self-driving feature, failing to warn of its dangers and limitations, misleadingly advertising FSD as “self-driving,” and negligently retaining Musk in his role.
Growing Legal and Regulatory Pressure
The Houston case adds to a mounting wave of legal action against Tesla’s driver-assistance systems.
A federal judge recently upheld a $243 million verdict against Tesla in a separate Autopilot case involving the death of a 22-year-old woman.
The National Highway Traffic Safety Administration is investigating 2.88 million Tesla vehicles equipped with Full Self-Driving after connecting 58 incidents to the system.
In California, a judge ruled last December that Tesla’s FSD marketing is “actually, unambiguously false and counterfactual.”
Tesla has since rebranded its systems in the state: “Full Self-Driving” became “Full Self-Driving (Supervised)” and “Autopilot” became “Traffic Aware Cruise Control.”
The lawsuit also accuses Tesla of using non-disclosure agreements to prevent drivers from sharing information about FSD’s performance — a practice NHTSA has said adversely impacts its ability to investigate safety issues.