07 December 2025

Waymo Issues Software Recall After School Bus Incidents

Reports of autonomous vehicles passing stopped school buses prompt federal investigation and renewed scrutiny of safety standards for self-driving cars.

Waymo, the autonomous vehicle company under Alphabet’s umbrella, is once again in the spotlight after a string of incidents involving its self-driving robotaxis and stopped school buses. The company announced on Friday, December 5, 2025, that it would voluntarily recall the software powering its vehicles following reports from Texas and Georgia that some of its cars illegally passed school buses while children were boarding or disembarking. While no injuries have been reported, the episodes have reignited a national debate about the safety and oversight of autonomous vehicles on public roads.

According to NPR, the National Highway Traffic Safety Administration (NHTSA) opened an investigation in October 2025 after media reports and video evidence surfaced showing Waymo vehicles failing to remain stopped for school buses with flashing red lights and deployed stop arms. The most widely circulated video, aired by WXIA-TV in Atlanta in September, captured a Waymo robotaxi driving around a stopped school bus as children were getting off. The Austin Independent School District documented 19 separate instances during the 2025-2026 school year in which Waymo cars illegally and dangerously passed stopped school buses. In one particularly alarming incident, a vehicle passed a bus just moments after a student had crossed in front of it, while the child was still in the road.

School leaders in Austin responded by asking Waymo to pause all operations near schools during pick-up and drop-off times. Their concerns deepened when five of the 19 incidents reportedly occurred after Waymo told the district it had updated its software to resolve the issue. As reported by Fox Business, Austin school officials specifically pointed to one event in which “a Waymo automated vehicle was recorded driving past a stopped school bus only moments after a student crossed in front of the vehicle, and while the student was still in the road.”

Waymo’s response has been swift, if not entirely reassuring to some critics. In a statement provided to multiple news outlets, Waymo’s Chief Safety Officer Mauricio Peña said, “While we are incredibly proud of our strong safety record, showing Waymo experiences twelve times fewer injury crashes involving pedestrians than human drivers, holding the highest safety standards means recognizing when our behavior should be better. As a result, we have made the decision to file a voluntary software recall with NHTSA related to appropriately slowing and stopping in these scenarios. We will continue analyzing our vehicles’ performance and making necessary fixes as part of our commitment to continuous improvement.”

Waymo clarified that this recall is not a traditional one that would pull vehicles from the road. Instead, it involves a software update intended to correct the vehicles’ behavior around stopped school buses. According to ABC7 News, all affected vehicles had already received the update by November 17, 2025. Yet the fact that incidents continued after previous updates has raised questions about the effectiveness and transparency of these fixes. San Jose State professor and tech analyst Ahmed Banafa commented, “The issue of autonomous cars is trust: why should I take this car, and this car has this history? Now the software, they have to do it, because again we’re talking about near miss[es].” He added, “Are they open for a third party to check that for them? That's my biggest question, just to make sure that they gain the trust—this is part of the transparency and accountability.”

The NHTSA, for its part, is demanding answers. The agency sent Waymo a detailed list of questions about the incidents on December 1, 2025, and gave the company until January 20, 2026, to respond. According to NPR, the agency’s Office of Defects Investigation is now conducting a preliminary evaluation to “investigate the performance of the Waymo [automated driving system] around stopped school buses and the system’s ability to follow traffic safety laws concerning school buses.” Given that Waymo’s autonomous fleet surpassed 100 million miles of driving last July and continues to rack up 2 million miles each week, NHTSA officials noted “the likelihood of other prior similar incidents is high.”

This is not the first time Waymo has faced scrutiny over its robotaxi fleet. Earlier in 2025, the company issued another software recall after some vehicles hit gates, chains, and other stationary objects. In 2024, Waymo filed two additional recalls: one after a fleet vehicle crashed into a telephone pole, and another after two separate robotaxis struck the same pickup truck as it was being towed. Each time, Waymo has emphasized its commitment to safety and continuous improvement, pointing to independent analyses from technology outlets such as Ars Technica and the newsletter Understanding AI that support its claim that its vehicles are safer than human drivers.

Indeed, Waymo asserts that its cars have a better safety record than humans, with 91% fewer crashes resulting in serious injuries and 92% fewer crashes involving pedestrian injuries in the cities where it operates (a 92% reduction works out to roughly one-twelfth the rate, consistent with Peña’s “twelve times fewer” figure). Passengers, too, have weighed in with generally positive, if cautious, feedback. As Waymo rider Ethan Frommer told ABC7 News, “I think it's driving like how a lot of humans do. I think it's driving the way it should to kind of interact with other human drivers, basically, and for the most part it's safe. I mean, there's obviously still improvements they can make. It's like they're 99% there, but there's still things to tweak, but I think that's kind of part of the process.”

Waymo’s reach continues to expand despite these setbacks. Since November 12, 2025, passengers have been able to use Waymo’s robotaxis for pick-up and drop-off at San Jose Mineta International Airport, the first commercial airport in California to allow such a service. The company has made safety a cornerstone of its public messaging. But as the recent school bus incidents show, even the most advanced technology can stumble in scenarios that demand the kind of split-second, ethical decision-making that human drivers, for all their flaws, are still uniquely equipped to handle.

As federal regulators, school officials, and the public await the results of NHTSA’s investigation, the debate over how to balance innovation with public safety is only intensifying. Waymo’s willingness to acknowledge shortcomings and issue voluntary recalls is a step in the right direction, but for many, the question remains: Can trust in autonomous vehicles be fully restored when the stakes—children’s safety—are so high?

With the spotlight on both technological progress and accountability, the coming months will test whether Waymo and its competitors can meet the high bar set not just by regulators, but by the communities they serve.