Technology
12 October 2025

Tesla Faces Federal Probe Over Self-Driving Failures

A sweeping NHTSA investigation into nearly 2.9 million Teslas raises new questions about the safety of Full Self-Driving technology and the future of autonomous vehicles.

On October 12, 2025, the National Highway Traffic Safety Administration (NHTSA) fired a shot across the bow of the autonomous vehicle industry, launching a formal investigation into Tesla’s Full Self-Driving (FSD) mode. The probe, which targets nearly 2.9 million Tesla vehicles equipped with the FSD system, comes after a troubling series of reports: cars running red lights, veering into oncoming traffic, making illegal lane changes, and even driving on the wrong side of the road. The implications for Tesla—and for the future of self-driving tech—are nothing short of seismic.

According to filings detailed by Digital Trends and other outlets, the NHTSA’s investigation is rooted in 58 incident reports, some of which involved crashes that endangered both drivers and pedestrians. These are not just minor hiccups or harmless software bugs. As AP News reported, there have been instances where Teslas in FSD mode ignored traffic signals and crashed into other vehicles, sometimes causing injuries. The sheer scale of the probe—covering essentially all Teslas with FSD from 2016 onward—underscores just how high the stakes are.

This isn’t Tesla’s first encounter with federal regulators. Previous investigations have scrutinized the company’s Autopilot feature, especially after fatal crashes. But this time, the spotlight is squarely on FSD and its alleged tendency to commit basic traffic violations. The NHTSA’s focus, according to Reuters, is not only on the incidents themselves but also on whether Tesla’s software updates have done enough to address these potentially deadly flaws. If systemic issues persist, it could spell trouble for the company’s ambitious roadmap.

Tesla CEO Elon Musk has long positioned FSD as the company’s crown jewel—a transformative technology that could enable a future of robotaxi fleets and fundamentally reshape the automotive landscape. As PBS News points out, Musk’s vision relies heavily on the notion that FSD will soon reach a point where cars can drive themselves safely, without human intervention. Yet, the probe’s findings could throw a wrench into those plans. If regulators determine that FSD isn’t ready for prime time, Tesla may be forced to issue recalls or roll out mandatory software fixes, potentially delaying the much-hyped robotaxi rollout.

Critics have been quick to pounce on Tesla’s approach to self-driving tech. The company’s strategy of beta-testing FSD in real-world conditions, using data collected from customer vehicles, has always been controversial. Safety advocates argue that it exposes the public to unnecessary risks. As Reuters highlighted, the NHTSA’s data includes more than 50 reports of violations such as illegal lane changes and failure to yield—hardly the stuff of minor glitches. Instead, these incidents suggest a pattern of erratic behavior that has alarmed industry watchers and prompted calls for stricter oversight of autonomous vehicles.

Public reaction to the probe has been swift and divided. Some Tesla owners remain staunchly supportive, convinced that the company’s over-the-air updates will eventually iron out the kinks. Others, especially those who have witnessed or experienced FSD mishaps firsthand, are demanding greater transparency and accountability. Social media channels are abuzz with heated debates, with some users sharing dashcam footage of near-misses and others defending the technology’s long-term promise.

Meanwhile, the broader autonomous vehicle sector is watching closely. Competitors like Waymo and Cruise have faced their own regulatory headaches, but Tesla’s case is unique in scale and visibility. The company’s aggressive marketing of FSD as “full self-driving”—despite clear disclaimers that human supervision is still required—has only added fuel to the fire. As The Washington Post noted, reports of Teslas entering opposing lanes have sparked fresh questions about whether any current AI model can truly handle the chaos of urban driving.

So what’s next for Tesla? The company has yet to comment publicly on the specifics of the NHTSA probe, but its past responses offer some clues. Tesla has consistently emphasized its commitment to safety and ongoing improvement, often delivered via over-the-air software updates. Insiders suggest the company may argue that these incidents are outliers—edge cases in a rapidly evolving technology. But with mounting evidence from sources like CBS News, which cites dangerous signal violations and erratic driving, that argument may be a tough sell.

The investigation’s outcome could have far-reaching consequences. If the NHTSA concludes that FSD poses a systemic risk, Tesla could be required to scale back features, invest billions more in software refinements, or even face class-action lawsuits from affected drivers. For potential buyers, the probe raises uncomfortable questions about the reliability and safety of one of Tesla’s marquee offerings. For current owners, there’s a growing sense of uncertainty—will their cars be subject to recalls, or will new updates be enough to restore confidence?

Broader industry ramifications are already coming into focus. The probe is likely to embolden critics who argue that the autonomous vehicle sector has been moving too fast, prioritizing innovation over public safety. It may also prompt regulators worldwide to take a harder look at self-driving technologies, especially as more automakers race to deploy their own versions of advanced driver assistance systems. In a sector where perception is everything, the fallout from this investigation could reverberate for years to come.

For Tesla, the timing couldn’t be worse. With plans for a robotaxi fleet on the horizon and the company betting big on autonomous tech to drive future growth, any regulatory setback could delay or derail those ambitions. As industry analysts have observed, the path to truly self-driving cars is proving to be far more complex—and fraught—than many had hoped.

At the heart of the debate is a simple question: Can current AI-driven systems be trusted to navigate the unpredictable realities of the road? The NHTSA’s investigation may not yield a definitive answer, but it’s certain to shape the next chapter in the ongoing saga of autonomous vehicles. For now, one thing is clear—the journey toward full autonomy is anything but smooth, and every red light run by a self-driving car is a reminder of just how much work remains.