Technology
11 October 2025

Tesla Faces Federal Probe Over Self-Driving Software Failures

A surge in complaints and crashes prompts U.S. safety regulators to investigate Tesla’s Full Self-Driving system and its impact on millions of vehicles.

The National Highway Traffic Safety Administration (NHTSA) has put Tesla under the microscope once again, launching a sweeping federal investigation into the company’s Full Self-Driving (FSD) software after a mounting wave of complaints, crashes, and injuries. The probe, announced on October 7 and expanded on October 9, 2025, covers a staggering 2.88 million Tesla vehicles equipped with the controversial FSD system. The move comes as reports pile up from drivers and media outlets alleging that Tesla’s software has repeatedly violated basic traffic laws—sometimes with dangerous, even injurious, results.

According to The Hill, NHTSA's investigation zeroes in on at least 18 complaints and one media report in which Tesla vehicles running FSD failed to stop, or to stop fully, at red lights, or did not correctly detect traffic signals. In several of these instances, the cars reportedly barreled through intersections against red lights, sometimes leading to collisions. In total, NHTSA has now logged 58 reports of traffic-safety violations involving FSD, including 14 crashes and 23 injuries, as detailed by Reuters and Fox Business.

The complaints aren’t limited to running reds. NHTSA cited incidents where the FSD system allegedly caused Teslas to enter opposing lanes, cross double-yellow lines, or attempt to turn onto roads in the wrong direction. In some cases, vehicles went straight through intersections from turn-only lanes or made turns from through lanes—confusing maneuvers that would make any driving instructor wince. The agency is also investigating whether the system gave drivers enough warning, and enough time, to intervene before these unexpected and sometimes hazardous moves occurred.

One particularly alarming complaint, reported by a Houston driver in 2024, summed up the frustration: “FSD is not recognizing traffic signals. This results in the vehicle proceeding through red lights and stopping at green lights. Tesla doesn’t want to fix it, or even acknowledge the problem, even though they’ve done a test drive with me and seen the issue with their own eyes.” NHTSA has taken such reports seriously, noting that several incidents “appeared to involve FSD executing a lane change into an opposing lane of travel with little notice to a driver or opportunity to intervene.”

The probe isn’t Tesla’s first brush with federal scrutiny. NHTSA has been looking at Tesla’s advanced driver assistance systems for over a year, including Autopilot and other features. In October 2024, the agency began an inquiry into 2.4 million Teslas after four reported collisions in conditions of reduced roadway visibility, such as sun glare, fog, or airborne dust—including a fatal crash in 2023. In January 2025, NHTSA opened a separate investigation into 2.6 million Teslas over reports of crashes involving a remote car-moving feature. Now, with the FSD probe, the agency’s focus is squarely on the software that Tesla has billed as a leap toward fully autonomous vehicles, though the company itself maintains that FSD is only partially autonomous and requires constant driver supervision.

For its part, Tesla describes FSD as a system that “will drive you almost anywhere with your active supervision, requiring minimal intervention.” Yet, as NHTSA and critics point out, the distinction between “assistance” and “automation” has grown increasingly blurry. Oliver Carsten, a professor of transport safety at the University of Leeds, commented to Reuters, “The NHTSA investigation into Tesla should serve as a wake-up call for Europe. We are seeing an increasing number of systems on the market that blur the line between assistance and automation.”

The stakes for Tesla are high. NHTSA’s current investigation is a preliminary evaluation, the first step before the agency can seek a recall if it determines the vehicles pose an unreasonable safety risk. If a recall is ordered, it could force Tesla to update or even disable features in millions of cars—a massive blow for a company that has staked much of its brand identity on technological innovation and autonomous driving. The probe has already rattled investors: Tesla shares fell 2.1% in early trading after Reuters broke the news of the investigation, and Fox Business reported a 5.06% drop following the official announcement.

Amid the federal scrutiny, Tesla quietly issued a software update to FSD in early October 2025, though the company did not immediately respond to requests for comment from major outlets, including Fox Business. The timing of the update has only fueled speculation that the company is scrambling to address at least some of the issues flagged by regulators and drivers.

The political pressure is mounting as well. Last month, Democratic Senators Ed Markey and Richard Blumenthal publicly urged NHTSA to investigate FSD after a surge in near-collision reports. Their concerns included how FSD behaves at railroad crossings—another area now under review by the agency. With a new NHTSA administrator recently confirmed, Washington’s appetite for oversight appears to be growing, and Tesla’s every move is being watched closely.

Meanwhile, Tesla CEO Elon Musk—never shy about making headlines—has been shifting the company’s vision toward robotaxis rather than self-driving software for private cars. Over the summer, Tesla began rolling out its much-anticipated robotaxi service in Austin, Texas, a pilot program that Musk hopes will showcase the company’s latest FSD capabilities. But even this initiative has not escaped NHTSA’s attention: the agency is reviewing Tesla’s deployment of self-driving robotaxis in Austin as part of its broader investigation.

For drivers, the core dilemma remains: Tesla’s FSD is marketed as the future of driving, yet it is still classified as a Level 2 (partially automated) system, meaning drivers must remain attentive and ready to take control at any moment. NHTSA has repeatedly stressed that “drivers are fully responsible at all times for driving the vehicle, including complying with applicable traffic laws.” But as the complaints and crash reports suggest, the technology’s promise and its real-world performance may not always align.

As the investigation unfolds, the outcome could have ripple effects far beyond Tesla. Regulators, automakers, and consumers worldwide are watching to see how the U.S. handles the challenges—and risks—of partially autonomous vehicles. Whether Tesla’s FSD can live up to its bold ambitions, or whether federal oversight will force a course correction, remains to be seen. But for now, the road ahead looks anything but smooth for the electric car pioneer.