Technology
12 October 2025

Tesla Faces Federal Probe After Self-Driving Crashes

U.S. regulators investigate dozens of incidents involving Tesla’s Full Self-Driving software as critics question the safety and marketing of the technology.

Federal regulators have turned a sharp spotlight on Tesla’s self-driving technology after a series of alarming incidents involving the company’s vehicles. On October 11, 2025, the National Highway Traffic Safety Administration (NHTSA) confirmed it had launched a sweeping investigation into Tesla’s Full Self-Driving (FSD) feature. The probe follows 58 reported incidents in which Teslas ran red lights or veered into the wrong lane, sometimes straight into oncoming traffic, resulting in more than a dozen crashes and fires and nearly two dozen injuries, according to the Associated Press.

The scope of the investigation is vast. NHTSA’s inquiry covers about 2.9 million Teslas equipped with FSD or FSD Beta technology, essentially encompassing nearly every Tesla on American roads with the advanced driver-assist system. The regulator is zeroing in on two particularly dangerous behaviors: vehicles failing to stop at red lights and improper lane changes that put cars into opposing traffic. In Joppa, Maryland, several incidents reportedly occurred at the same intersection, prompting Tesla to issue a targeted software update, as reported by Electrek. Yet it remains unclear whether the company notified NHTSA about the recurring problem at that location.

Social media has played a role in amplifying public concern. Videos circulating online show Teslas with FSD engaged drifting across solid lane lines and into the path of oncoming vehicles. In some cases, the cars appeared to correct themselves at the last moment, but the footage has left many viewers unnerved about the reliability of the technology. According to NHTSA’s filing, many drivers involved in these incidents said their vehicles gave no warning before suddenly behaving unpredictably, leaving them little time to intervene and prevent a violation—or worse, a crash.

For Tesla, the timing couldn’t be worse. The company, led by Elon Musk, has long staked its future on the promise of autonomous driving. Musk has made bold predictions, repeatedly stating that full autonomy was just months away, and even promising to roll out hundreds of thousands of driverless taxis in U.S. cities by the end of next year. Yet, these timelines have never materialized, and the latest investigation adds to a growing list of regulatory and legal headaches. As noted by the Associated Press, Tesla is already facing several other open investigations by NHTSA, including probes into the company’s Autopilot and “summon” features, as well as scrutiny over its crash reporting practices.

What makes Tesla’s approach so controversial is its real-world testing philosophy. While competitors like Waymo and Cruise confine their testing to controlled environments with professional safety drivers, Tesla has opted to use public roads as its laboratory. Every customer with FSD becomes part of an ongoing experiment, encountering situations the software may never have faced before. Safety researchers have warned for years that deploying unproven artificial intelligence systems in the wild puts not just Tesla drivers but also other motorists, pedestrians, and cyclists at risk, especially in so-called “edge cases” the AI hasn’t been trained to handle appropriately.

“The world has become a giant testing ground for Elon’s concept of full self-driving, and it’s not working,” said Ross Gerber, a money manager and long-time Tesla investor, to the Associated Press. Gerber, once a big believer in Tesla’s driver assistance features, now argues the company should stop calling its system ‘full self-driving’ and supplement its vision-only approach with radar sensors and other hardware. “They have to take responsibility for the fact that the software doesn’t work right and either adjust the hardware accordingly — and Elon can just deal with his ego issues — or somebody is gonna have to come in and say, ‘Hey, you keep causing accidents with this stuff and maybe you should just put it on test tracks until it works,’” Gerber added.

At the heart of the controversy is the disconnect between the branding and the reality of Tesla’s technology. Despite its name, Full Self-Driving is classified as Level 2 driver-assistance software, meaning drivers must remain attentive and ready to take over at any moment. Safety advocates have long criticized Tesla for overselling the system’s capabilities, warning that the marketing may lull drivers into a false sense of security. NHTSA’s investigation will examine whether drivers had adequate warning to intervene before a violation or crash occurred, and whether the system’s design creates broader risks for everyone on the road.

The legal and regulatory pressures are mounting. In August 2025, a Miami jury found Tesla partly responsible for a deadly 2019 crash involving its Autopilot technology, a separate but related driver-assist system, and ordered the company to pay over $240 million in damages. Tesla has said it will appeal the decision. Meanwhile, the California Department of Motor Vehicles is investigating whether Tesla’s marketing of Full Self-Driving has been deceptive, with a judge expected to decide the case in the coming months. The company has also settled two more wrongful death lawsuits related to its driver-assistance systems in the past year, according to Electrek.

For its part, Tesla has maintained in regulatory filings and court cases that it repeatedly tells drivers the system cannot drive the car by itself, and that whoever is behind the wheel must be ready to intervene at all times. The company points to data it claims shows FSD users have fewer accidents than average drivers, though independent researchers note that these figures haven’t been verified through rigorous scientific study.

To address some of the scrutiny, Tesla recently introduced a new version of FSD and is testing an upgraded model that would not require driver intervention at all—another step toward the elusive goal of true autonomy. But the company is under increasing pressure to show real progress. Its core business of selling cars is struggling, with sales hit by customer boycotts over Musk’s political stances and rising competition from rivals like China’s BYD, which offers cheaper, high-quality electric vehicles. In response, Musk announced on October 7, 2025, that Tesla would be selling two new, less expensive versions of existing models, including the best-selling Model Y. Investors, however, were unimpressed, sending Tesla’s stock down 4.5% after the announcement.

NHTSA’s investigation remains active, with findings expected after a thorough analysis of the incidents and Tesla’s data. The outcome could have far-reaching implications, not only for Tesla but for the future of autonomous driving in America. As the debate over technology, marketing, and public safety continues, the central question lingers: Is Tesla’s self-driving software ready for the real world, or is it putting everyone on the road at risk?

As regulators dig deeper, the answer will shape the next chapter of both Tesla and the broader autonomous vehicle industry.