Tesla's self-driving tech under fire after crashes, one fatal

Tesla's Full Self-Driving (FSD) system has long been a hot topic of debate, but recent crash reports are putting the technology under even harsher scrutiny.

The National Highway Traffic Safety Administration (NHTSA) has launched an investigation into Tesla's FSD after four accidents, including one that resulted in a pedestrian’s death. With a potential recall of Tesla’s autonomous driving technology hanging in the balance, this probe could mark a turning point for the automaker’s ambitions.

NHTSA investigation targets 2.4 million vehicles

The NHTSA’s investigation is broad, encompassing an estimated 2.4 million Teslas. Models from 2016 to 2024 are under review, including the Model S, Model X, Model 3, Model Y and the Cybertruck. The incidents in question all occurred under reduced visibility conditions — sun glare, fog, and airborne dust. According to NHTSA, the crashes occurred when the FSD system failed to “react appropriately” to these environmental factors.

Tesla Model 3

While NHTSA’s evaluation is still in the preliminary stages, it’s an essential first step that could lead to a recall. If the investigation concludes that Tesla’s FSD system poses an unreasonable safety risk, the automaker could be forced to update or modify millions of vehicles.

Shortcomings of camera-only autonomous tech

The crashes highlight a key weakness in Tesla’s approach to autonomous driving — its reliance on a camera-only system. Unlike some competitors, which use lidar or radar alongside cameras, Tesla has doubled down on visual-based technology. This decision has drawn criticism from industry experts, especially in cases where weather or environmental conditions impair visibility.

Jeff Schuster, vice president at GlobalData, pointed out that relying solely on cameras might be problematic. “Weather conditions can impact the camera's ability to see things and I think the regulatory environment will certainly weigh in on this,” Schuster said.

Tesla’s previous recalls and investigations

This isn’t Tesla’s first brush with safety concerns. In December 2023, Tesla issued a massive recall affecting over two million vehicles to address issues with its Autopilot system. The company deployed a software update aimed at improving safety features, but the recall didn’t stop the growing wave of concern over its autonomous technology.

"We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems," the company said at the time. "At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury."

The NHTSA is also still probing whether the recall was sufficient, or if drivers continue to face risks when using Tesla's FSD. The Justice Department has been investigating Tesla’s FSD and Autopilot systems since 2022.

What exactly the NHTSA is investigating

The NHTSA’s preliminary investigation will focus on evaluating whether Tesla’s FSD system can accurately "detect and respond" to situations with limited visibility, such as sun glare, fog, or dust.

The agency will also look into whether other crashes under similar conditions have occurred and what factors contributed to them. Additionally, the investigation will assess if Tesla has made any software updates that could impact how the FSD system performs in these low-visibility scenarios.

A big roadblock for Tesla’s new Cybercab

Despite the mounting challenges, Elon Musk and Tesla remain committed to the dream of fully autonomous driving. Musk recently unveiled the Cybercab, a two-seater robotaxi concept designed without a steering wheel or pedals. In his vision, the Cybercab will rely solely on cameras and artificial intelligence to navigate roads.

But it’s not just about technology — Tesla also faces regulatory hurdles. The Cybercab would need approval from the NHTSA before it could hit the streets. Given the ongoing FSD investigation, that approval could be a long way off.

Tesla's competitors, including companies like Waymo, have already deployed robotaxis in limited capacities. Unlike Tesla, these companies use a combination of cameras, lidar, and radar to create a multi-layered system of safety redundancies. The question now is whether Tesla’s minimalist approach will hold up in an increasingly competitive market.

Full Self-Driving that isn’t fully self-driving

A significant point in Tesla’s defense is that, despite the name, Full Self-Driving is not truly autonomous. According to Tesla’s website, FSD features require “active driver supervision” and “do not make the vehicle autonomous.” This stipulation means drivers should remain vigilant, even when FSD is engaged.

However, critics argue that the branding of “Full Self-Driving” creates confusion, leading drivers to overestimate the system’s capabilities.

Final thoughts

Tesla’s Full Self-Driving feature is one of the most advanced driver-assistance systems available, but the recent crashes have cast doubt on its reliability in real-world conditions. The NHTSA’s investigation into 2.4 million Tesla vehicles could force significant changes, not only for Tesla but for the broader industry. As autonomous technology becomes more common, the line between innovation and risk is drawing sharper scrutiny than ever.

For now, Tesla owners should remain cautious and remember that, despite the name, Full Self-Driving still requires a watchful human behind the wheel.
