Tesla is not just the best-selling EV brand in North America; it also offers some of the most advanced driver-assistance technology in the auto industry. Every Tesla sold comes with Autopilot, a feature that can steer, accelerate, and brake automatically, provided the driver stays alert and ready to take over.

However, Tesla's Autopilot has been investigated numerous times over crashes. Do the allegations hold any water? And can a Tesla really crash while on Autopilot? Let's dig deeper into the investigations.

Can a Tesla Crash on Autopilot?


According to NHTSA, 439 Tesla vehicles have been involved in accidents since July 2021 while Autopilot was engaged. In fact, Tesla has the highest number of reported crashes involving driver-assistance technology of any automaker, though this could simply be because Tesla sells more vehicles equipped with such technology than any other car company in North America.

As a result, Tesla's Autopilot has been probed on several occasions after serious accidents. For instance, according to an NTSB investigation report, a Tesla Model X crashed in 2018 while the driver was using Autopilot.

Similarly, another fatal accident on May 12, 2022, involving a Tesla Model S, is also suspected of having been caused by Autopilot (via Reuters). The investigations keep piling up: NHTSA has opened at least 35 cases involving Tesla's Autopilot.

Beyond that, many Tesla drivers have complained of phantom braking while using Autopilot, and some are even suing Tesla. However, no crashes caused by phantom braking have been reported yet.

Who Is at Fault if a Tesla Crashes on Autopilot?


Tesla's Autopilot is a Level 2 driver-assistance technology. In other words, it can automatically steer and brake, but the driver must intervene when the situation calls for it. This means that if a Tesla crashes while Autopilot is active, the driver is still liable under the law.

Case in point? National Public Radio reported that a Tesla driver was indicted for manslaughter after killing two people in an accident that occurred while Autopilot was engaged. In addition, while investigating a fatal crash in which Autopilot was driving the vehicle, the National Transportation Safety Board concluded that the driver was at fault because he was distracted before the accident.

The NTSB also recommended that automakers design systems that monitor drivers using driver-assistance technology; if the driver is distracted, the system should trigger a warning.

Following the NTSB's recommendations, Tesla updated its software to enable in-car monitoring that detects drivers who are inattentive while relying on Autopilot. Tesla's monitoring system uses cameras to watch the driver and improve safety.

According to the NTSB, there is not a single production car with Level 5 driving automation; every system on the road still requires the driver to intervene. Even Tesla's Full Self-Driving beta is a Level 2 driver-assistance technology that requires the driver's input.

Tesla Autopilot Makes It Safer to Drive


Even though accidents can happen when drivers use Tesla's Autopilot, the driver-assistance software makes driving safer overall. According to NHTSA, 94% of car accidents in the United States are caused by human error.

Besides that, Tesla discloses its car accident data in the Tesla Vehicle Safety Report, including crashes that occurred while drivers were using Autopilot.

In its latest data, from Q4 2021, Tesla reported one crash for every 1.59 million miles driven when Autopilot was inactive. With Autopilot engaged, that figure improved to one crash for every 4.31 million miles.
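Those figures are easy to sanity-check yourself. Here's a short Python sketch (the miles-per-crash numbers come straight from the Q4 2021 report above) that converts them into crashes per million miles and compares the two rates:

```python
# Tesla's reported Q4 2021 figures: miles driven per crash
MILES_PER_CRASH_AUTOPILOT = 4.31e6  # Autopilot engaged
MILES_PER_CRASH_MANUAL = 1.59e6     # Autopilot inactive

# Convert to crashes per million miles (lower is safer)
rate_autopilot = 1e6 / MILES_PER_CRASH_AUTOPILOT
rate_manual = 1e6 / MILES_PER_CRASH_MANUAL

print(f"Autopilot on:  {rate_autopilot:.3f} crashes per million miles")
print(f"Autopilot off: {rate_manual:.3f} crashes per million miles")

# How many times fewer crashes per mile with Autopilot engaged
ratio = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_MANUAL
print(f"Autopilot: roughly {ratio:.1f}x fewer crashes per mile")
```

By Tesla's own numbers, that works out to roughly 2.7 times fewer crashes per mile with Autopilot engaged, though keep in mind these are Tesla's self-reported figures.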

Improve Safety by Staying Alert When Autopilot Is Engaged

Until Level 5 self-driving becomes a reality, you should keep paying attention to the road, even if you're using Autopilot. If Tesla's Autopilot is slow to respond to a potential hazard, you can take control of the steering wheel manually, and if it disengages, you can swiftly take over.

In a nutshell, Tesla's Autopilot is not perfect, but driving with it is safer, as long as you're paying attention to the road.