
Crashes caused by Tesla Autopilot are piling up, and there are consequences

Will current investigations make automated driving systems safer or slow down the dream of self-driving cars?

Image credit: Laguna Beach Police Department via Forbes


  • Tesla’s advanced driver assistance system, Autopilot, is under multiple investigations over recent crashes.
  • Moreover, the US has recorded its first-ever criminal charges against a driver who was engaged with a driving assistance system during a fatal crash.
  • Across academia, industry, and the press, many stress the need for these investigations and argue that Tesla’s claims about Autopilot are overhyped.
  • While it is important to investigate these cases, the findings should be used to improve autonomous driving rather than to hamper the future development of self-driving cars.

What Happened

For the first time, US prosecutors have charged an individual over a fatal crash that occurred while a partially automated driving assistance system was engaged. The Tesla driver was charged with killing two people in a crash that happened almost two years ago in Los Angeles.

The crash happened like this: the Tesla was moving at high speed when it left a freeway, ran a red light, and struck a Honda Civic at an intersection. The two people in the Honda Civic died at the scene, while the two occupants of the Tesla suffered non-life-threatening injuries. Although the charging documents do not mention Autopilot, the NHTSA later confirmed that Autopilot was engaged in the Tesla in the moments leading up to the crash.

This news does not come as a complete shock, considering the ongoing investigation by the National Highway Traffic Safety Administration (NHTSA) into Tesla’s Autopilot system following 12 crashes that have left 17 people injured and one dead. According to the documents filed for that investigation, the inquiry covers about 765,000 Teslas (Models Y, X, S, and 3) produced from 2014 to 2021.

We combed through all 12 crashes and summarized them in the table below. Each row is a crash, and the link on the date points to a news article on that incident.

| Date | Place | Time | Vehicle Hit or Ran Into | Flashing Lights | Bright Objects |
|------------|-----------------------|------|------------------------------------------------------------|---|-------------------------|
| 01/22/2018 | Culver City, CA | 11AM | Parked fire truck | Y | - |
| 05/20/2018 | Laguna Beach, CA | 11AM | Police SUV | Y | - |
| 12/07/2019 | Norwalk, CT | 4AM | Parked cruiser + disabled car (chain collision) | Y | Flares |
| 12/29/2019 | Cloverdale, IN | 8AM | Parked fire truck (earlier crash scene) | Y | - |
| 01/22/2020 | West Bridgewater, MA | 10PM | State trooper SUV + car (chain collision) | Y | Illuminated arrow board |
| 07/30/2020 | Cochise County, AZ | 4AM | Highway patrol + ambulance (chain collision) | Y | - |
| 08/26/2020 | Spring Hope, NC | 12AM | Deputy’s cruiser + state trooper vehicle (chain collision) | Y | - |
| 02/27/2021 | Montgomery County, TX | 1AM | Highway patrol car | Y | - |
| 03/17/2021 | Lansing, MI | 1AM | State trooper vehicle | Y | - |
| 05/19/2021 | Miami, FL | 5AM | Florida Department of Transportation truck | Y | Cones |
| 07/10/2021 | San Diego, CA | 3AM | California Highway Patrol cruiser | Y | - |
| 08/28/2021 | Orlando, FL | 5AM | Florida Highway Patrol vehicle | Y | Cones |

"-" means no information found.

All 12 crashes under investigation involved one or more emergency vehicles parked in highway lanes, either attending a previous crash scene or conducting routine monitoring operations. All had their emergency lights flashing at the time of the crash. Most of the crashes (nine of the twelve) also occurred late at night or in the early morning, under dark lighting conditions. Some incident reports also note additional emergency and closure signals, such as flares, road cones, and illuminated arrow boards, at the time of the crash. We refer readers to this link for a more detailed account of the 12 crashes.
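The pattern is easy to verify from the table itself. The following Python sketch tallies the crash times and flashing-light flags transcribed from the table above; it is illustrative bookkeeping only, not an official dataset:

```python
# Tally the 12 NHTSA-investigated crashes summarized in the table above.
# Each entry: (hour in 24h local time, flashing_lights, bright_objects_noted)
crashes = [
    (11, True, False), (11, True, False), (4, True, True),  (8, True, False),
    (22, True, True),  (4, True, False),  (0, True, False), (1, True, False),
    (1, True, False),  (5, True, True),   (3, True, False), (5, True, True),
]

def is_dark(hour: int) -> bool:
    """Rough proxy for low-light conditions: late night or early morning."""
    return hour >= 22 or hour <= 5

dark = sum(is_dark(h) for h, _, _ in crashes)
lights = sum(f for _, f, _ in crashes)
print(f"{dark} of {len(crashes)} crashes occurred in dark conditions")
print(f"{lights} of {len(crashes)} involved flashing emergency lights")
```

Running the tally confirms the counts cited above: all 12 crashes involved flashing lights, and nine occurred in the dark.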

The Reactions

To better understand how the press and experts reacted to these crashes, it helps to understand what Autopilot actually does. Tesla’s Autopilot system has two main features: Traffic-Aware Cruise Control and Autosteer. The former helps the car accelerate, decelerate, and maintain speed, while the latter helps the car stay in its lane using data-driven learning algorithms. This is still considered Level 2 autonomy, also called partially automated driving assistance. We refer readers to The 6 Levels of Autonomy Explained for a description of the various levels of autonomy; more details on Tesla’s system can be found in the blog post Tesla’s Autopilot Explained! According to Tesla’s own guidelines, engaging Autopilot does not make the vehicle safe to operate without an attentive human driver behind the wheel.
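As a rough illustration of what "Level 2" means in code terms, the two features can be thought of as two independent controllers whose outputs the human driver must supervise at all times. This is a hypothetical toy sketch, not Tesla’s actual software; all names and thresholds here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    lead_gap_m: float     # distance to the vehicle ahead, meters
    speed_mps: float      # own speed, meters per second
    lane_offset_m: float  # lateral offset from lane center, meters

def cruise_control(frame: SensorFrame, set_speed_mps: float,
                   min_gap_m: float = 30.0) -> float:
    """Traffic-Aware Cruise Control (toy): accelerate toward the set
    speed, but brake when the gap to the lead vehicle closes."""
    if frame.lead_gap_m < min_gap_m:
        return -2.0  # brake command, m/s^2
    return 1.0 if frame.speed_mps < set_speed_mps else 0.0

def autosteer(frame: SensorFrame, gain: float = 0.1) -> float:
    """Autosteer (toy): steer proportionally back toward lane center."""
    return -gain * frame.lane_offset_m

# Level 2 means both controllers run together, yet the human driver
# remains responsible for monitoring the road and taking over.
frame = SensorFrame(lead_gap_m=50.0, speed_mps=25.0, lane_offset_m=0.4)
accel = cruise_control(frame, set_speed_mps=30.0)
steer = autosteer(frame)
```

The point of the sketch is that nothing in a Level 2 system understands the scene; it only tracks a gap and a lane offset, which is why driver supervision is mandatory.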

From the Press

There has been extensive press coverage of the Tesla investigations.

These articles convey the severity of many of these crashes and the extent of the data NHTSA has requested from Tesla about the capabilities of the Autopilot system. Regarding the first-ever charges connected to Autopilot, many outlets have also raised questions and explained why self-driving cars are still far from reality.

Meanwhile, Tesla’s marketing has long been a concern. Tesla’s CEO has repeatedly made false and overinflated claims about Autopilot’s capabilities, as his own tweets illustrate.

There has also been much discussion in the press of the hype and buzz created by Tesla.

From the Experts

Prof. Phil Koopman of Carnegie Mellon University, noting that NHTSA has requested information about Tesla’s entire Autopilot-equipped fleet, says:

This is an incredibly detailed request for huge amounts of data. But it is exactly the type of information that would be needed to dig into whether Tesla vehicles are acceptably safe.

Sam Abuelsamid, principal analyst at Guidehouse Insights and a frequent writer on self-driving vehicles, said that while partially automated driving systems can slow a car when required, most are designed to ignore stationary objects when traveling at more than 40 mph, so they do not slam on the brakes when approaching overpasses or stationary objects at the side of the road, such as a car stopped on the shoulder. He also comments:

When it works, which can be most of the time, it can be very good. But it can easily be confused by things that humans would have no problem with. Machine visions are not as adaptive as humans. And the problem is that all machine systems sometimes make silly errors.
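The stationary-object behavior Abuelsamid describes can be sketched as a simple filtering rule. This is a hypothetical illustration of the trade-off, not any vendor’s actual logic: above a speed threshold, objects with no motion of their own are discarded to avoid constant phantom braking, which is exactly how a parked fire truck can be missed:

```python
MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def should_brake_for(own_speed_mps: float, object_speed_mps: float,
                     ignore_static_above_mph: float = 40.0) -> bool:
    """Toy stationary-object filter: above the speed threshold, objects
    that are not moving (overpasses, roadside signs -- but also a parked
    emergency vehicle) are ignored to avoid constant phantom braking."""
    threshold_mps = ignore_static_above_mph * MPH_TO_MPS
    is_static = abs(object_speed_mps) < 0.5  # essentially not moving
    if own_speed_mps > threshold_mps and is_static:
        return False  # filtered out: the dangerous failure mode
    return True

# At 65 mph, a parked emergency vehicle (0 m/s) is filtered out;
# at 30 mph, the same stationary object would trigger braking.
fast_case = should_brake_for(65 * MPH_TO_MPS, 0.0)
slow_case = should_brake_for(30 * MPH_TO_MPS, 0.0)
```

The filter trades false positives (braking for every overpass) against false negatives (missing a parked fire truck), which is the "silly errors" failure mode the quote describes.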

Gary Marcus, founder and CEO of Robust.AI, has often voiced concerns about the hype Tesla creates around its Autopilot system.

Tesla’s Autopilot depends heavily on sensors such as cameras, combined with AI algorithms, to infer the car’s surroundings, including the locations and speeds of other vehicles. A large part of the autonomous driving community also questions whether self-driving capability can be achieved solely with data-driven algorithms. Chris Urmson, head of the self-driving startup Aurora, said that his company combines AI with other technologies to build systems that can apply general rules to novel situations, as a human would. On the timeline for self-driving in the near future, he notes:

We’re going to see self-driving vehicles on the road doing useful things in the next couple of years, but for it to become ubiquitous will take time.
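The hybrid approach Urmson alludes to, combining learned components with explicit rules, can be sketched in a few lines. This is a speculative illustration of the general idea, not Aurora’s architecture; every name here is invented:

```python
def learned_policy(obs: dict) -> dict:
    """Stand-in for a data-driven model's suggested action."""
    return {"action": "maintain_speed"}

# General, hand-written rules that apply even to novel scenes.
SAFETY_RULES = [
    (lambda obs: obs.get("emergency_lights"), {"action": "slow_and_yield"}),
    (lambda obs: obs.get("red_light"),        {"action": "stop"}),
]

def hybrid_decision(obs: dict) -> dict:
    """Hybrid planner sketch: explicit rules can veto the learned policy,
    the kind of AI-plus-engineering mix described above."""
    for condition, override in SAFETY_RULES:
        if condition(obs):
            return override
    return learned_policy(obs)

decision = hybrid_decision({"emergency_lights": True})
```

The design choice is that the rules need no training data for rare scenes like a parked fire truck with lights flashing; the learned policy handles the ordinary driving in between.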

Melanie Mitchell, a computer scientist and professor of complexity at the Santa Fe Institute, notes in her article, Why AI is Harder Than We Think, that AI alone is not sufficient for self-driving cars:

Getting to autonomous vehicles the old-fashioned way, with tried-and-true systems engineering, would still mean spending huge sums outfitting our roads with transponders and sensors to guide and correct the robot cars. And they would remain limited to certain areas, and certain weather conditions—with human teleoperators on standby should things go wrong.

Our Perspective

Autonomous driving has progressed significantly in the last decade, but the general consensus is that there is still a long road ahead to a fully automated self-driving car that requires no human driver behind the wheel. We refer readers to Rodney Brooks’s (CTO and co-founder of Robust.AI) blog post, Predictions Scorecard, 2022 January 01, where he makes annual predictions about self-driving cars.

Regarding the current investigations into Tesla’s Autopilot system, we believe it is important that they be thorough and yield actionable insights that can, hopefully, improve current autonomous driving systems. We hope these investigations can answer the following questions:

  • How does Autopilot detect emergency vehicles, flashing lights, and road cones put in place for temporary road and traffic maintenance?
  • Most of these crashes evidently happened in the dark. Has this apparent flaw been fixed in recent Autopilot updates?
  • Going forward, can we anticipate and fix similar problems in self-driving systems?

It is inevitable that the near future will bring many more driver-assist programs with varying levels of autonomy and features. For self-driving cars to coexist with human drivers on shared roads and in shared conditions, it is important for regulatory bodies like NHTSA to develop guidelines for evaluating and testing upcoming self-driving systems. Moreover, these guidelines should be made with input from experts in industry and academia.

Simultaneously, while these systems are in their nascent stages, it is important to ensure that companies inform consumers of the limitations of existing assisted driving programs. Customers should also be held accountable if they do not adhere to company guidelines. We believe that if all the stakeholders (regulatory bodies, self-driving companies, domain experts, and consumers) play their parts, these investigations will only change things for the better.


Autonomous driving is one of the most anticipated technologies of this century. From academic institutions to industry labs, researchers, practitioners, and executives are continuously improving the perception, prediction, and planning capabilities of autonomous vehicles by testing their software in real-world environments. Despite the progress made, there is still much room for improvement, as the greatest challenge is to anticipate the long tail of real-world situations and react accordingly. Companies working on autonomous driving should conduct more thorough testing and accurately inform consumers about the limitations of such systems. Otherwise, it will take a long time and many more accidents before self-driving cars are considered acceptably safe.
