Federal Investigation Reveals Safety Concerns with Tesla’s Autopilot Technology

Federal authorities have found a “critical safety gap” in Tesla’s Autopilot system that contributed to at least 467 crashes, 13 of them fatal and “many others” resulting in serious injuries.

The National Highway Traffic Safety Administration (NHTSA) analyzed 956 crashes involving Tesla Autopilot. The results of the nearly three-year investigation were published on Friday.

According to the NHTSA report, Autopilot’s design permits foreseeable misuse and avoidable crashes because it does not adequately ensure that drivers stay attentive and use the system as intended.

The agency is opening a new investigation into the effectiveness of a software update Tesla issued in December as part of a recall. That update was supposed to fix the Autopilot problems NHTSA identified during its investigation.


The recall, delivered as an over-the-air software update, covered 2 million Tesla vehicles in the U.S. and was intended to improve driver monitoring in Teslas equipped with Autopilot.

In Friday’s report, NHTSA suggested the software update may not have been sufficient, as Autopilot-related crashes continue to occur.

In one recent case, a Tesla driver in Snohomish County, Washington, struck and killed a motorcyclist on April 19. The driver told police he was using Autopilot at the time of the crash.

NHTSA Findings and Future Directions

The NHTSA findings are the latest in a series of reports questioning the safety of Tesla’s Autopilot technology, which Tesla has promoted as a key feature that sets it apart from other car companies.


On its website, Tesla says Autopilot is designed to reduce the driver’s workload through advanced cruise control and automatic steering. The company has not responded to the NHTSA report and did not reply to requests for comment.

After the NHTSA report was released, Senators Edward J. Markey and Richard Blumenthal issued a statement urging federal regulators to require Tesla to restrict Autopilot to designated roads.

Tesla’s Owner’s Manual, published on its website, warns against using Autopilot in areas where bicyclists or pedestrians may be present. The senators urged the agency to take steps to prevent Tesla vehicles from putting lives at risk.

Earlier this month, Tesla settled a lawsuit brought by the family of Walter Huang, an Apple engineer who died in a crash while using Autopilot; the terms of the settlement were not disclosed. Despite these events, Tesla and CEO Elon Musk have made clear that autonomous driving remains a central focus.


During Tesla’s earnings call, Musk said the company is committed to solving autonomy and confident it will do so.

For years, Musk has promised that Tesla cars would become fully self-driving through a software update. So far, however, Tesla’s cars offer only driver-assistance features, not full self-driving.

Musk has also made safety claims about Tesla’s driver assistance systems without allowing outside experts to review the company’s data.

Philip Koopman, a safety researcher at Carnegie Mellon University, thinks Tesla’s marketing is misleading. He hopes Tesla takes the NHTSA’s concerns seriously.

Koopman says people are dying because they place too much trust in Autopilot. He suggests Tesla could improve safety with straightforward changes, such as using map data to restrict where Autopilot can be engaged, or strengthening checks that drivers are paying attention while it is in use.
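To make the idea concrete, here is a minimal, hypothetical Python sketch of the kind of gating Koopman describes: engagement depends on both the road type (from map data) and a driver-attention signal. The road classifications, the `MapSegment` structure, and the `autopilot_allowed` function are invented for illustration only; nothing here reflects Tesla’s actual software or NHTSA’s requirements.

```python
# Hypothetical sketch of a map-based operating-domain check combined
# with driver monitoring. All names and road classifications here are
# invented for illustration; this is not Tesla's implementation.

from dataclasses import dataclass
from enum import Enum, auto


class RoadType(Enum):
    CONTROLLED_ACCESS_HIGHWAY = auto()  # divided highway, no cross traffic
    URBAN_STREET = auto()               # intersections, pedestrians, cyclists
    RESIDENTIAL = auto()


@dataclass
class MapSegment:
    road_type: RoadType
    speed_limit_mph: int


# Assumed policy: only permit the driver-assistance system on
# controlled-access highways, the environment it is designed for.
ALLOWED_ROAD_TYPES = {RoadType.CONTROLLED_ACCESS_HIGHWAY}


def autopilot_allowed(segment: MapSegment, driver_attentive: bool) -> bool:
    """Allow engagement only if the current road segment is inside the
    permitted operating domain AND driver monitoring reports the driver
    is paying attention."""
    return segment.road_type in ALLOWED_ROAD_TYPES and driver_attentive


if __name__ == "__main__":
    highway = MapSegment(RoadType.CONTROLLED_ACCESS_HIGHWAY, 65)
    city = MapSegment(RoadType.URBAN_STREET, 30)
    print(autopilot_allowed(highway, driver_attentive=True))   # True
    print(autopilot_allowed(city, driver_attentive=True))      # False
    print(autopilot_allowed(highway, driver_attentive=False))  # False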

Sajda Parveen
Sajda Parveen is a market expert. She has over six years of experience in the field and shares her expertise with readers. You can reach her at [email protected]