Tesla Faces Renewed Scrutiny Over Autopilot Functionality, Two Senators Want It Restricted

By Dabbie Davis

Apr 27, 2024 12:16 AM EDT

Man driving a Tesla (Photo: Pexels/Borys Zaitsev)

Federal regulators are investigating whether Tesla's December recall effectively addressed Autopilot's safety problems. In response to mounting concerns, two senators are advocating restrictions on Autopilot use, particularly on roads the system was not designed for. A two-year inquiry into Tesla's Autopilot identified hundreds of crashes, including dozens of fatalities, prompting heightened scrutiny from regulators.

Two Senators Call for Restrictions on Autopilot Use

US News & World Report brought attention to a significant development addressing Autopilot safety issues surrounding Tesla, citing an update from Reuters. Senators Ed Markey and Richard Blumenthal, both Democrats, have come forth to support restrictions on Autopilot use. They contend that the National Highway Traffic Safety Administration ought to step in and order Tesla to restrict the use of Autopilot to the roads for which it was created.

Markey and Blumenthal stressed the urgency of preventing Tesla vehicles equipped with Autopilot from endangering public safety. At the same time, the safety agency announced it had opened an investigation into whether Tesla's December recall, intended to strengthen Autopilot's safeguards, was effective.

Investigation Shows Dozens of Deaths, Hundreds of Crashes

According to Wired, a federal investigation found that drivers misusing Tesla's Autopilot technology were involved in at least 13 fatal crashes, incidents the company may have been able to prevent or at least anticipate. The report described Tesla as an "industry outlier" for lacking fundamental safety measures found in competitors' systems. Authorities are now examining how well a new update to Autopilot addresses these issues.

The National Highway Traffic Safety Administration (NHTSA), the main US agency responsible for road safety, reported that 14 people died and 49 were injured in these collisions. Government engineers scrutinized 109 "frontal plane" crashes and noted that in half of them, the hazard was visible five seconds or more before impact, suggesting that an attentive driver could have avoided the collision.

One such tragic incident, in March 2023, involved a Model Y striking a teenager exiting a school bus in North Carolina, causing severe injuries. The NHTSA concluded that an attentive driver could have avoided the crash or reduced its severity, and it found a similar pattern throughout its investigation: avoidable accidents involving hazards that a vigilant driver could have perceived.


According to The Verge, the investigation tied hundreds of crashes, including multiple injuries and fatalities, to a recurring pattern of driver inattention combined with the limitations of Tesla's technology. The NHTSA found that when drivers used Autopilot or its more advanced version, Full Self-Driving, the system failed to keep their attention on the road at all times.

The NHTSA reviewed 956 crashes that occurred between January 2018 and August 2023 and resulted in 29 fatalities. Of these, 211 involved the frontal plane of the Tesla colliding with an obstacle or another vehicle, often with severe outcomes: 14 fatalities and 49 injuries were recorded in these crashes, underscoring the gravity of the safety concerns surrounding Autopilot.

According to documents dated Thursday and reported by The Washington Post, the NHTSA provided a comprehensive overview of the risks associated with Tesla's driver-assistance feature, citing instances where the system's controls and warnings proved insufficient. The agency is now reassessing whether the corrective measures Tesla implemented in its December recall of 2 million vehicles are working.

Investigators identified 20 crashes involving vehicles that had already received the recall's software update. The recall had been initiated after an NHTSA inquiry into whether Autopilot's safeguards adequately kept drivers alert while the system was engaged.

In response to the recall, Tesla addressed the concerns through software updates intended to remind drivers to remain attentive while using the automated driving system. However, it opted not to restrict where the system can operate, a decision contested by experts who argued that limiting Autopilot's operational scope would have been the more effective fix.

