For New Yorkers, bustling roads and heavy traffic underscore the importance of roadway safety. With the rise of self-driving technologies, questions around Tesla’s Autopilot and Full Self-Driving (FSD) systems have become critical.
At William Mattar Law Offices, our experienced self-driving car accident lawyers are keeping on top of emerging data about Tesla’s self-driving technologies. Our legal team understands the implications of the ongoing NHTSA investigation and stays abreast of new findings from research on driver behavior and perception using Tesla’s Autopilot and FSD Beta systems.
With New York’s driver liability laws in mind, if you’ve been in a crash involving an autonomous vehicle, it’s important to work with a car accident attorney who knows the potential risks of these technologies. Our legal team is staying current in a world where automation is rapidly advancing.
Table of contents
- Tesla Autonomous Vehicle Recalls and Investigations
- Tesla’s 2023 Autopilot Recall Remedy Called into Question
- Challenges with Tesla’s Driver Monitoring System
- Behavior Changes and Safety Risks with Tesla’s FSD Beta: Research Insights
- Comparing Tesla to Other Vehicles: Crash Rates, Death Rates, and Safety Ratings
- The Balance Between Automation and Human Oversight
- Assessing Tesla’s Safety in Context
- Liability in Self-Driving Incidents Under New York Law
- Contact William Mattar Law Offices After a Car Accident with a Tesla Self-Driving Vehicle
Tesla Autonomous Vehicle Recalls and Investigations
In recent years, Tesla has issued several recalls related to its self-driving and driver-assistance technologies.
October 2021
Tesla recalled 11,728 vehicles due to a communication error introduced by FSD Beta software version 10.3. The error could trigger false forward-collision warnings or unexpected activations of the automatic emergency braking system. Tesla filed the recall with the National Highway Traffic Safety Administration (NHTSA).
February 2022
Tesla recalled nearly 54,000 vehicles to disable the "rolling stop" feature in the FSD Beta, which allowed vehicles to roll through stop signs at up to 5.6 mph without coming to a complete stop. The NHTSA advised Tesla that failing to stop at a stop sign can increase the risk of a crash, prompting the recall.
February 2023
Tesla issued a recall for 362,758 vehicles equipped with the FSD Beta software. The NHTSA identified specific traffic situations in which the system's behavior could increase the risk of a crash, including traveling or turning through intersections on a "stale yellow" traffic light and failing to come to a complete stop at stop signs. Tesla voluntarily agreed to a recall to address these issues.
December 2023
Following a two-year NHTSA investigation, Tesla recalled 2,031,220 vehicles equipped with any version of Autosteer, a component of the Autopilot system. The NHTSA concluded that Autosteer's controls were insufficient to prevent misuse and did not ensure that drivers maintained continuous and sustained responsibility for vehicle operation. In response, Tesla deployed an over-the-air software update intended to address these concerns and enhance safety.
October 2024 Investigation
The NHTSA initiated an investigation into Tesla’s FSD system following reports of four crashes in which Tesla vehicles in FSD mode reportedly failed to respond appropriately to poor visibility caused by factors like sun glare, fog, or airborne dust. One of the crashes killed a pedestrian, and another resulted in injury.
The investigation covers an estimated 2.4 million Tesla vehicles across various models, including the Model S, Model X, Model 3, Model Y, and the newly released Cybertruck, all equipped with the FSD system. The NHTSA aims to assess whether Tesla’s FSD system can reliably detect and respond to visibility impairments, determine if other similar crashes have occurred under reduced visibility, and identify any updates to FSD that may affect performance in these conditions.
Tesla’s 2023 Autopilot Recall Remedy Called into Question
Tesla’s Autopilot, a Level 2 driving assistance system, was designed primarily for highways but has been used increasingly on urban roads. The December 2023 recall implemented a software update to increase driver alerts, aiming to keep drivers engaged and ready to take control if needed.
Despite these updates, 20 new crashes involving Teslas on Autopilot were reported after the remedy was deployed, leading the NHTSA to question the recall’s effectiveness. Tesla’s Autopilot monitoring system relies primarily on detecting torque on the steering wheel, a measure critics say fails to accurately gauge driver engagement.
NHTSA is evaluating whether Tesla’s remedy should have included more robust safety features to ensure that drivers remain attentive, particularly in urban settings where unexpected situations demand prompt human intervention.
Challenges with Tesla’s Driver Monitoring System
Tesla’s driver monitoring approach, which relies on detecting torque on the steering wheel, has been widely criticized for its limitations. Many experts argue that this measure is insufficient because drivers can circumvent it, for example by hanging weights on the wheel to simulate hand pressure. Researchers have also noted that Tesla’s monitoring system lacks night-vision capabilities and does not detect whether drivers are looking away from the road.
The NHTSA is investigating whether Tesla’s updates should have included more stringent monitoring methods, such as using camera-based systems to track eye movement and ensure that drivers are watching the road. Without such measures, the potential for dangerous behavior, such as using the system on roads not designed for it, remains high.
Behavior Changes and Safety Risks with Tesla’s FSD Beta: Research Insights
Recent research provides a deeper look into driver behavior when using Tesla’s Autopilot and FSD Beta systems, revealing critical insights into both complacency and increased risk-taking behaviors. The study involved interviews with 103 Tesla owners using either standard Autopilot or FSD Beta, exploring how prolonged exposure to these technologies affects driver awareness, engagement, and actions behind the wheel.
The findings indicate a range of potentially dangerous behaviors among Tesla drivers over time:
Increased Driver Complacency
As drivers grow accustomed to Autopilot, they are more likely to engage in risky behaviors like hands-free driving or even mind-wandering. This reliance on Autopilot can lead to reduced attentiveness, increasing the risk of accidents when unexpected situations arise.
Stress and Workload in FSD Beta Users
FSD Beta, an evolving technology that allows Teslas to navigate non-highway roads, requires constant driver supervision due to its unpredictability. The study found that drivers using FSD Beta experience higher stress and mental workload, as they must remain ready to correct potentially unsafe maneuvers by the system. This increased vigilance can lead to mental fatigue, which undermines the potential safety benefits of automation.
Adaptation to Automation Over Time
Over time, drivers tend to develop a more relaxed posture and are less attentive to the system's actions. While initially cautious, users often become over-reliant on Autopilot and FSD, especially in controlled settings like highways, where the technology performs relatively smoothly. This adaptation may contribute to the “out-of-the-loop” phenomenon, where drivers lose situational awareness and become slower to react when the system fails.
The study suggests that more advanced monitoring could be critical to prevent misuse and maintain driver engagement.
Comparing Tesla to Other Vehicles: Crash Rates, Death Rates, and Safety Ratings
As Tesla’s Autopilot and FSD Beta technologies evolve, they raise critical questions about the safety of Tesla vehicles relative to traditional, non-automated cars. Questions about crash rates, death rates, and safety ratings are especially pertinent in light of ongoing NHTSA safety investigations and recent studies of driver behavior with Tesla’s driver-assistance systems.
Crash Rates: How Does Tesla Compare?
Tesla frequently touts its self-driving systems as reducing accident rates compared to traditional vehicles. In Tesla’s Vehicle Safety Report, the company reported that vehicles with Autopilot engaged had a crash rate of one per 6.26 million miles driven, compared to one per 1.71 million miles driven for Teslas without Autopilot and one per 484,000 miles for average U.S. vehicles.
While these figures suggest that Autopilot may reduce the likelihood of a crash, they are based on Tesla’s internal data, which may lack the objectivity and granularity that independent studies could provide.
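To put these figures on a common footing, the short sketch below (in Python, purely for illustration) converts the miles-per-crash numbers quoted above into crashes per million miles and compares each to the U.S. average. The numbers are the ones Tesla reported; the labels and normalization are our own and do not account for differences in road type, vehicle age, or driving conditions.

```python
# Illustrative comparison of the crash-rate figures quoted above
# (from Tesla's Vehicle Safety Report as cited in this article).

miles_per_crash = {
    "Tesla with Autopilot engaged": 6_260_000,
    "Tesla without Autopilot": 1_710_000,
    "Average U.S. vehicle": 484_000,
}

baseline = miles_per_crash["Average U.S. vehicle"]

for label, miles in miles_per_crash.items():
    # Normalize to crashes per million miles driven.
    crashes_per_million_miles = 1_000_000 / miles
    # How many times farther between crashes than the U.S. average.
    miles_ratio = miles / baseline
    print(
        f"{label}: {crashes_per_million_miles:.2f} crashes per million miles, "
        f"about {miles_ratio:.1f}x the average distance between crashes"
    )
```

By this rough arithmetic, Tesla’s figures imply roughly 13 times as many miles between crashes with Autopilot engaged as for the average U.S. vehicle. But raw mileage comparisons like this do not control for where those miles are driven; Autopilot is engaged mostly on highways, which generally see fewer crashes per mile than city streets, so the comparison should be read with the caveats noted above.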
However, NHTSA’s crash data offers a more cautionary perspective. According to NHTSA reports, Tesla vehicles with self-driving features have been involved in significantly more crashes than similar vehicles with traditional driving systems, possibly due to the high number of Teslas on the road equipped with these technologies. Factors like driver behavior and overreliance on Autopilot may also contribute to these figures.
For instance, Tesla vehicles were found to be disproportionately involved in crashes when using Autopilot compared to other semi-autonomous systems on the market. Given the variability of self-reported versus independent crash data, more comprehensive analysis may be needed to understand Tesla’s true crash rate relative to comparable cars.
Safety Ratings: Are Tesla Vehicles Safer?
Tesla’s vehicles often score well in traditional safety ratings focused on crash resistance, occupant safety, and survivability. For example, the Model 3 and Model Y received top marks from IIHS and Euro NCAP for crashworthiness, emergency braking, and collision avoidance systems.
However, these ratings largely assess passive safety features—such as airbags, crumple zones, and structural durability—rather than the active safety capabilities related to Tesla’s automated systems.
While the company continues to market Autopilot and FSD systems as enhancing safety, NHTSA and other safety agencies have noted that without full driver engagement, these systems can create new risks by allowing drivers to disengage from active control, sometimes with fatal consequences.
Despite high crash resistance and driver protection ratings, Tesla’s safety scores in terms of automated driving reliability remain mixed. Drivers must be prepared to intervene immediately, as the system cannot yet manage unexpected roadway challenges with full autonomy.
The Balance Between Automation and Human Oversight
Tesla’s vehicles are often compared to others with advanced driver-assistance systems, such as General Motors’ Super Cruise and Ford’s BlueCruise. These systems, however, differ in their approach to driver monitoring and road suitability.
For instance, GM’s Super Cruise incorporates a camera-based monitoring system that tracks the driver’s eyes to ensure constant attention, a feature that experts argue may better safeguard against distraction compared to Tesla’s torque-based steering wheel sensors.
Despite Tesla’s claims that Autopilot-equipped vehicles reduce crash rates, these comparisons suggest that other manufacturers may prioritize more stringent driver monitoring, potentially leading to fewer distraction-related accidents.
Ford, for example, restricts BlueCruise operation to highways only, while Tesla allows its systems to engage in more varied and often complex environments, increasing the potential for misuse or overreliance.
Assessing Tesla’s Safety in Context
Tesla’s self-driving technologies have introduced new capabilities and challenges, making direct comparisons with traditional vehicles difficult. Ultimately, Tesla drivers in New York and beyond should approach these systems with caution, understanding that while automation may offer convenience, it does not replace the need for full driver engagement.
New Yorkers sharing the road with self-driving vehicles need to be aware of these vehicles’ limitations to make informed decisions and prioritize safety. For those involved in crashes with automated vehicles, thorough documentation and legal guidance can be essential for navigating the unique complexities of liability in these incidents.
Liability in Self-Driving Incidents Under New York Law
New York’s Vehicle and Traffic Law holds drivers fully accountable for their vehicle’s operation, even when using driver-assist technologies. This means that if an accident occurs while Autopilot or FSD is engaged, the driver can be held liable for any resulting damages. Given the limitations of Tesla’s technology and the potential for driver complacency, those injured in such incidents may have grounds to seek legal recourse.
For New Yorkers injured in Tesla-related crashes, documenting the incident thoroughly and understanding how the technology may have contributed to the accident are critical steps. A self-driving car accident attorney can evaluate whether Tesla’s system functioned as intended and help determine liability in cases involving automated vehicle features.
Contact William Mattar Law Offices After a Car Accident with a Tesla Self-Driving Vehicle
William Mattar Law Offices offers guidance and representation for those involved in a Tesla-related crash. Our attorneys help individuals navigate the legal and technical complexities associated with self-driving accidents. We understand the evolving landscape of automation and the potential for further regulation.
New York drivers should approach these technologies with caution, remembering that attentiveness is still the most effective safety feature.
Call our main office in Williamsville, NY, at 716-444-4444 or contact us online anytime, 24/7. We also have offices in Buffalo, Syracuse, and Rochester, and our personal injury lawyers serve clients throughout New York, including Albany, Long Island, Binghamton, Watertown, Plattsburgh, and New York City.