A car equipped with driver-assistance technology rear-ends a delivery truck on the Long Island Expressway. The driver claims the system failed to brake. Vehicle data logs show the driver received three visual warnings and never touched the wheel for forty-two seconds before impact.
The insurance company denies the claim, pointing to the vehicle manufacturer. The manufacturer's legal team argues the driver ignored multiple alerts. Meanwhile, the injured delivery driver sits in a hospital bed, wondering who pays for spinal surgery.
These cases are far more complex than traditional motor vehicle crashes. When determining who is at fault in a self-driving car accident in New York, liability may rest with the human operator, the vehicle manufacturer, the software developer, the maintenance provider, or multiple parties simultaneously.
Key Takeaways for Fault in a Self-Driving Car Accident
- New York currently requires licensed safety drivers for autonomous vehicle testing, meaning a human operator remains legally responsible for the vehicle's operation even when driver-assistance features are active
- Product liability claims against manufacturers require proving a design defect, manufacturing defect, or failure to warn
- New York's comparative fault rules allow responsibility for a single crash to be apportioned among multiple parties, but depending on the type of damages and the claims involved, one defendant may still be required to pay more than its percentage share
- New York's no-fault insurance system still applies to self-driving car accidents, covering medical expenses and lost wages regardless of fault up to policy limits, but serious injuries may allow claims beyond no-fault coverage
- Preservation of vehicle data logs, software version information, sensor calibration records, and over-the-air update history should begin promptly after a crash
Autonomous Vehicle Levels and Legal Responsibility in New York
The term "self-driving car" covers a wide range of technology, from basic cruise control to fully autonomous vehicles that require no human input. The Society of Automotive Engineers defines six levels of vehicle automation, and these levels directly affect who bears legal responsibility when a crash occurs.
What Do "Levels" of Automation Actually Mean?
Level 0 through Level 2 systems require constant human supervision. Tesla Autopilot, General Motors Super Cruise, and similar features may fall into Level 2—the vehicle can control steering and speed simultaneously, but the human driver must remain engaged and ready to take over at any moment. Level 3 systems allow the human to disengage under certain conditions, but the vehicle will request human takeover when needed. Level 4 systems operate without human intervention in defined geographic areas. Level 5 represents full automation under all conditions.
Why New York's Safety Driver Requirement Matters for Your Case
New York currently requires a licensed safety driver with hands ready to take control for any autonomous vehicle testing on public roads, regardless of the technology's capability level. In New York, the driver of a vehicle equipped with Level 2 technology remains the operator of the vehicle under the Vehicle and Traffic Law, which means they typically bear primary responsibility for safe operation. The question becomes whether that driver acted negligently, whether the vehicle's technology failed, or whether both factors contributed to the collision. Ultimately, it is the human operator who can override autonomous operation to ensure safe passage.
When the Human Driver Bears Fault for AV Crashes
New York law holds vehicle operators responsible for maintaining control and following traffic rules. When driver-assistance technology is active, the human behind the wheel remains legally obligated to supervise the system and intervene when necessary.
Liability often falls primarily on the human operator when they:
- Ignore system warnings or alerts — Event data recorders capture whether the driver received visual or audible warnings about system limitations or the need to take control
- Use features outside their intended operational design domain — Engaging Autopilot on a winding rural road clearly unsuitable for the technology, then looking down at a phone while the vehicle drifts into oncoming traffic
- Fail to maintain attention as required by the vehicle's manual — Not keeping hands on the wheel or eyes on the road despite clear owner's manual instructions
- Fail to respond when the system requests human takeover — Ignoring alerts that the driver must resume control immediately
Vehicle owners also face potential vicarious liability under New York Vehicle and Traffic Law § 388, which holds registered owners responsible for injuries caused by their vehicle's operation with their permission.
Autonomous Car Product Liability and Manufacturer Responsibility
When autonomous vehicle technology fails, injured parties may pursue product liability claims against the manufacturer. New York recognizes strict product liability, which generally takes one of three forms.
Design Defects
A design defect claim alleges that the vehicle's autonomous system was inherently dangerous as designed. If LiDAR sensors consistently fail to detect pedestrians in crosswalks under certain lighting conditions, and the manufacturer knew or should have known about this limitation, the design itself becomes the basis for liability.
Manufacturing Defects
A manufacturing defect claim involves problems in how a specific vehicle was built, even if the overall design was sound. A sensor installed incorrectly at the factory, a camera lens with a manufacturing flaw, or improper calibration during assembly could constitute manufacturing defects that cause crashes.
Failure to Warn
A failure-to-warn claim alleges that the manufacturer did not adequately inform users about the system's limitations. If marketing materials suggest full autonomy while a warning against using the feature in heavy traffic is buried in the owner's manual's fine print, the disconnect between marketing promises and disclosed limitations may support liability.
Multiple Parties and Shared Liability for Self-Driving Car Accidents
Self-driving car accidents often involve more parties than traditional crashes. Beyond the driver and manufacturer, liability may extend to:
- Software developers — Third-party companies that license autonomous driving software to multiple automakers; if that software makes a catastrophic decision, causing crashes across different vehicle brands, the developer becomes a distinct defendant
- Sensor suppliers — Component manufacturers whose LiDAR, radar, or camera systems failed at the moment of impact
- Fleet operators — Companies running autonomous ride-hailing services; fleet operators testing autonomous vehicles in New York must carry at least $5 million in insurance coverage under New York DMV requirements
- Maintenance providers — Repair shops that performed windshield replacement, bodywork, or other service but failed to properly recalibrate forward-collision cameras or other sensors
New York's comparative fault framework allows juries to apportion responsibility among multiple defendants, but each defendant does not always pay strictly in proportion to its percentage of fault. New York law can impose broader responsibility in motor vehicle cases and treats economic and non-economic losses differently. For example, a defendant found 40 percent at fault may still be required to pay a plaintiff's full economic damages if the other responsible parties cannot pay.
Evidence Preservation and Investigation
The first hours after a self-driving car accident determine what evidence survives for future litigation. Vehicle data logs, sensor footage, and software state information exist in temporary storage that may be overwritten by subsequent driving or system updates.
Critical evidence after a self-driving vehicle crash includes:
- Vehicle data logs — Event data recorders must be downloaded before the vehicle undergoes repairs; this information may be overwritten or lost if not secured immediately
- Sensor footage — Camera, radar, and LiDAR recordings from the seconds before impact should be extracted and preserved before system updates or normal operation overwrites them
- Software version logs — Documentation of what code was running at the time of the collision, including any recent over-the-air updates that may have altered system behavior
- Sensor calibration records — Documentation from before and after the crash showing whether the vehicle's systems were functioning properly
Preservation letters put opposing parties on notice of their obligation to retain evidence. A legal duty to preserve evidence generally arises when litigation is pending or reasonably foreseeable, but the scope of that duty is fact-specific. New York autonomous vehicle testing permits also require companies to maintain detailed records and report crashes to the DMV.
New York No-Fault Insurance and Autonomous Vehicles
New York's no-fault insurance system applies to crashes involving autonomous and semi-autonomous vehicles just as it does to traditional accidents. No-fault benefits cover the first $50,000 in medical expenses and lost earnings for anyone injured in a motor vehicle accident in New York, regardless of who caused the crash.
The serious injury threshold under New York Insurance Law § 5102(d) determines when injured parties can step outside the no-fault system to sue for pain and suffering and other damages.
When injuries meet this threshold, victims who can prove manufacturer liability, driver negligence, or shared fault among multiple parties may recover damages far exceeding no-fault benefits, including compensation for pain and suffering, future medical needs, permanent disability, and lost earning capacity.
Determining Fault in Common Self-Driving Car Accident Scenarios
Different crash types reveal different liability patterns when autonomous technology is involved. Understanding how fault gets determined in common scenarios can help injured parties recognize what evidence matters most in their cases.
Rear-End Collisions With Adaptive Cruise Control
Vehicles equipped with adaptive cruise control or automatic emergency braking should maintain safe following distances and brake automatically when detecting stopped vehicles ahead. Data logs showing the distance to the vehicle ahead, the rate of closure, whether automatic braking is engaged, and whether the driver received alerts all factor into fault determination.
If the system issued multiple warnings that the driver ignored, human fault may be primary. If sensors failed to detect the stopped vehicle despite clear conditions, the manufacturer may bear liability.
Lane Departure Crashes Despite Active Lane-Keeping
Lane departure accidents, where a vehicle drifts out of its lane despite lane-keeping assistance being active, may indicate sensor failures, unclear lane markings that the system couldn't read, or driver inattention that prevented timely correction.
Whether the system issued takeover requests and whether the driver responded becomes central to liability.
Intersection Crashes and Autonomous Navigation Limits
Intersection collisions involving vehicles attempting autonomous navigation through complex traffic situations often reveal limitations in how current technology handles unpredictable human behavior.
If a vehicle with autonomous features proceeds into an intersection despite an oncoming vehicle running a red light, fault may depend on whether the system should have detected the approaching vehicle and whether a human driver would have made the same decision.
Pedestrian and Cyclist Detection Failures
Accidents involving pedestrians and cyclists raise serious liability questions when vehicles equipped with detection systems fail to recognize vulnerable road users.
Proving that sensors malfunctioned or that software made incorrect predictions about pedestrian movement often requires detailed analysis of system performance and comparison to industry standards.
FAQ for Fault in a Self-Driving Car Accident
Can I sue the car manufacturer if Autopilot or driver-assist was on during my crash?
You may be able to pursue a product liability claim against the manufacturer if evidence shows the autonomous system malfunctioned or had a design defect that contributed to your crash. Success requires proving that the technology failed, that the failure caused the collision, and that your injuries resulted from the collision.
Does New York require a human driver to be present in autonomous vehicles?
New York currently requires all autonomous vehicle testing on public roads to include a licensed safety driver with hands ready to assume immediate control. The state has not yet fully authorized driverless vehicle operation on public roads.
What if the other driver claims their car was driving itself and they aren't responsible?
New York law generally holds the vehicle operator responsible for safe operation even when driver-assistance features are active. The other driver's claim that "the car was driving" doesn't eliminate their legal duty to supervise the system and intervene when necessary.
How do insurance companies handle claims when both the driver and the car's technology might be at fault?
Insurance adjusters investigate all potential causes of a crash, including both human error and technology failure. When evidence suggests multiple parties contributed to the collision, New York's comparative fault rules allow fault to be apportioned, though under New York law (including special rules that apply in motor vehicle cases) a defendant's payment is not always strictly proportional to its share of fault.
What happens if the car's software was updated right before the crash?
Over-the-air software updates that alter how autonomous systems behave become critical evidence if a crash occurs shortly after installation. If an update introduced new bugs, changed system parameters in ways that caused unsafe behavior, or disabled safety features, the manufacturer's decision to deploy that update may constitute negligence or create product liability exposure.
Contact William Mattar, P.C. After a Self-Driving Car Accident
Evidence disappears quickly after a self-driving car crash. Data logs get overwritten. Insurance companies assemble defense teams. Manufacturers hire engineers to shift blame. You deserve the same level of investigation and legal experience working for you.
Our team handles crashes involving autonomous and semi-autonomous vehicles throughout New York State, working with technical legal professionals as needed to analyze vehicle data logs, sensor performance, and software behavior to prove fault and build strong claims.
Hurt by an autonomous vehicle? Contact William Mattar, P.C. for a free case evaluation. Phones are answered 24/7.
Past performance does not guarantee future results, including financial results or client satisfaction.