The driver of a Tesla on Autopilot involved in a fatal accident in 2019 will stand trial, charged with two counts of vehicular manslaughter, a judge ruled last Thursday. This is the first felony prosecution in the U.S. over a fatal crash involving a partially automated driving system.
Kevin George Aziz Riad is alleged to have been reckless and negligent when his Tesla Model S exited a freeway and ran a red light at 74 mph, colliding with a Honda Civic at an intersection in Gardena. Both people inside the Honda, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, were killed. Riad and his passenger were hospitalized with non-life-threatening injuries.
Evidence shows that both Tesla’s Autosteer and Traffic-Aware Cruise Control features were active. A Tesla engineer testified that sensor data shows Riad had a hand on the steering wheel but did not apply the brakes in the six minutes before the crash, despite traffic signs warning drivers to slow down near the end of the freeway.
Prosecutors argue that Riad made no attempt to avoid the collision, and the judge denied his attorney’s request to reduce the charges to misdemeanors. The Nieves-Lopez family also contends that Riad was an unsafe driver who, with multiple moving violations on his record, should never have been behind the wheel of the high-performance Tesla.
The National Highway Traffic Safety Administration (NHTSA) issued a statement saying that whether or not a car is using a partially automated driving system, a human driver must always be in control. That said, to what extent is Tesla responsible?
The families of Lopez and Nieves-Lopez are suing Tesla for selling defective vehicles that can accelerate suddenly and lack an effective automatic emergency braking system. If the company is found to have put a dangerous vehicle on the road, Tesla could be held liable. The joint trial is scheduled for mid-2023.
Autopilot-related Tesla crashes have been reported since 2016, and the NHTSA has investigated 26 crashes involving the system. Over the past few years, Tesla has updated its software to make Autopilot harder for drivers to misuse.
In the past, Tesla has had to clarify that the Autopilot system, which can control steering, speed, and braking, cannot drive the car on its own and requires the motorist’s full attention. Despite these statements, Tesla has been accused of misleading consumers by naming the system “Autopilot” in the first place and by marketing newer software as “Full Self-Driving” when the vehicle is not fully autonomous.
Despite the company’s attempts to reduce crashes involving Autopilot, a fatal accident earlier this month in Newport Beach involved a 2022 Tesla Model S that hit a curb and slammed into construction equipment, totaling the car and killing three people. The NHTSA is investigating whether the vehicle was operating on Autopilot at the time.
If you or someone you know was injured in an accident involving partially automated driving technology, contact the skilled personal injury lawyers at Custodio & Dubey LLP. With over 25 years of experience, our lawyers will guide you every step of the way to help you get the justice you deserve.