Tesla faces hundreds of millions of dollars in liability after a crash linked to its Autopilot feature. The case, widely seen as the first nuclear verdict against autonomous vehicle technology, could set legal precedent as manufacturers adopt more automation.
A jury in a Florida federal court found Tesla 33% responsible for a crash involving one of its vehicles and $129 million in resulting damages. On top of the compensatory damages awarded to the two plaintiffs, Tesla was also ordered to pay $200 million in punitive damages.
At the center of the case was Tesla’s Autopilot feature. The Level 2 driver-assistance system is a suite of features, including automatic steering, forward collision warning and automatic emergency braking.
On its website, Tesla states the Autopilot feature does not “make your Tesla vehicle fully autonomous or replace you as the driver.” Owners must agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.”
Despite these disclaimers, Tesla drivers have used Autopilot beyond its capabilities, resulting in numerous crashes. In 2021, the National Highway Traffic Safety Administration launched an investigation into the autonomous system after identifying 11 crashes involving parked emergency vehicles.
Since its release in 2015, Autopilot has faced scrutiny. A notable incident occurred in 2016 when a Model S crashed into a tractor-trailer, killing the driver. Following federal investigations and concerns from lawmakers, Tesla had avoided legal responsibility until now.
Crash involving Autopilot
The case stems from a 2019 crash in Key Largo, Fla., that killed one person and left another seriously injured.
A Tesla going eastbound on Card Sound Road was approaching a three-way T-intersection with County Road 905. Card Sound Road ends at the intersection. Both a stop sign and a hanging traffic light sit at the intersection.
Despite the signage, the driver of the Tesla went through the intersection without stopping and crashed into a Chevrolet Tahoe legally parked on the east shoulder of CR905, directly across from the end of Card Sound Road. Both occupants of the Tahoe were standing outside the vehicle at the time. The impact left one dead and the other severely injured.
The lawsuit claims the Tesla detected neither the Tahoe nor the traffic signals while Autopilot was engaged. The vehicle was traveling 70 mph in a 45-mph zone.
Plaintiffs in the case claim the Tesla driver took his eyes off the road to look at his phone as the vehicle was approaching the intersection. They argue he did that because he was relying on Autopilot.
They allege Tesla knew drivers would misuse Autopilot and that the automaker failed to meet reasonable consumer expectations for lane keeping, speed matching, collision avoidance and emergency braking.
Confusing messaging
Although the Tesla driver’s inattention contributed to the crash, attorneys for the plaintiffs argued that the automaker falsely marketed Autopilot as capable of far more than it could actually do.
“For instance, just by terming the name of its product ‘Autopilot,’ Tesla created the false impression and understanding among consumers that its vehicles could drive automatically and autonomously, with reduced driver input or without any driver input,” the complaint states. “This impression, carefully crafted by Tesla to increase its sales and brand identity, was false in that the Autopilot systems are not autonomous and in fact require substantial and constant input from drivers.”
The lawsuit highlights a marketing video showing a Tesla navigating city streets without a driver. However, the video featured a vehicle equipped with special maps not available to regular customers, and that vehicle crashed during filming.
When the video was released in October 2016, Tesla CEO Elon Musk posted that the vehicle “drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot.”
Because of false expectations, the Tesla driver was “far less vigilant and attentive while driving than he otherwise would have (been) and placed far too much reliance on Autopilot to safely navigate and operate the subject vehicle,” the plaintiffs argued.
A jury agreed, finding Tesla placed the 2019 Model S on the market with a defect that was a legal cause of damage to the victims. They assigned one-third of the blame to Tesla and two-thirds to the driver, resulting in $129 million in damages. Tesla was also ordered to pay an additional $200 million in punitive damages.
In a social media post, Musk said Tesla will appeal the verdict.
History of controversy
The nuclear verdict follows years of controversy over Tesla’s Autopilot and Full Self-Driving features.
The first high-profile crash occurred shortly after Autopilot was launched. In May 2016, a 40-year-old former Navy SEAL was killed when his 2015 Tesla Model S collided with a tractor-trailer crossing its path. The National Transportation Safety Board found the probable cause of the crash was “the truck driver’s failure to yield the right of way to the car, combined with the car driver’s inattention due to overreliance on vehicle automation.” The NTSB also cited issues with the technology itself.
“Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer,” the NTSB report states.
Many subsequent crashes involving Autopilot led to investigations. A 2019 NTSB report linked the feature to a different crash, prompting calls for a recall from the Center for Auto Safety.
Two years later, NHTSA opened an investigation into Tesla’s Autopilot feature. That investigation determined that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”
“This mismatch resulted in a critical safety gap between drivers’ expectations of the (Level 2) system’s operating capabilities and the system’s true capabilities,” NHTSA found.
Tesla later expanded its driver-assistance technology into the even more confusingly named Full Self-Driving system.
In November 2020, the Owner-Operator Independent Drivers Association expressed its concerns to NHTSA. The Association raised issues about a lack of oversight for autonomous vehicles.
“(Working in) one of the most heavily regulated industries in America, it is infuriating for truckers to see a complete abdication of safety oversight by federal regulators, including NHTSA,” OOIDA President Todd Spencer stated in the letter. “Truckers must comply with dozens of regulations and can be placed out-of-service for a violation as simple as a burnt-out rear turn signal. Yet the safety of (Tesla’s Full Self-Driving system) is unproven, and NHTSA shows little interest in exerting the appropriate oversight to keep our members safe when sharing the roads with vehicles operating in (Tesla’s Full Self-Driving system).”
In October 2021, a California Tesla driver became the first person to face felony charges for a crash involving autonomous technology. He received two years of probation after pleading no contest to two counts of vehicular manslaughter.
In 2023, Tesla recalled hundreds of thousands of vehicles after its Full Self-Driving software was found to allow unsafe driving maneuvers. Those included traveling straight through an intersection from a turn-only lane, failing to come to a complete stop, and proceeding into an intersection on a steady yellow traffic signal without due caution.
Last year, U.S. senators urged NHTSA to regulate autonomous vehicles. They noted that drivers of partially automated systems like Tesla’s Autopilot often misuse the technology.
Several months later, NHTSA launched an investigation into Tesla’s Full Self-Driving system. That investigation remains open.
A year ago, the Insurance Institute for Highway Safety released a study finding that partially automated driving systems like Autopilot may not prevent crashes. However, that study focused on BMW and Nissan systems, not Tesla’s.
Despite the crashes, lawsuits, investigations and inquiries from lawmakers and stakeholders, Tesla and other automakers had not been found legally liable for their autonomous systems until now. However, the 11th Circuit Court of Appeals could still reverse the precedent-setting verdict.
