A Florida judge, Reid Scott, has ruled that there’s “reasonable evidence” to conclude that Tesla and its CEO Elon Musk knew of defects in Autopilot systems and failed to fix them. Testimony from Tesla engineers and internal documents showed that Musk was “intimately involved” in Tesla’s Autopilot program and “acutely aware” of a sometimes-fatal defect—where Autopilot repeatedly fails to detect cross traffic, Scott wrote.
“Knowing that the Autopilot system had previously failed, had limitations” and, according to one Tesla Autopilot systems engineer, “had not been modified, Tesla still permitted the ‘Autopilot’ system to be engaged on roads that encountered areas of cross traffic,” Scott wrote.
Because a jury could perhaps consider that a “conscious disregard or indifference to the life” of Tesla drivers, Scott granted a motion to seek punitive damages filed by Kim Banner, whose husband Jeremy was killed in 2019 when his “Model 3 drove under the trailer of an 18-wheeler big rig truck that had turned onto the road, shearing off the Tesla’s roof,” Reuters reported. Autopilot allegedly failed to warn Jeremy or respond in any way that could have avoided the collision, such as braking or steering the vehicle out of danger, Banner’s complaint said.
Seemingly most worrying to the judge, Tesla engineers told Scott that following Banner’s death in 2019 and the “eerily similar” death of another Florida driver, Joshua Brown, in 2016, the automaker did nothing to intervene or update Autopilot’s cross-traffic detection system. Tesla continues to market the Autopilot feature as safe for drivers today, Scott wrote.
“It would [be] reasonable to conclude that [Tesla] dismissed the information it had available in favor of its marketing campaign for the purpose of selling vehicles under the label of being autonomous,” Scott wrote.
Scott noted that Tesla’s marketing of Autopilot is “important” in this case, pointing to a 2016 video that’s still on Tesla’s website, where Tesla claimed “the car is driving itself.” The video, Scott wrote, shows the car navigating scenarios “not dissimilar” to the cross-traffic encounter with a truck that killed Banner’s husband.
“Absent from this video is any indication that the video is aspirational or that this technology doesn’t currently exist in the market,” Scott wrote.
Scott’s ruling is considered a big blow to Tesla, which had denied liability for Banner’s death, arguing that human error is at fault because “its manual and the ‘clickwrap’ agreement sufficiently warned” Tesla owners of Autopilot’s limitations.
The judge wrote that Tesla will be able to make this argument at trial, which has been delayed, and Banner will likely counter that those warnings were inadequate. At this stage, however, a jury presented with the available evidence could reasonably find Tesla liable for both intentional misconduct and gross negligence. Such a verdict could put Tesla on the hook for huge punitive damages, and experts have said it could also enormously damage Tesla’s reputation.
Banner’s attorney, Lake “Trey” Lytal III, told Reuters that they are “extremely proud of this result based in the evidence of punitive conduct.”
Musk, “de facto leader” of Autopilot team
Despite being called the “de facto leader” of the Autopilot team in internal Tesla emails, Musk did not provide a deposition in Banner’s case, which was one of the first times Tesla had to defend itself against claims that Autopilot was fatally flawed, Reuters reported. In August, Tesla also seemingly attempted to do damage control by seeking to “keep deposition transcripts of its employees and other documents secret,” Reuters noted.
Scott’s order has seemingly now made it clear why Tesla may have considered those depositions and documents so damning.
Tesla’s Autopilot feature has been under investigation for years. In 2021, as Tesla began experimentally testing the feature with drivers it determined were safe, the US launched a major probe after 11 Teslas crashed into emergency vehicles. That same year, Consumer Reports found that Autopilot could easily be rigged to work without a driver in the seat, raising more safety concerns.
By the summer of 2022, the National Highway Traffic Safety Administration (NHTSA) had documented 273 crashes involving Teslas using Autopilot systems, and that fall, federal prosecutors opened a criminal investigation into Tesla’s Autopilot claims, “examining whether Tesla misled consumers, investors and regulators by making unsupported claims about its driver assistance technology’s capabilities.”