A Florida jury has found Tesla partly to blame for a fatal 2019 crash involving the company’s Autopilot driver-assistance technology and ordered it to pay $200 million in damages.
The crash occurred when driver George McGee, who was using Tesla’s Autopilot feature, took his eyes off the road and struck a couple, Naibel Benavides Leon and Dillon Angulo. Benavides Leon was killed, and Angulo was seriously injured.
Tesla cars come standard with Autopilot, a driver-assistance system that can steer, brake, and accelerate within a lane, alongside safety features such as collision warnings and automatic emergency braking. Until now, Tesla had largely avoided liability in crashes involving Autopilot.
In this case, however, the jury found that McGee’s trust in Tesla’s Autopilot system made him less attentive to the road, contributing to the crash. It assigned two-thirds of the fault to McGee and one-third to Tesla.
Tesla’s lawyers argued at trial that McGee’s decision to reach for his phone caused the crash and that Autopilot was not to blame.

The plaintiffs, Benavides Leon’s family and Angulo, argued that Tesla’s and Elon Musk’s public portrayal of Autopilot as a very safe system led drivers to overestimate how well it would work. McGee testified that he believed Autopilot would intervene if something went wrong, but that it failed to do so here.
Tesla said it would file an appeal and made the following statement:
Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. The case was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.
The verdict broadly tracks what the National Highway Traffic Safety Administration (NHTSA) found in a 2024 investigation: crashes involving Tesla’s Autopilot were usually caused by inattentive driving rather than defects in the system itself.
However, the same investigation also concluded that Autopilot’s driver monitoring was too lax and did too little to ensure drivers stayed alert, issues that were central to this crash.

Tesla heavily promotes the idea that its cars can safely drive themselves, even though Autopilot is only one piece of the company’s broader suite of automated-driving software and still requires an attentive human driver.
CEO Elon Musk has claimed that Tesla’s Full Self-Driving (FSD) upgrade is safer than a human driver. Tesla’s planned Robotaxi service depends on FSD operating with little or no human supervision, but early tests of the service have produced mixed results.
