
Tesla found partially liable for a deadly 2019 crash

A Florida jury has found Tesla partially liable for a 2019 crash involving the company’s self-driving feature, The Washington Post reports. As a result, the company will have to pay $43 million in compensatory damages and more in punitive damages.

Autopilot comes pre-installed on Tesla vehicles and handles things like collision detection and emergency braking. Tesla has mostly avoided taking responsibility for crashes that occurred while Autopilot was enabled, but the Florida case played out differently. The jury ultimately decided that the self-driving technology enabled George McGee to take his eyes off the road and strike a couple, Naibel Benavides Leon and Dillon Angulo, killing one of them and severely injuring the other.

During the case, Tesla’s lawyers argued that McGee’s decision to take his eyes off the road to reach for his phone was the cause of the crash, and that Autopilot shouldn’t be considered at fault. The plaintiffs, Angulo and the family of Benavides Leon, argued that the way Tesla and Elon Musk talked about the feature ultimately created the illusion that Autopilot was safer than it actually was. “My understanding was that it would help me if I had a failure … or if I made a mistake,” McGee said on the stand. “In this case, I feel it failed me.” The jury ultimately agreed, according to NBC News.

When reached for comment, Tesla said it would appeal the decision and issued the following statement:

Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and deploy life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator (which overrode Autopilot) while searching for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver, from day one, admitted and accepted responsibility.

In a 2024 National Highway Traffic Safety Administration (NHTSA) investigation into Autopilot-related highway incidents, driver misuse of Tesla’s system was blamed, not the system itself. NHTSA also found, however, that Autopilot was overly permissive and “did not ensure that drivers maintained their attention on the driving task,” which lines up with the 2019 Florida crash.

Although Autopilot is only one component of Tesla’s larger self-driving ambitions, selling the idea that the company’s cars can safely drive themselves is a major part of its future. Elon Musk has claimed that Full Self-Driving (FSD), the upgrade to Autopilot, is “safer than human driving.” Tesla’s Robotaxi service is premised on FSD’s ability to operate with minimal or no supervision, which produced mixed results in the first few days the service was available.

Update, August 1, 6:05PM ET: This story was updated after publishing to include Tesla’s statement.


