US jury to rule on crash test lawsuit involving Tesla Autopilot

In what appears to be the first trial over a crash involving Tesla’s Autopilot partially automated driving software, a jury in a California state court began deliberating on Thursday.

The verdict could offer an important indication of the risk Tesla Inc. faces as it tests and deploys its more sophisticated “Full Self-Driving” (FSD) system, which Chief Executive Elon Musk has hailed as essential to the company’s future but which has drawn regulatory and legal scrutiny.

Los Angeles resident Justine Hsu sued the electric-vehicle maker in 2020, claiming that when her Tesla Model S struck a curb while in Autopilot mode, the airbag deployed “so violently that it fractured Plaintiff’s jaw, knocked out teeth, and caused nerve damage to her face.”

Alleging defects in the design of both Autopilot and the airbag, she seeks more than $3 million in damages on those and other claims.

Tesla denies liability for the 2019 accident. In a court filing, it said Hsu used Autopilot on city streets despite a warning in Tesla’s user manual not to do so.

Tesla markets its driver-assist features as Autopilot and Full Self-Driving, but maintains that neither makes its vehicles autonomous and that human drivers must always be “prepared to take over at any moment.”

The EV maker released Autopilot in 2015, and the first fatal crash involving the system in the US was reported in 2016, but that case never went to trial.

Three Tesla engineers testified during the trial, held in Los Angeles Superior Court, which has not been previously reported by other media outlets.
The trial comes at a crucial time for the company, which plans a flurry of additional testing of the semi-automated driving system, which Musk has argued is safer than human drivers, to begin this year.

Anum Arshad, Hsu’s attorney, argued during closing statements on Thursday that one of Tesla’s own expert witnesses conceded Autopilot could not perform as the company claimed.

“Tesla continues to insist that its cars are the safest ones on the road. You only need to use common sense to decide this issue. Justine didn’t fare as well as the car did,” she said.

Michael Carey, an attorney for the automaker, countered that Hsu drove directly into the median despite having ample time to see it.

The evidence of driver distraction, he argued, was plain to see.
Although the verdict will not be legally binding in later cases, experts say the trial will serve as a bellwether, helping Tesla and plaintiffs’ lawyers refine their strategies.

Early cases “give an indication of how later cases are likely to go,” according to Cassandra Burke Robertson, a professor at the Case Western Reserve University School of Law who has examined self-driving car liability.

The National Highway Traffic Safety Administration is investigating the safety of the technology, while the U.S. Justice Department is examining whether Tesla made false statements about its self-driving capabilities.

The central question in Autopilot cases is who is to blame for an accident that occurs while a car is in driver-assist mode: the human driver, the machine, or both. Hsu claims that even though she had her hands on the wheel and was paying attention, the car hit the curb so quickly that she had no chance to avoid it.

According to testimony by a senior engineer, a 2016 video Tesla used to promote its self-driving technology was staged to show capabilities, such as stopping at a red light and accelerating at a green light, that the system did not have.

The information about the video came from a Tesla executive’s deposition in a separate case.

Ashok Elluswamy, the director of Tesla’s Autopilot software, testified about the video last week during the Hsu trial. In her closing remarks, Arshad said Elluswamy also acknowledged that Tesla’s sensors do not always detect when a driver’s hands are on the wheel.

The airbag is a separate issue in the Hsu trial.

The plaintiff’s attorney argued that the airbag should not have deployed under those circumstances, and that it did so with significantly more force than necessary.

According to Bryant Walker Smith, an assistant professor at the University of South Carolina School of Law, a verdict for the plaintiff would likely be more significant than a victory for Tesla, particularly if the jury finds that Tesla misled Hsu.

“All of the actual or alleged issues with Autopilot, from faulty performance to driver distraction to misrepresentation, could become an order of magnitude greater with FSD,” he said. “Thus, consider the Autopilot litigation as a preview of what could come next.”
