A Tesla Supercharger station is seen in a parking lot on September 16, 2024 in Austin, Texas.
Brandon Bell | Getty Images
Tesla is being sued by the family of a driver killed in a 2023 crash, in a case alleging “fraudulent misrepresentations” about the company’s Autopilot technology.
Tesla driver Genesis Giovanni Mendoza Martinez was killed in a crash involving a Model S sedan in Walnut Creek, California. His brother Caleb, who was a passenger at the time, was seriously injured.
The Mendoza family sued Tesla in Contra Costa County in October, but in recent days Tesla had the case moved from state court to federal court in California’s Northern District. The venue change had not been previously reported. Plaintiffs generally face a higher burden of proof for fraud claims in federal court.
In the incident, a 2021 Model S collided with a parked fire truck while the driver was using Autopilot, Tesla’s partially automated driving system.
The Mendoza family’s lawyers argued that Tesla and Musk have, over the years, made exaggerated or false claims about the Autopilot system in an effort to “generate excitement about the company’s vehicles and thereby improve its financial condition.” They pointed to tweets, company blog posts, and statements made during earnings calls and press conferences.
Tesla’s lawyers responded by saying the driver’s “own negligent acts and/or omissions” caused the crash, and that “reliance on any representation made by Tesla, if any, was not a substantial factor” in causing harm to the driver or passenger. They contend Tesla’s cars and systems are “reasonably safe by design” and comply with state and federal laws.
Tesla did not respond to a request for comment on this story. Brett Schreiber, the attorney representing the Mendoza family, declined to make his client available for an interview.
At least 15 other active cases involve similar claims that Autopilot or FSD — Tesla’s Full Self-Driving (Supervised) system — was in use just before a fatal or injurious Tesla crash. Three of those cases have been moved to federal court. FSD is the premium version of Tesla’s partially automated driving system. Autopilot comes standard on all new Tesla vehicles, while owners pay an upfront fee or a monthly subscription to use FSD.
The accident at the center of the Mendoza family’s case was also part of a broader Tesla Autopilot investigation by the National Highway Traffic Safety Administration that began in August 2021. During that investigation, Tesla made changes to its systems, including a number of over-the-air software updates.
The agency has since opened a second investigation, still underway, evaluating whether Tesla’s “recall remedy” to resolve issues with Autopilot’s behavior around stationary first-responder vehicles was effective.
NHTSA has warned Tesla that its social media posts could mislead drivers into thinking its vehicles are robotaxis. In addition, the California Department of Motor Vehicles has accused Tesla of false advertising over its claims about Autopilot and FSD.
Tesla is currently rolling out a new version of FSD to customers. Over the weekend, Musk urged his more than 206.5 million followers on X to “see if you can demonstrate Tesla self-driving to your friends tomorrow,” adding, “It’s like magic.”
Since around 2014, Musk has promised investors that Tesla cars would soon be able to drive themselves without a human at the wheel. The company has shown off a design concept for an autonomous two-seater called the CyberCab, but Tesla has yet to build a robotaxi.
Meanwhile, competitors such as WeRide and Pony.ai in China, and Alphabet’s Waymo in the U.S., already operate commercial robotaxi fleets and services.
WATCH: Tesla’s FSD test was ‘incredibly good’