Tesla is facing a new lawsuit over Autopilot and FSD

Max McDee, 15 September 2022

A Tesla Model X owner filed a proposed class action lawsuit in San Francisco federal court against Tesla and Elon Musk. He accuses the company of deceptive and misleading marketing and of outright fraud over the Autopilot and Full Self-Driving (FSD) systems.

In the filing, the plaintiff claims he paid $5,000 for the Enhanced Autopilot package in 2018, when it was sold as the precursor to Full Self-Driving. The lawsuit accuses Tesla of constantly touting the technology as being “just around the corner” since 2016 while knowing it didn’t work or, in some cases, didn’t exist. The plaintiff also accuses Tesla of knowingly making its vehicles unsafe.

The suit goes on to claim that Tesla did this to generate excitement about the company and its vehicles in order to attract investment. It points to Tesla allegedly misleading the public about Autopilot's capabilities to drum up car sales and stave off bankruptcy at an earlier stage.

Many Tesla accidents are blamed on Autopilot, but often the driver was in charge

In the 84-page filing, Mr Matsko argues that Tesla is yet to produce “anything even remotely approaching a fully self-driving car.” He goes on to say that FSD, now a $15,000 option, is still not ready and shouldn’t be allowed to be sold to customers.

He also points to Elon Musk’s claims that autonomous cross-country trips would be possible by 2018, which never materialized, as well as the 2019 promise of 1 million autonomous robotaxis on the roads by 2020. Elon Musk eventually gave up on the cross-country trip idea and admitted it would need a specialized route. As for robotaxis, they are apparently coming sometime in 2025.

The plaintiff backs up his claim of Tesla’s fraudulent actions with the company's 2016 Full Self-Driving video promoting the feature. We all know that clip: we watched in amazement as a Model X drove through a city, dropped off its driver and then parked itself. We all wanted that car there and then.

There are actual items for sale on the Internet to bypass Tesla Autopilot safety precautions

The truth, as it turned out a bit later, was quite murky. According to engineers involved in shooting the clip, the car used a specially pre-charted, high-definition 3D-mapped route. None of that technology is available on any of Tesla’s cars to this day.

The lawsuit goes on to claim that FSD and Autopilot are not just a fraud but actually dangerous, and it cites the 2018 Model X accident in which the driver died after the car crashed into a concrete barrier. Another accident cited in the lawsuit is a Tesla crashing into a stationary fire truck. That accident resulted in a federal investigation, which over the following years was expanded to cover 11 similar accidents.

The plaintiff is seeking class action status, with punitive and compensatory damages not specified in the suit. He wants Tesla to be forced to stop its misleading marketing tactics and to stop using paying customers as untrained test engineers.

This is how Tesla sees the road ahead

Tesla hasn't been sleeping, and this lawsuit is not a surprise; the company has been preparing for a while, putting together one of the industry’s strongest legal teams. Whether anything comes of this lawsuit or not, the fact is that many people still mistake Autopilot or FSD for an actual autonomous system, while Tesla explicitly says it isn’t. Despite its confusing (maybe misleading) name, FSD requires the driver’s full and constant attention - and it always has.

NHTSA has opened 38 crash investigations involving Tesla vehicles and ADAS since 2016, with nineteen deaths recorded in those crashes. Serious questions will be asked, and the answers won’t be pretty. This is Tesla's “caution - hot drink” coffee cup moment, and the company will have to do more than just put warnings in small print.

The allegations that Tesla knew Autopilot didn't actually work, faked its abilities and endangered people’s lives are quite serious, if they can be proven. Did Elon Musk get ahead of himself? Absolutely, many times over. But rather than believing his every word, we should exercise a bit more common sense and a lot more caution.
