Elon Musk claims there has been no crash in Tesla’s Full Self-Driving Beta over a year into the program


Elon Musk claims that Tesla has not had a single crash in its Full Self-Driving Beta program since it started over a year ago, but that’s as much proof that the testers are being careful as it is that the system is safe.

“Full Self-Driving Beta” (FSD Beta) is an early version of Tesla’s self-driving software that is currently being tested by a fleet of Tesla owners selected by the company and through its “safety test score.”

The software enables the vehicle to drive autonomously to a destination entered in the car’s navigation system, but the driver needs to remain vigilant and ready to take control at all times. Tesla started the program in October 2020, and it has now pushed the software to several thousand customers.

The test program has been criticized for putting advanced autonomous features in the hands of customers while leaving the responsibility with them by calling it a level two autonomous system in beta testing. Tesla has defended itself by saying that it has been careful to roll the features out slowly and only to customers it deems “safer drivers.”

In response to a comment by Tesla shareholder Ross Gerber on Twitter, CEO Elon Musk confirmed yesterday that Tesla believes there still hasn’t been a single accident in the Full Self-Driving Beta program over a year after its launch.

That would mean he is disputing a previous crash report filed with the National Highway Traffic Safety Administration (NHTSA). A Model Y owner in the FSD Beta program claimed in a complaint to NHTSA that the system caused a crash, but the complaint couldn’t be confirmed.

If the report was indeed inaccurate, it is impressive that Tesla hasn’t had an accident in what are likely millions of miles driven on FSD Beta. NHTSA says that, on average, there is an accident every 500,000 miles for human drivers (i.e., all drivers).

Electrek’s Take

While impressive, it is probably more proof that Tesla owners in the FSD Beta program are being careful than that the system itself is safe, because we have seen plenty of videos in which FSD Beta would have caused an accident had the driver not taken control.

It’s a “so far, so good” situation, but we know that accidents are inevitable. Once one happens, I expect to see a significant ramp-up in criticism of Tesla’s approach to testing its self-driving system. In the meantime, Tesla is enjoying a lot of data from a test fleet that not only works for free but is made up of customers who paid a lot of money to test the system.

We can argue about whether this is right, but you can’t argue that, as a business, it’s one hell of a move.


