Tesla ordered by auto regulators to provide data on ‘Elon mode’ Autopilot configuration


Elon Musk, CEO of SpaceX, Twitter and electric carmaker Tesla, looks on as he speaks during his visit to the Vivatech technology startup and innovation fair at the Porte de Versailles exhibition center in Paris on June 16, 2023.
Alain Jocard | AFP | Getty Images

Tesla has received a special order from federal automotive safety regulators requiring the company to provide extensive data about its driver assistance and driver monitoring systems, including a once-secret configuration for them known as “Elon mode.”

Typically, when a Tesla driver uses the company’s driver assistance systems — which are marketed as Autopilot, Full Self-Driving or FSD Beta options — a visual symbol blinks on the car’s touchscreen to prompt the driver to engage the steering wheel. If the driver leaves the steering wheel unattended for too long, the “nag” escalates to a beeping noise. If the driver still does not take the wheel at that point, the vehicle can disable the use of its advanced driver assistance features for the rest of the drive or longer.

As CNBC previously reported, with the “Elon mode” configuration enabled, Tesla can allow a driver to use the company’s Autopilot, FSD or FSD Beta systems without the so-called “nag.”

The National Highway Traffic Safety Administration sent a letter and special order to Tesla on July 26, seeking details about the use of the special configuration, including how many cars and drivers Tesla has authorized to use it. The file was added to the agency’s website on Tuesday; Bloomberg first reported on it.

In the letter and special order, the agency’s acting chief counsel John Donaldson wrote:

“NHTSA is concerned about the safety impacts of recent changes to Tesla’s driver monitoring system. This concern is based on available information suggesting that it may be possible for vehicle owners to change Autopilot’s driver monitoring configurations to allow the driver to operate the vehicle in Autopilot for extended periods without Autopilot prompting the driver to apply torque to the steering wheel.”

Tesla was given a deadline of Aug. 25 to furnish all the information demanded by the agency. The company replied on time, and, at its request, NHTSA granted its response confidential treatment. Tesla did not immediately respond to CNBC’s request for comment.

Automotive safety researcher and Carnegie Mellon University associate professor of computer engineering Philip Koopman told CNBC after the order was made public, “It seems that NHTSA takes a dim view of cheat codes that permit disabling safety features such as driver monitoring. I agree. Hidden features that degrade safety have no place in production software.”

Koopman also noted that NHTSA has yet to complete a series of investigations into crashes where Tesla Autopilot systems were a possible contributing factor, including a string of “fatal truck under-run crashes” and collisions involving Tesla vehicles that hit stationary first responder vehicles. NHTSA acting administrator Ann Carlson has suggested in recent press interviews that a conclusion is near.

For years, Tesla has told regulators, including NHTSA and the California DMV, that its driver assistance systems, including FSD Beta, are only “level 2” and do not make its cars autonomous, despite marketing them under brand names that could confuse the issue. Tesla CEO Elon Musk, who also owns and runs the social network X, formerly Twitter, often implies Tesla vehicles are self-driving.

Over the weekend, Musk livestreamed a test drive in a Tesla equipped with a still-in-development version of the company’s FSD software (v. 12) on the social platform. During that demo, Musk streamed using a mobile device he held while driving and chatting with his passenger, Tesla’s head of Autopilot software engineering Ashok Elluswamy.

In the blurry video stream, Musk did not show all the details of his touchscreen or demonstrate that he had his hands on the steering yoke, ready to take over the driving task at any moment. At times, he clearly had no hands on the yoke.

His use of Tesla’s systems would likely constitute a violation of the company’s own terms of use for Autopilot, FSD and FSD Beta, according to Greg Lindsay, an Urban Tech fellow at Cornell. He told CNBC the entire drive was like “waving a red flag in front of NHTSA.”

Tesla’s website cautions drivers, in a section titled “Using Autopilot, Enhanced Autopilot and Full Self-Driving Capability,” that “it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car.”

Grep VC managing partner Bruno Bowden, a machine learning expert and investor in autonomous vehicle startup Wayve, said the demo showed Tesla is making some improvements to its technology, but still has a long way to go before it can offer a safe, self-driving system.

During the drive, he observed, the Tesla system nearly blew through a red light, requiring an intervention by Musk, who managed to brake in time to avoid any danger.
