Tesla has received a special order from federal automotive safety regulators requiring the company to provide extensive data about its driver assistance and driver monitoring systems, and a once-secret configuration for these known as “Elon mode.”
Typically, when a Tesla driver uses the company’s driver assistance systems — which are marketed as Autopilot, Full Self-Driving or FSD Beta options — a visual symbol blinks on the car’s touchscreen to prompt the driver to engage the steering wheel. If the driver leaves the steering wheel unattended for too long, the “nag” escalates to a beeping noise. If the driver still doesn’t take the wheel at that point, the vehicle can disable the use of its advanced driver assistance features for the rest of the drive or longer.
As CNBC previously reported, with the “Elon mode” configuration enabled, Tesla can allow a driver to use the company’s Autopilot, FSD or FSD Beta systems without the so-called “nag.”
The National Highway Traffic Safety Administration sent a letter and special order to Tesla on July 26, seeking details about the use of what apparently includes this special configuration, including how many cars and drivers Tesla has authorized to use it. The file was added to the agency’s website on Tuesday, and Bloomberg first reported on it.
In the letter and special order, the agency’s acting chief counsel, John Donaldson, wrote:
“NHTSA is concerned about the safety impacts of recent changes to Tesla’s driver monitoring system. This concern is based on available information suggesting that it may be possible for vehicle owners to change Autopilot’s driver monitoring configurations to allow the driver to operate the vehicle in Autopilot for extended periods without Autopilot prompting the driver to apply torque to the steering wheel.”
Tesla was given a deadline of Aug. 25 to furnish all the information demanded by the agency. It replied on time, but requested, and was granted, confidential treatment of its response by NHTSA. The company didn’t immediately respond to CNBC’s request for comment.
Automotive safety researcher and Carnegie Mellon University associate professor of computer engineering Philip Koopman told CNBC after the order was made public, “Evidently NHTSA takes a dim view of cheat codes that allow disabling safety features such as driver monitoring. I agree. Hidden features that degrade safety have no place in production software.”
Koopman also noted that NHTSA has yet to complete a series of investigations into crashes in which Tesla Autopilot systems were a possible contributing factor, including a string of “fatal truck under-run crashes” and collisions in which Tesla vehicles hit stationary first responder vehicles. NHTSA acting administrator Ann Carlson has suggested in recent press interviews that a conclusion is near.
For years, Tesla has told regulators, including NHTSA and the California DMV, that its driver assistance systems, including FSD Beta, are only “level 2” and don’t make its cars autonomous, despite marketing them under brand names that may confuse the issue. Tesla CEO Elon Musk, who also owns and runs the social network X, formerly Twitter, often implies that Tesla vehicles are self-driving.
Over the weekend, Musk livestreamed a test drive in a Tesla equipped with a still-in-development version of the company’s FSD software (v. 12) on the social platform. During that demo, Musk streamed using a mobile device he held while driving and chatting with his passenger, Tesla’s head of Autopilot software engineering, Ashok Elluswamy.
In the blurry video stream, Musk didn’t show all the details of his touchscreen or demonstrate that he had his hands on the steering yoke, ready to take over the driving task at any moment. At times, he clearly had no hands on the yoke.
His use of Tesla’s systems would likely constitute a violation of the company’s own terms of use for Autopilot, FSD and FSD Beta, according to Greg Lindsay, an Urban Tech fellow at Cornell. He told CNBC the entire drive was like “waving a red flag in front of NHTSA.”
Tesla’s website cautions drivers, in a section titled “Using Autopilot, Enhanced Autopilot and Full Self-Driving Capability,” that “it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car.”
Grep VC managing partner Bruno Bowden, a machine learning expert and investor in the autonomous vehicle startup Wayve, said the demo showed Tesla is making some improvements to its technology but still has a long way to go before it can offer a safe, self-driving system.
During the drive, he observed, the Tesla system nearly ran a red light, requiring an intervention by Musk, who braked in time to avoid any danger.