A tweet from Elon Musk suggesting that Tesla may let some owners who are testing its "Full Self-Driving" system turn off a warning that reminds them to keep their hands on the wheel has drawn the attention of US safety regulators.
The National Highway Traffic Safety Administration says it has asked Tesla for more information about the tweet. Last week, the agency said the issue is now part of a broader investigation into at least 14 Teslas that crashed into emergency vehicles while using the Autopilot driver-assistance system.
Since 2021, Tesla has been beta-testing "Full Self-Driving" with owners who haven't been trained on the system but are actively monitored by the company. Earlier this year, Tesla said 160,000 vehicles, roughly 15% of Teslas on US roads, were participating. Broader distribution of the software was planned for late 2022.
Despite the name, Tesla still says on its website that its cars cannot drive themselves. A Tesla using "Full Self-Driving" can navigate roads on its own in many cases, but experts say the system can make mistakes.
"We're not saying it's quite ready to have no one behind the wheel," CEO Musk said in October.
![Autopilot functions](https://nypost.com/wp-content/uploads/sites/2/2023/01/tesla-autopilot.jpg?w=1024)
On New Year's Eve, one of Musk's most die-hard fans tweeted that drivers who have logged more than 10,000 miles testing "Full Self-Driving" should have the option to turn off the "steering nag," a warning that tells drivers to keep their hands on the wheel.
Musk replied: "That's right, the update will come in January."
It's not clear from the tweets exactly what Tesla will do. However, disabling the driver-monitoring system in any vehicle that automates speed and steering would put other road users in danger, said Jake Fisher, senior director of auto testing at Consumer Reports.
"By using the beta version of FSD, you are part of the experiment," Fisher said. "The problem is that the other road users around you haven't volunteered to take part in that experiment."
Tesla didn't respond to a message asking for comment on the tweet or on driver monitoring.
Auto safety advocates and government investigators have long criticized Tesla's monitoring system as inadequate. Three years ago, the National Transportation Safety Board cited poor monitoring as a contributing factor in a fatal 2018 Tesla crash in California. The board recommended a better system, but Tesla did not respond.
Tesla's system measures torque on the steering wheel to make sure drivers are paying attention. Many Teslas also have cameras that monitor the driver's gaze. But Fisher said those cameras don't use infrared, as some competitors' driver-assistance systems do, so they can't see at night or when the driver is wearing sunglasses.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, argued that Tesla contradicts itself in a way that can confuse drivers. "They try to make customers happy by letting them take their hands off the wheel, even when the (owner's) manual says don't do it."
![Tesla CEO Elon Musk](https://nypost.com/wp-content/uploads/sites/2/2023/01/elon-musk-tesla.jpg?w=1024)
Indeed, Tesla's website says that Autopilot and the more sophisticated "Full Self-Driving" system are intended to be used by a "fully attentive driver who keeps his hands on the wheel and is able to take control at any time." It says the systems are not fully autonomous.
The NHTSA noted in documents that there have been numerous Tesla crashes in which drivers had their hands on the wheel but still weren't paying attention. The agency said Autopilot is being used in areas where its capabilities are limited, and that many drivers fail to take action to avoid crashes despite warnings from the vehicle.
Tesla's semi-automated systems have been under investigation by the NHTSA since June 2016, when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing its path in Florida. A separate probe into Teslas that crashed into emergency vehicles while on Autopilot was opened in August 2021.
![The Tesla Model 3 is on autopilot.](https://nypost.com/wp-content/uploads/sites/2/2023/01/tesla-autopilot1.jpg?w=1024)
Including the Florida crash, the NHTSA has sent investigators to 35 Tesla crashes in which automated systems are suspected of having been in use. Nineteen people have died in those crashes.
Consumer Reports has tested Tesla's monitoring system, which changes frequently with online software updates. Initially, the system didn't warn a driver whose hands were off the wheel for three minutes. More recently, warnings have come in as little as 15 seconds. Still, Fisher said he wasn't sure how long a driver's hands could be off the wheel before the system slows the car down or shuts off completely.
With the "steering nag" disabled, Fisher said, Tesla could switch to the camera to monitor drivers, but that isn't clear.
Although the names "Autopilot" and "Full Self-Driving" suggest the cars can drive themselves, Fisher said it's clear Tesla expects owners to continue acting as drivers. But the NTSB says human drivers can let their guard down and rely too heavily on the systems, looking elsewhere or performing other tasks.
Those who use "Full Self-Driving," Fisher said, are likely to be more vigilant about taking control because the system makes mistakes.
"I wouldn't dream of taking my hands off the wheel with this system, simply because it can do something unexpected," he said.