Tesla is again under investigation over its Full Self-Driving software, this time after regulators found a connection between crashes and low-visibility conditions. Four collisions have been reported, including one fatal crash, in which Tesla EVs entered an area of reduced visibility with FSD engaged. The investigation could hinder Tesla's efforts to begin testing unsupervised FSD in preparation for robotaxi deployment.
Tesla's automated driving features, such as Autopilot and Full Self-Driving (FSD), have periodically come under scrutiny as Tesla EVs have been involved in accidents. Investigations have sometimes confirmed software involvement, but in many cases careless drivers were simply trying to deflect blame for their own mistakes. Even when the cars drive with FSD engaged, Tesla makes it clear to owners that the vehicle does not drive autonomously and that the driver bears full responsibility when something goes wrong.
However, this doesn't mean regulators don't need to investigate the cases involving Tesla FSD, especially when this involvement is not clear from the beginning. Such investigations have happened in the past, although, most of the time, they ended by establishing that the driver was to blame, either because they were distracted or (in many cases) intoxicated. Other times, the National Highway Traffic Safety Administration (NHTSA) issued recommendations to Tesla on how to make FSD use safer for everybody.
The NHTSA recently opened a new investigation into Tesla's FSD software after four collisions, including one fatal crash, were reported. In all cases, the FSD software appeared to be active at the time of the crash, and the car had entered an area with low-visibility conditions. According to the Office of Defects Investigation (ODI), the fatal crash involved a pedestrian. One additional crash in these conditions resulted in a reported injury.
The investigation aims to assess FSD's ability to detect and respond appropriately to reduced-visibility conditions such as sun glare, fog, and airborne dust, which can degrade the performance of Tesla's cameras. The preliminary probe covers 2.4 million Tesla EVs, essentially all FSD-capable vehicles on US roads: the 2016-2024 Model S and Model X, 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck.
The ODI investigation also seeks to determine whether similar FSD crashes have occurred in reduced-visibility conditions in the past. Intriguingly, ODI also wants to know whether Tesla has recently modified the FSD software in ways that might have affected its performance in low visibility, and how Tesla assesses the safety impact of each FSD update before rolling it out to customers.
Tesla owners have frequently complained about "phantom braking," in which their EVs, driving on FSD, brake suddenly for no apparent reason. In some cases, the sudden braking was believed to be caused by sun glare or other conditions that impaired Tesla's Vision-only FSD system. Could it be that Tesla, trying to eliminate phantom braking, changed FSD's behavior to ignore braking impulses in such situations? The NHTSA investigation should find out.
What seems certain is that the investigation will hinder Tesla's efforts to begin unsupervised FSD testing for the Cybercab next year. Tesla recently announced plans to deploy driverless robotaxis in Texas and California in 2025, and it has insisted that its Vision-only approach is superior to those of other autonomous driving companies. However, weather conditions can impair the cameras' ability to perceive the road.