Tesla faces NHTSA investigation of ‘Full Self-Driving’ after fatal collision
Tesla faces a new investigation by the National Highway Traffic Safety Administration, or NHTSA, concerning issues with its "Full Self-Driving" systems, and whether they are safe to use in fog, glaring sun or other "reduced roadway visibility conditions."
The probe follows an incident in which a Tesla driver who had been using FSD struck and killed a pedestrian, as well as other FSD-involved collisions that occurred in reduced roadway visibility conditions.
Records posted to the NHTSA website on Friday morning said the purpose of the new probe would be to assess:
"The ability of FSD's engineering controls to detect and respond appropriately to reduced roadway visibility conditions; whether any other similar FSD crashes have occurred in reduced roadway visibility conditions and, if so, the contributing circumstances for those crashes," among other things.
The agency will also look into Tesla's over-the-air software updates to its FSD systems, which are now marketed as "Full Self-Driving (Supervised)," to understand the "timing, purpose, and capabilities of any such updates, as well as Tesla's assessment of their safety impact."
The "preliminary evaluation" by the NHTSA pertains to a population of around 2.4 million Tesla EVs on U.S. roads that give drivers the option to use FSD: Model S and X vehicles produced from 2016 to 2024, Model 3 vehicles produced from 2017 to 2024, Model Y vehicles produced from 2020 to 2024, and Cybertruck vehicles produced this year and last.
FSD, which the company now refers to as a "partial driving automation system," is Tesla's paid, premium driver assistance option. Tesla has also previously offered it to all drivers in the U.S. as a monthlong free trial.
The U.S. federal vehicle