The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

  • @Buddahriffic, 13 hours ago

    Not only that, when we have trouble seeing things, we can adjust our speed to compensate (though tbf, not all human drivers do, but I don’t think FSD should be modelled after the worst of human drivers). Does Tesla’s FSD go into a “drive slower” mode when it gets less certain about what it sees? Or do its algorithms always treat their best guess with high confidence?
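
Whether FSD actually does anything like this isn’t publicly documented, but the behavior the commenter is asking about — scaling target speed down as perception confidence drops — is easy to sketch in the abstract. The Python below is purely illustrative: the class, function names, thresholds, and confidence values are hypothetical assumptions, not Tesla’s implementation.

```python
# Hypothetical sketch of confidence-gated speed limiting.
# Not Tesla's FSD logic (which is not public); every name and
# threshold here is an illustrative assumption.

from dataclasses import dataclass


@dataclass
class PerceptionFrame:
    """One tick of perception output (hypothetical structure)."""
    detection_confidence: float  # 0.0 (blind) .. 1.0 (fully certain)
    posted_limit_mph: float      # speed limit for the current road


def target_speed_mph(frame: PerceptionFrame) -> float:
    """Scale the allowed speed down as perception confidence drops.

    Below a floor confidence, the safe response to an uncertain scene
    is to crawl rather than act on a low-confidence best guess.
    """
    floor, full = 0.3, 0.9  # illustrative thresholds
    if frame.detection_confidence <= floor:
        return 5.0  # crawl / prepare to stop
    if frame.detection_confidence >= full:
        return frame.posted_limit_mph
    # Linear ramp between the floor and full-confidence thresholds.
    ratio = (frame.detection_confidence - floor) / (full - floor)
    return 5.0 + ratio * (frame.posted_limit_mph - 5.0)


if __name__ == "__main__":
    # Sun-glare scenario: confidence drops, so the target speed drops too.
    clear = PerceptionFrame(detection_confidence=0.95, posted_limit_mph=45)
    glare = PerceptionFrame(detection_confidence=0.40, posted_limit_mph=45)
    print(target_speed_mph(clear))  # 45.0
    print(target_speed_mph(glare))  # about 11.7
```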