A fan of Tesla might think that the automaker just can’t catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list involving around 2.4 million Tesla vehicles. This time, regulators are assessing the cars’ performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.

The National Highway Traffic Safety Administration (NHTSA) says the new probe is looking at incidents in which FSD was engaged in reduced visibility, such as fog, airborne dust, or sun glare blinding the car’s cameras, and something went wrong.

What the car can “see” is the big issue here. It’s also what Tesla bet its future on.

  • @bladerunnerspider
    11 months ago

    It’s called consensus. Have three sensors and give each a vote. Typically these sensors are identical, so a failure or incorrect reading from one can be detected. This idea is used in IT for data backups and RAID configurations, as well as in aviation. And … I personally would just favor the radar. If vision says go and radar says stop… stop, and avoid hitting that firetruck parked on the highway. Or that motorcyclist. Or any of the other bizarre vision-only fatal crashes that this system has wrought.

    Also, humans can hear things. So, not just vision.
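For illustration, here is a minimal Python sketch of the voting-plus-veto idea the commenter describes. The sensor names, the `Decision` enum, and the `consensus()` function are hypothetical, not anything Tesla or any supplier actually ships; the point is only that a conservative fusion policy lets a dissenting radar reading override the cameras.

```python
from collections import Counter
from enum import Enum


class Decision(Enum):
    GO = "go"
    STOP = "stop"


def consensus(readings: dict[str, Decision]) -> Decision:
    """Vote among redundant sensors, but let radar veto a 'go'.

    `readings` maps a sensor name (e.g. 'camera', 'radar', 'lidar')
    to that sensor's decision. If radar reports STOP, stop regardless
    of the vote -- the conservative policy from the comment above.
    Otherwise the majority wins, defaulting to STOP on a tie.
    """
    if readings.get("radar") == Decision.STOP:
        return Decision.STOP

    votes = Counter(readings.values())
    return Decision.GO if votes[Decision.GO] > votes[Decision.STOP] else Decision.STOP


# Example: glare blinds the camera, which says GO; radar and lidar disagree.
print(consensus({
    "camera": Decision.GO,
    "radar": Decision.STOP,
    "lidar": Decision.STOP,
}))  # Decision.STOP
```

Whether a production system would weight the sensors this way is an open design question; the sketch simply shows how cheap it is to encode "any reliable sensor can call for a stop" once more than one sensing modality is on the car.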