NHTSA investigated the accident and confirmed that the vehicle was using Autopilot at the time of the crash. However, the agency placed the blame on the driver, who phone data showed was playing a video game at the time, and on the missing crash attenuator, which worsened the severity of the impact.
When using Autopilot or FSD Beta, Tesla tells drivers that they need to pay attention and be ready to take control at all times. Drivers who are not doing that are misusing the system.
The family has sued Tesla for wrongful death, and it is going to be quite an uphill battle for them because it looks like he was using his phone while driving, which is a traffic violation and against Tesla’s guidance on how to use Autopilot.
That said, the family’s lawyers have learned from previous similar trials and are taking a different approach. They are not denying Huang’s misuse of Autopilot; instead, they are focusing on Tesla’s communications, which they claim led the driver to misuse Autopilot in the first place.
Tesla definitely could, and should, have picked more appropriate names. I don’t think the technology should be disallowed entirely, since human + computer is already safer than human alone, as long as the human pays attention. Computers will eventually surpass human skill, and ADAS technology is a stepping stone toward that.
au·ton·o·mous /ôˈtänəməs/ adjective
undertaken or carried on without outside control : self-contained