The autopilot will turn off just before hitting them to make you liable anyway
That’s only if you didn’t subscribe to the Ludicrous package.
Nah even then. Ain’t no way Tesla admits fault for anything
Until they go the way of PayPal, at least. Musk’s exit plan is Mars, remember?
Can we please speed up his exit plan?
Only to get actual slaves there lol
lol read Stranger in a Strange Land if you want an interesting Mars story
It actually does. Teslas are great.
Autopilot turns off because the car doesn’t know what to do and the driver is supposed to take control of the situation. The autopilot isn’t autopilot, it’s driving assistance, and you want it to turn off if it doesn’t know what it should do.
Autopilot also turns off on planes when things go wrong.
Sure, what I meant though was that Tesla doesn’t have self-driving cars the way they try to market it. They’re no different from what other car manufacturers have, they just use a more deceptive name.
If an incident is imminent within the next ~2 seconds, autopilot must take action or assist in an action. Manual override can happen at any time, but on that timescale it’s unlikely and only the autopilot has any chance, therefore it cannot just turn off and absolve itself of liability.
Autopilot turns off before collision because physical damage can cause unpredictable effects that could cause another accident.
Let’s say you run into a wall, autopilot is broken, the car thinks it needs to go backwards. You now killed 3 more people.
I hate Elon Musk and Teslas are bad, but let’s not spread misinformation.
It seems reasonable for the autopilot to turn off just before a collision; my point was more along the lines of “You won’t get a penny from Elon”.
People who rely on Full Self-Driving, or whatever it’s called now, should be liable for letting a robot control their cars. And I also think the company that develops and advertises said robot shouldn’t get off scot-free, but it’s easier to blame the shooter than the gun manufacturer.
Yeah I agree. Both parties should be liable. Tesla for their misleading and dangerous marketing, drivers for believing in the marketing.