- cross-posted to:
- fuckcars
- technology
Tesla driver who killed 2 people while using autopilot must pay $23,000 in restitution without having to serve any jail time::The case is believed to be the first time that U.S. prosecutors have brought felony charges against a motorist who was using a partially automated driving system.
Tesla has to get more punishment than NOTHING. This sucks
Yeah, judging by the article, Tesla should take some responsibility here. Not that the driver should get off; if your car is blowing through a red light at 120 km/h, you're just not paying proper attention.
Sure, I’d prefer to know exactly how much time passed in between. Was it 2 seconds or 25? But my premise is that this shouldn’t happen in the software. I know I read some time ago that Teslas had shut off the autopilot moments before a collision, too late for the driver to react, but I’d have to double check that. All to blame the customer
Automakers should not be allowed to use the unsuspecting public as toys for their experimental software, it quickly becomes a 1-4 ton death machine, but I think we agree on that.
Oh yeah, I work in software development myself. No way I’d trust my life to something like Tesla’s autopilot, which is perpetually in beta, relies on just the camera feed and is basically run by a manager that has clear issues with over promising and under delivering (among other things). You can get away with shit like that for a website or mobile app, but these are people’s lives.
Sounds like a convenient excuse by the driver who ran a red light.
Suing Tesla seems a little dumb to me. Sue the DMV that’s giving people like this licenses
Auto manufacturers must be held liable for faulty software. If it’s not safe, it does not go on the road
Only if the software is causing the accident or preventing the driver from avoiding one. Here the software’s fault was not slowing down after exiting the highway (which, from experience, must be a very specific situation, because it most certainly does). The driver could have disengaged autopilot or applied the brakes to stop at the red light. The software specifically states it can’t stop at red lights and alerts the driver when it’s about to run one. The fault here is 100% the driver’s.
Are manufacturers solely responsible for safety, or lack thereof, on the public roads of the USA?
I don’t believe they are.
Solely? No. But if the airbag, seatbelt, or self-driving autopilot feature that they created contributed to someone’s death, they are partially responsible and should face consequences or punishments. Especially if they market it as a safe feature.
https://en.wikipedia.org/wiki/Section_230
Your point being?
They are legally protected from any real punishment. Nothing will ever hit them with more than NOTHING.
Then the law has failed, which was my point from the start.
Why would you try to belittle me by repeating my opinion? That doesn’t make sense at all.
Because unless you plan on becoming a lobbyist, or politician, or activist, nothing will change. Sitting around saying “they have to get in trouble in some manner” doesn’t do anything. If you want that to happen, since they are legally protected from what you want, go make a change.
I don’t live in the US, but the first step towards change is getting mad and raising awareness, this is the change I can make.
Closer to home, I wholeheartedly support the strike against Tesla in Sweden, for example.