People worry about giving AI-controlled robots guns. Where are those outspoken people, with their statistics, when we give a four-wheeled sedan that weighs as much as a small building and accelerates faster than you can think the ability to control itself? Maybe you could have self-driving guns too; does skipping the robot really make it ethically better?
Also: who the hell am I supposed to take to court after getting maimed by one of these? Technically the operator isn't at fault if the autonomous system chooses to hit me instead of something else. I'm certain the insurance companies have already found a weaselly loophole to avoid liability in this case. I feel like we're going to have to get back to the legislative idea of making manufacturers responsible for their product failures.