How the hell could it mistake a person for a box?
I think we’re expecting too much intelligence from the machine here. I don’t think any actual AI was involved in any functional sense of the word. A bad sensor gave a false positive. Machine went “go” when it should’ve gone “no.”
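A toy sketch of that failure mode (made-up function names, not the actual controller logic): if the gate logic acts on a single raw reading, a faulty sensor stuck high is indistinguishable from a real box. A common mitigation is debouncing, requiring several consecutive positives before acting.

```python
def controller_step(sensor_reading: bool) -> str:
    """Naive gate: one positive reading means 'go'.
    No cross-check, no classification of WHAT tripped the sensor."""
    return "go" if sensor_reading else "no"

def debounced_go(readings, required=3) -> str:
    """Mitigation sketch: require `required` consecutive positives,
    so a single spurious pulse can't trigger the arm."""
    streak = 0
    for r in readings:
        streak = streak + 1 if r else 0
        if streak >= required:
            return "go"
    return "no"

# A real box and a false positive both read True:
print(controller_step(True))                 # "go" either way
print(debounced_go([True, False, True]))     # "no" -- lone glitches filtered
print(debounced_go([True, True, True]))      # "go" -- sustained signal
```

Debouncing helps with transient glitches, but of course does nothing if the sensor is stuck high for good.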
The man, described as a robotics company employee, had been checking the sensors on the robot ahead of a test run at the plant in South Gyeongsang province planned for Wednesday. The test run had reportedly been pushed back two days due to the robot malfunctioning. As the employee worked late into the night to make sure the robot would function smoothly, the robotic arm grabbed him and forced him onto a conveyor belt, crushing his body.
AI isn’t even a thing. We have machine learning, which kind of fakes it, but I really hate how people use the term for anything.
Maybe. Or maybe he knew too much and had to be silenced before he became a problem for the robots’ plans to replace us.
sssh, they’re listening.
I, FOR ONE, WELCOME OUR FUTURE ROBOT OVERLORDS
Lies! This is how it starts! “Accidents”
These machines shouldn’t be activated when people are within their operating area. I know in the US it would be prohibited by law, though I’ve personally seen it done a few times. So the robot wouldn’t necessarily even be programmed to differentiate between different types of objects in its area if it was only expected to interact with boxes.
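A minimal sketch of that point, with hypothetical signal names (not any vendor’s real API): standard practice is to gate motion on separate safety hardware such as light curtains, interlocked gates, and e-stops, so the task program never has to tell a person from a box in the first place.

```python
from dataclasses import dataclass

@dataclass
class CellState:
    """Hypothetical inputs from a safety circuit around the robot cell."""
    light_curtain_clear: bool    # nobody has crossed the beam into the cell
    interlock_gate_closed: bool  # physical gate around the cell is shut
    estop_released: bool         # no emergency stop is latched

def motion_enabled(s: CellState) -> bool:
    """Safety logic runs independently of the pick-and-place program:
    the robot can't move while anyone could be inside the cell,
    regardless of what its object sensors report."""
    return (s.light_curtain_clear
            and s.interlock_gate_closed
            and s.estop_released)

print(motion_enabled(CellState(True, True, True)))    # True: cell is safe
print(motion_enabled(CellState(False, True, True)))   # False: someone inside
```

In real installations this runs on dedicated safety-rated hardware, not in the task program, precisely so a sensor bug in the application code can’t override it.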