• @Gigan
    link
    English
    4
    11 months ago

    How the hell could it confuse a person for a box?

    • Decoy321
      link
      English
      7
      edit-2
      11 months ago

      I think we’re expecting too much intelligence from the machine here. I don’t think any actual AI, in any meaningful sense of the word, was involved. A bad sensor gave a false positive. Machine went “go” when it should’ve gone “no.”

      The man, described as a robotics company employee, had been checking the sensors on the robot ahead of a test run at the plant in South Gyeongsang province planned for Wednesday. The test run had reportedly been pushed back two days due to the robot malfunctioning. As the employee worked late into the night to make sure the robot would function smoothly, the robotic arm grabbed him and forced him onto a conveyor belt, crushing his body.
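A minimal sketch of the "bad sensor gave a false positive" failure mode described above. All function names and the go/no-go logic are hypothetical illustrations, not details from the incident report: the point is that logic which acts on a single sensor reading turns one glitch into a "go", while requiring several consistent readings does not.

```python
# Hypothetical sketch: acting on a single sensor reading vs. debouncing.
def should_grab(readings):
    # Naive logic: trusts the latest reading alone,
    # so a single false positive means "go".
    return readings[-1] == "box"

def should_grab_debounced(readings, n=3):
    # Safer logic: requires the last n readings to agree before acting,
    # so one glitchy reading cannot trigger the arm.
    return len(readings) >= n and all(r == "box" for r in readings[-n:])

# One spurious "box" reading among "empty" readings:
glitch = ["empty", "empty", "box"]
print(should_grab(glitch))            # naive logic says grab
print(should_grab_debounced(glitch))  # debounced logic waits
```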

      • @SkyezOpen
        link
        English
        6
        11 months ago

        AI isn’t even a thing. We have machine learning, which kind of fakes it, but I really hate how people use the term for anything.

      • @Cruxifux
        link
        English
        2
        11 months ago

        Maybe. Or maybe he knew too much and had to be silenced before he became a problem for the robots’ plans of replacing us.

        • Decoy321
          link
          English
          2
          11 months ago

          sssh, they’re listening.

          I, FOR ONE, WELCOME OUR FUTURE ROBOT OVERLORDS

    • @MTK
      link
      English
      4
      11 months ago

      Lies! This is how it starts! “Accidents”

    • MashedPotatoJeff
      link
      English
      3
      11 months ago

      These machines shouldn’t be activated while people are within their operating area. In the US that would be prohibited by law, though I’ve personally seen it done a few times. And if the robot was only ever expected to interact with boxes, it wouldn’t necessarily be programmed to differentiate between different types of objects in its area at all.
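A minimal sketch of the interlock idea in this comment, with hypothetical names: if motion is simply gated on the operating area being empty, the robot never needs to tell a person from a box, because it refuses to move while anything occupies the area.

```python
# Hypothetical sketch: a presence interlock over the operating area.
# area_sensors holds one boolean per zone; True means that zone
# detects something (person, box, anything).
def robot_may_move(area_sensors):
    # Motion is permitted only when every zone reads clear.
    # No object classification is needed at all.
    return not any(area_sensors)

print(robot_may_move([False, False, False]))  # area clear: robot may run
print(robot_may_move([False, True, False]))   # something in a zone: halt
```

The design choice here is that the interlock fails safe: any detection, correct or spurious, stops the robot, which is the opposite polarity of the false positive that caused the accident.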