• @SCB
    2 · 1 year ago

    It’s not so much dystopian as it is just buggy software

    • kase
      2 · 1 year ago

      Ah ok. I don’t know much about it, but I’ve heard that AI can sometimes be negative toward commonly discriminated-against groups because the data it’s trained on is. (Side note: is that true? Someone pls correct me if it’s not.) I jumped to the conclusion that this was the same thing. My bad

      • @adrian783
        3 · 1 year ago

        What it did is expose just how much inherent bias there is in hiring, even from name and gender alone.

      • @SCB
        3 · 1 year ago

        That is both true and pivotal to this story

        It’s a major hurdle in some uses of AI

      • TAG
        1 · 1 year ago

        An AI is only as good as its training data. If the data is biased, then the AI will have the same bias. The fact that going to a women’s college was counted as a negative (rather than simply being marked as an education of unknown quality) is evidence against an idea many in STEM (myself included) have held: that there is a lack of qualified female candidates, but no active bias against them.
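[Editor's note: the claim above — that a model trained on biased decisions reproduces the bias, even through a proxy feature like "attended a women's college" — can be sketched with entirely synthetic, hypothetical data. The numbers below are made up for illustration; the "model" is just the simplest possible learner, one that memorizes historical hire rates.]

```python
# Minimal sketch with synthetic data: historical hiring decisions are
# biased against candidates from women's colleges, so a learner that
# fits those decisions inherits the bias verbatim.
import random

random.seed(0)

# Generate 10,000 synthetic historical candidates.
data = []
for _ in range(10_000):
    womens_college = random.random() < 0.5
    qualified = random.random() < 0.5
    # Biased historical label: qualification matters, but candidates
    # from women's colleges were hired less often regardless of it.
    hire_prob = 0.7 if qualified else 0.3
    if womens_college:
        hire_prob -= 0.2
    hired = random.random() < hire_prob
    data.append((womens_college, qualified, hired))

def hire_rate(rows, key):
    """Historical hire rate for the subgroup selected by `key` --
    exactly what a frequency-memorizing model would predict."""
    subset = [hired for wc, q, hired in rows if key(wc, q)]
    return sum(subset) / len(subset)

rate_wc = hire_rate(data, lambda wc, q: wc and q)
rate_other = hire_rate(data, lambda wc, q: not wc and q)

# Equally qualified candidates, different predicted outcomes:
print(f"hire rate, qualified, women's college: {rate_wc:.2f}")
print(f"hire rate, qualified, other college:   {rate_other:.2f}")
```

Even though "gender" never appears as a feature, the proxy carries the bias into the model's predictions, which is the mechanism the comment describes.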

    • @matter
      1 · 1 year ago

      When buggy software is used by unreasonably powerful entities to practise (and defend) discrimination, that’s dystopian…

      • @SCB
        2 · 1 year ago

        Except it was never actually launched, and they didn’t defend its discrimination; they ended the project.