An AI resume screener had been trained on CVs of employees already at the firm, giving people extra marks if they listed “baseball” or “basketball” – hobbies that were linked to more successful staff, often men. Those who mentioned “softball” – typically women – were downgraded.

Marginalised groups often “fall through the cracks, because they have different hobbies, they went to different schools”

  • @[email protected]
    link
    fedilink
    78
    edit-2
    10 months ago

    This is not any kind of modern “AI”. This is a fancy version of “key word filtering”. It’s been done for decades. Why, tech writers, why must you not use your brains when writing these articles? … We aren’t going to believe a word you write if you can’t get basic facts figured out.

    • @[email protected]
      link
      fedilink
      English
      3610 months ago

      Ah, but the AI part is that nobody knows what the keywords are, because it’s all mangled into some neural network soup.
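      A minimal sketch of that point, using made-up résumé snippets and scikit-learn (an assumption for illustration, not anything from the article): a classifier fit on past hiring outcomes picks up hobby words as predictive features even though nobody ever wrote a keyword list.

      ```python
      # Hypothetical data: no keyword list appears anywhere in this code,
      # yet hobby words end up carrying weight because the historical
      # labels are already skewed.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression

      resumes = [
          "software engineer, plays baseball on weekends",
          "project manager, captains a basketball league",
          "data analyst, organises the softball team",
          "developer, softball and community volunteering",
      ]
      hired = [1, 1, 0, 0]  # past outcomes, already biased

      vec = TfidfVectorizer()
      X = vec.fit_transform(resumes)
      model = LogisticRegression().fit(X, hired)

      # The learned weights are the "soup": there is no filtering rule
      # anyone can point to, but the hobby signal is in there.
      weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
      for word in ("baseball", "basketball", "softball"):
          print(word, round(weights[word], 3))
      ```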

    • @jj4211 · 12 points · 10 months ago

      They use their brains just fine. They know AI is clickbait gold, and that’s all that matters.

      A few well-informed people get turned off by it? Who cares? They got a big chunk of readers from news aggregators.

    • @[email protected]
      link
      fedilink
      English
      6
      edit-2
      10 months ago

      No, it’s pretty clear that this is a result of modern “AI”… keyword filtering wouldn’t push applicants mentioning basketball/baseball up and softball down unless HR was explicitly being sexist and classist/racist like that.

      I mean, the problem certainly existed before ML and AI were being used, but this is pretty clearly the result of an improperly vetted training dataset, which is very different from keyword filtering. I don’t think HR a decade ago was adding or deducting points on résumés for mentioning sports or hobbies irrelevant to the job.
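      To make that distinction concrete, here is a small hypothetical contrast (the function and keyword names below are made up for illustration, not taken from any real screening product): keyword filtering is an explicit rule a human writes down, whereas a trained model has no keyword list at all; any bias emerges from whatever correlations sit in the historical data.

      ```python
      # Hypothetical contrast, for illustration only.

      # 1. Keyword filtering: an explicit, auditable rule someone wrote.
      def keyword_filter(resume: str, required_keywords: set[str]) -> bool:
          """Pass the resume only if it mentions at least one required keyword."""
          text = resume.lower()
          return any(keyword in text for keyword in required_keywords)

      print(keyword_filter("data analyst, organises the softball team", {"python", "sql"}))  # False

      # 2. A trained model (like the sketch above) has no such list; its
      #    "rule" is a set of learned weights, so a preference for baseball
      #    over softball can exist without anyone having written it down.
      ```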