In every reported case where police mistakenly arrested someone using facial recognition, that person has been Black

Facial recognition software has always had trouble telling Black people apart, yet police departments are still using it.

  • @[email protected]
    link
    fedilink
    English
    351 year ago

    This isn’t new. It’s been a known problem for a long time, because facial recognition software is trained mostly on white faces. So it gets really, really good at differentiating between white people. But with Black people making up a tiny fraction of the sample data, it basically just learns to differentiate them in broad strokes. It’s good at telling them apart from white people, but not much else.
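A toy Python sketch of the imbalance described above, and of one common mitigation, oversampling the underrepresented group so the model sees it as often as the majority (the group labels and counts here are my own illustration, not from the thread):

```python
import random

# Hypothetical dataset: 95% group "A" faces, 5% group "B" faces.
random.seed(0)
dataset = ["A"] * 950 + ["B"] * 50

# Split by group, then sample each group (with replacement) up to the
# size of the largest group so both are equally represented.
groups = {g: [x for x in dataset if x == g] for g in ("A", "B")}
target = max(len(members) for members in groups.values())

balanced = []
for g, members in groups.items():
    balanced += random.choices(members, k=target)

print(len(balanced), balanced.count("B"))  # 1900 950
```

Oversampling only rebalances what the model sees; it can’t add detail that the minority-group images never contained in the first place.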

    • @[email protected]
      link
      fedilink
      English
      34
      edit-2
      1 year ago

      It’s not just a training issue. Lighter (color) tones reflect; dark tones absorb. There have been lots of issues with cameras, or even just sensors, struggling with darker skin tones because of the lower reflectivity/contrast of dark tones. 3D scanners - even current models - have similar issues with objects that have black parts, for similar reasons. Training on more data can help, but there’s still an overall technical difficulty to overcome as well (which is also a good reason that using facial recognition in this manner is just bullshit, period).
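The sensor-side point can be illustrated with a toy signal-to-noise calculation (a Python sketch; the noise floor and reflectance values are made-up numbers, not measurements):

```python
# With a fixed sensor noise floor, the signal from a low-reflectance
# (dark) surface sits much closer to the noise, so the same absolute
# feature difference is harder to resolve.
NOISE_FLOOR = 0.02  # hypothetical sensor noise, in reflectance units

def snr(reflectance: float) -> float:
    """Crude signal-to-noise ratio for a surface of given reflectance."""
    return reflectance / NOISE_FLOOR

print(snr(0.60))  # ~30: light surface, signal well above the noise
print(snr(0.10))  # ~5: dark surface, far less headroom for detail
```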

      • @[email protected]
        link
        fedilink
        English
        41 year ago

        As a technological problem it could have a partial technological solution: the darker the skin, the higher the threshold to declare a match. This would also mean more false negatives (real matches not caught by the software), but there’s not much to do about that.
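That suggestion could look something like this in code (a Python sketch; the base threshold and penalty values are arbitrary placeholders, not from any real system):

```python
def match_threshold(base: float, tone: float, penalty: float = 0.15) -> float:
    """base: threshold for the lightest tones; tone in [0, 1], 1 = darkest."""
    return min(1.0, base + penalty * tone)

def is_match(similarity: float, tone: float) -> bool:
    # Darker tone -> higher bar to declare a match: fewer false positives,
    # more false negatives, exactly the trade-off described above.
    return similarity >= match_threshold(0.80, tone)

# The same similarity score passes for a light tone but not a dark one:
print(is_match(0.85, tone=0.0))  # True  (threshold 0.80)
print(is_match(0.85, tone=1.0))  # False (threshold 0.95)
```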

    • @Chunk · 6 · 1 year ago

      I’m interested in what dataset they’re using, because simply adding more Black people to the training set seems like a pretty straightforward fix.

      • @elephantium · 2 · 1 year ago

        It seems like past mugshots would be an ideal part of the training set. Are they not using those?

        Personally, I’m leaning towards “It’s not the image recognition program that’s the problem here.”

  • @[email protected]
    link
    fedilink
    English
    221 year ago

    Isn’t face recognition just going to be inherently less reliable on darker-skinned people? Their features would certainly have less contrast on darker skin, no?

    • @charles · 5 · 1 year ago

      Why not expose pictures longer to get better features of darker-skinned people and less accurate ones of lighter-skinned people, leading to more false arrests of lighter-skinned people?
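Longer exposure does brighten dark regions, but it also clips highlights, which is the trade-off implied here. A minimal Python sketch (the pixel values and the clipping model are my own simplification):

```python
def expose(pixels, stops):
    """Brighten by `stops` f-stops, clipping at the sensor's maximum (1.0)."""
    factor = 2 ** stops
    return [min(1.0, p * factor) for p in pixels]

# Dark facial features plus one bright highlight:
face = [0.05, 0.10, 0.15, 0.90]
brighter = expose(face, 2)  # dark values quadruple; the highlight clips to 1.0
print(brighter)
```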

      • @SlopppyEngineer · 5 · 1 year ago (edited)

        And remove the convenient excuse to harass black people? These are cops we’re talking about.

    • @firadin · 4 · 1 year ago

      Doesn’t the fact that a technology is fundamentally discriminatory mean we should question the use of that technology? Not just shrug our shoulders and say too bad?

      • Throwaway · 1 · 1 year ago

        Shouldn’t use it at all, but the tech isn’t intentionally malicious, just a fact of the tech.

        • @charles · 1 · 1 year ago

          Exposure is a dial, not a technology. People are choosing to use it this way.

          • Throwaway · 1 · 1 year ago

            Hell of a lot easier when it’s a white dude on a dark background than a black dude on a dark background. The technology is just not being used in the way it should be.

    • @DoomBot5 · 3 · 1 year ago

      If white, politely ask

  • @Ensign_Crab · 4 · 1 year ago

    So it’s acting like real cops and lying about black people “fitting the description.”