More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.

The moderators, who worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm, were diagnosed with PTSD, generalised anxiety disorder (GAD) and major depressive disorder (MDD) by Dr Ian Kanyanya, the head of mental health services at Kenyatta National hospital in Nairobi.

The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

The images and videos, including necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.

  • @[email protected]
    335 days ago

    Companies keep talking about replacing employees with AI yet they keep up this fuckery. Y’all’s AI models are either good enough to handle this shit or shouldn’t be used as a bad-faith bargaining chip. If there were ever a job that should be eliminated from human labor, NSFL content moderating seems like the perfect contender.

    • @YarHarSuperstar
      95 days ago

      I have heard that folks from African countries who are hired to train those AI models are also reporting abuses. So imo that’s not really a solution either

    • DigitalDilemma
      15 days ago

      Rather a cynical take here, but perhaps that’s what’s coming and these jobs are going to be made redundant shortly, so they’re filing a claim while they still can.

    • @TrickDacy
      link
      15 days ago

      I’m pretty sure this is actually referring to work done by humans long before the “AI” fad.

  • @[email protected]
    5 days ago

    The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

    I tried to write different things about this but that shit speaks for itself, fuck this world.