‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma

More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.

  • @dreadedsemi
    link
    English
    121
    1 year ago

    Couldn’t they hire from watchpeopledie, nothingtoxic, or ebaum? Those users would probably do overtime for free.

    • @[email protected]
      link
      fedilink
      English
      87
      1 year ago

      People who are completely desensitized to that kind of stuff probably wouldn’t be very good at moderating it, really.

      Also, this is a terrible job, and I’d be very worried about a company paying and enabling people who find it fun. It’s horrible, but trauma is the normal outcome.

      • @WhatAmLemmy
        link
        English
        44
        1 year ago

        Sounds like the perfect job for AI

          • @[email protected]
            link
            fedilink
            English
            18
            1 year ago

            Maybe they still have the content that was removed for that reason; you might be able to train an AI just on that. That way they wouldn’t need to check it manually, since it’s already been done, after all.

            • @[email protected]
              link
              fedilink
              English
              13
              edit-2
              1 year ago

              Exactly, and even if the uploader disagrees and requests human oversight, that’s just one image that needs checking rather than all of them. AI might even be able to blur the most brutal and extreme parts of footage and create written transcripts of the audio. You don’t need 4K resolution and audible screaming to understand that someone is being murdered or raped.

              • @[email protected]
                link
                fedilink
                English
                1
                1 year ago

                True, I hadn’t thought about that. Blurring and transcripts are also way more reliable, so it would work correctly almost always.

        • @nodsocket
          link
          English
          14
          edit-2
          10 months ago

          deleted by creator

        • @[email protected]
          link
          fedilink
          English
          10
          1 year ago

          I’m usually very wary about what should or shouldn’t be an AI’s job, but you know what, in this very particular case, I think I agree.

          At least as a first filter, anyway.

        • @[email protected]
          link
          fedilink
          English
          4
          1 year ago

          Huge industries are emerging in this field right now, covering everything from this type of social media moderation to fighting CSAM more effectively, so humans don’t have to be the front line for that kind of material. This is one area where I can really get behind AI and see a valid use case that isn’t just marketing hype like so many others. I know there’s some great stuff happening, just from my own field of employment and being close to a few things in the works this year.

      • fadingembers
        link
        fedilink
        English
        20
        1 year ago

        Honestly, I don’t see an issue with it. If they can tell the difference between an image that should be moderated and one that shouldn’t, they can do the job, and I seriously doubt the vast majority of people desensitized to that kind of content can’t tell the difference. That’s like arguing we shouldn’t make graphic games or movies because people won’t be able to tell the difference between them and reality. Not everyone can do every job; these people would be the perfect fit for it, and we would spare others from getting hurt.

        • @[email protected]
          link
          fedilink
          English
          15
          1 year ago

          Desensitized doesn’t necessarily mean somebody doesn’t have reactions to something. It just means they can compartmentalize those reactions and move forward and deal with the ramifications later.

          EMTs, ER doctors, and nurses are largely desensitized to graphic trauma and can press through and get the job done. But that doesn’t mean they don’t process those scenes later, in both healthy and unhealthy ways (there are a few studies out there showing that ER staff have higher rates of alcoholism and substance abuse than the general public).

          Trauma is trauma, whether you’re desensitized or not.

        • @[email protected]
          link
          fedilink
          English
          6
          edit-2
          1 year ago

          It would be highly unethical but interesting research to see whether those people experience long-term consequences nevertheless, or whether being desensitized really does confer immunity.

        • @[email protected]
          link
          fedilink
          English
          5
          1 year ago

          Except, you know, we’re talking about people who are progressively desensitized to reality. So no, that’s not comparable at all.

        • @Mr_Dr_Oink
          link
          English
          2
          1 year ago

          Exactly. If they couldn’t tell the difference, then how would they know which content to seek out for their own enjoyment? It might not affect them much anymore, if at all, but they know what ‘it’ looks like.

          Can you imagine them watching a cute cat video over and over and wondering why they aren’t getting the rush they must feel when watching gore?

          I remember in the early days of the internet, I clicked a link on a forum and ended up watching a video of some guy being decapitated. I have never forgotten that image, 20+ years later, and I know I would be checking into a mental hospital if I had the job these Facebook staff have had to do. But there are people who like this sort of stuff, and it’s not because they have forgotten what decapitation looks like.

      • @dreadedsemi
        link
        English
        2
        1 year ago

        I’ve seen users laugh at horrific gore videos on some forums. I’m not sick, but I was curious at one point and googled.

  • @kayrae_42
    link
    English
    45
    1 year ago

    Secondary trauma is very real. I’ve done freelance work around focus groups for some traumatic subjects, and the person I worked with made sure I had proper support and took time to process the disturbing material. I don’t understand why, once they had obviously found a video that violated the guidelines, they had to watch the entire thing, especially if they weren’t given proper psychological support or processing time.

    Jobs that involve this type of imagery are traumatic and should be treated as such: extra vacation time and proper psychological support, not just “you’re doing important work” but actual trauma-processing work. But when has a large company like Meta cared about its employees?

    • @SoleInvictus
      link
      English
      7
      1 year ago

      But when has a large company like Meta cared about its employees?

      You pretty much nailed it. A company like Meta would never fail to maximize their profit, even if the result is detrimental to employees and/or users. Facebook is demonstrably detrimental to society in general, yet they don’t care - gotta profit more and more for the stockholders, consequences be damned.

  • ijeff
    link
    fedilink
    English
    41
    1 year ago

    I’ve heard this is also the case for civilians and officers working for police departments who are responsible for handling evidence related to child abuse. Takes a lot of psychological support, which I’m not sure can ever be enough.

    • @Gradually_Adjusting
      link
      English
      22
      1 year ago

      Not every job is something humans are cut out for. This is a job AI should be taking off our plates.

    • gregorum
      link
      fedilink
      English
      12
      1 year ago

      Many places like that have protocols for how long your shifts can be and how long you can do it, with constant psych support while you do it in order to reduce and mitigate the impact of the material. Meta may have been pushing their workers too hard and cutting corners.

    • @[email protected]
      link
      fedilink
      English
      26
      1 year ago

      Both lawyers agree that Meta’s policy of forcing employees to watch the entire video in order to explain all the reasons for censorship aggravates the trauma.

      It’s in the article.

    • Natanael
      link
      fedilink
      English
      3
      1 year ago

      It’s trivial to circumvent automatic detection

    • @[email protected]
      link
      fedilink
      English
      3
      1 year ago

      The EU now has a rule that all reports of content must be checked and verified for illegal content like misinformation. They can’t automatically block that content because then people would weaponise reports. At best they can automatically block video and image hashes which have been previously verified as illegal, but these are trivial to circumvent. I think they’ve started using perceptual hashes but these are far from perfect.

      I believe they use similar moderation for the US, to proactively head off legislation similar to the EU’s.

      Something like 3 billion people actively use Facebook each month. There must be tens of millions of daily reports. I can only imagine the level of planning, staffing, and tools which are required to facilitate that.
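      To make the perceptual-hash point above concrete, here is a toy sketch (illustrative only, not Meta’s or anyone’s actual system): an “average hash” marks each pixel as brighter or darker than the image’s mean, so mild re-encoding leaves the hash intact, while a deliberate edit flips many bits, which is exactly why such hashes are trivial to circumvent.

```python
def average_hash(pixels):
    """Toy perceptual 'average hash' of a small grayscale image
    (a list of rows of 0-255 values): each bit is 1 if that pixel
    is brighter than the image's overall mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count of differing bits; a small distance means the images
    are perceptually similar."""
    return sum(a != b for a, b in zip(h1, h2))

# Two near-identical 2x4 "images": re-encoding shifts brightness
# slightly, but the bright/dark pattern (and thus the hash) survives.
original = [[10, 200, 30, 220], [15, 210, 25, 230]]
reencoded = [[12, 198, 33, 219], [14, 213, 24, 228]]
assert hamming(average_hash(original), average_hash(reencoded)) == 0

# A deliberate edit that rearranges the bright/dark layout flips
# many bits, so the banned-content match is easily evaded.
edited = [[200, 10, 220, 30], [210, 15, 230, 25]]
assert hamming(average_hash(original), average_hash(edited)) > 2
```

Real systems use far more robust hashes (e.g. over frequency-domain features), but the match/evade trade-off is the same.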

  • AutoTL;DR
    link
    fedilink
    English
    15
    1 year ago

    This is the best summary I could come up with:


    More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.

    More than 20% of the staff of CCC Barcelona Digital Services - owned by Telsus, the company that Meta hired to check the content of Facebook and Instagram, are on sick leave due to psychological trauma.

    The images posted on the social networks they were supposed to check showed the worst of humanity: videos of murders, dismemberments, rapes and live suicides.

    “He sticks a knife in its chest, rips out its heart and eats it,” Francesc Feliu, lawyer for more than a dozen workers who decided to sue the company, told Euronews.

    “The psychologist would listen to them and then tell them that what they were doing was extremely important for society, that they had to imagine that what they were seeing was not real but a film, and that they should go back to work,” says the Spanish lawyer.

    Both lawyers agree that Meta’s policy of forcing employees to watch the entire video in order to explain all the reasons for censorship aggravates the trauma.


    The original article contains 986 words, the summary contains 191 words. Saved 81%. I’m a bot and I’m open source!

  • @wildcardology
    link
    English
    -9
    1 year ago

    Should there be a disclaimer before hiring? I mean, don’t apply for the job if it’s going to bother you. I sure hope they weren’t forced by the boss to suck it up.

    I admit I’m a bit desensitized to gore but I don’t know if I can handle truly brutal videos.

    • @braxy29
      link
      English
      4
      edit-2
      1 year ago

      probably there is a disclaimer of some sort, and people think they can handle it. but also, even people who worry they can’t may be desperate for a job.

      i work with people who have experienced terrible things, but it’s a little easier to manage because i am hearing about it (only to the extent that people want to talk) and not seeing it, so i get a little psychological distance. my environment is very supportive and i’m hearing things sometimes, not watching it hundreds of times in a day. even then, people in my role are at decent risk of burnout.

      like, you never know when you’re going to hear a thing that just gets to you.

      doing what these mods do for hours and hours a week without support seems like a recipe for secondary trauma and burnout. edit - and yes, i’m pretty sure their environment amounts to “suck it up and keep going.”

      • @wildcardology
        link
        English
        2
        1 year ago

        I guess it’s the same when joining the military, no amount of video games or war movies will prepare you for the horrors of war.

  • Doctor xNo
    link
    fedilink
    English
    -55
    1 year ago

    Pretty sure that if you see a lot of “violent content” on your Facebook, you’re either following the wrong groups or have the wrong friends whom you chose to allow to post it (or you’re specifically looking it up 😅). If there’s anything Facebook is good at, it’s keeping people cosy in their selective information bubble. I myself have had maybe 5 posts with shocking content I’d rather not have seen in the 14 years I was on Facebook, and I think most of them came from the same one person, whom I eventually asked to exclude me if they posted anything like that again… 😅

    Not gonna argue that visual shock trauma isn’t a thing, it surely is! (I have sadly collected a small (mind-)museum of those in my lifetime myself.) Nor am I gonna claim Facebook “isn’t that bad”, it definitely is too! But this just has moneyfishing written all over it.

    👴🏼 Oh, well,… Back in my day, when we saw something that would scar us for life, we’d immediately call as many friends possible to come look at it too… 👴🏼 😅

    • @[email protected]
      link
      fedilink
      English
      28
      1 year ago

      This is about the people who have to check whether this stuff violates the rules, not users who happened to see it.

      • Doctor xNo
        link
        fedilink
        English
        -2
        1 year ago

        In my defense, I was pretty drunk when I commented this…

        Don’t drink and Lemmy… 😅

    • Zoidsberg
      link
      fedilink
      English
      17
      1 year ago

      Did you read the article or just skim the headline?

      • Doctor xNo
        link
        fedilink
        English
        -2
        1 year ago

        No, I read the headline and skimmed the article. 😅

    • @nodsocket
      link
      English
      -20
      edit-2
      10 months ago

      deleted by creator

      • @SomeSphinx
        link
        English
        3
        1 year ago

        Idk why this was downvoted; it’s pretty obvious sarcasm.

        • @nodsocket
          link
          English
          3
          edit-2
          10 months ago

          deleted by creator