A lawsuit filed by more victims of the sex trafficking operation claims that Pornhub’s moderation staff ignored reports of their abuse videos.

Sixty-one additional women are suing Pornhub’s parent company, claiming that the company failed to take down videos of their abuse by the sex trafficking operation Girls Do Porn. They are suing the company and its sites for sex trafficking, racketeering, conspiracy to commit racketeering, and human trafficking.

The complaint, filed on Tuesday by plaintiffs represented by Holm Law Group, includes what it claims are internal emails between Pornhub moderation staff. The emails allegedly show that Pornhub had only one moderator to review 700,000 potentially abusive videos, and that the company intentionally ignored repeated reports from the victims in those videos.

The damages and restitution they seek amount to more than $311,100,000. They demand a jury trial and seek damages of $5 million per plaintiff, as well as restitution of all the money Aylo, the new name for Pornhub’s parent company, earned “marketing, selling and exploiting Plaintiffs’ videos in an amount that exceeds one hundred thousand dollars for each plaintiff.”

The plaintiffs are 61 more unnamed “Jane Doe” victims of Girls Do Porn, adding to the 60 who sued Pornhub in 2020 over similar claims.
Girls Do Porn was a federally convicted sex trafficking ring that coerced young women into filming pornographic videos under the pretense of “modeling” gigs. In some cases, the women were violently abused. The operators told them the videos would never appear online, so that their home communities wouldn’t find out, then uploaded the footage to sites like Pornhub, where the videos went viral and, in many instances, destroyed the women’s lives. Girls Do Porn was an official Pornhub content partner, and its videos frequently appeared on the front page, where they gathered millions of views.

read more: https://www.404media.co/girls-do-porn-victims-sue-pornhub-for-300-million/

archive: https://archive.ph/zQWt3#selection-593.0-609.599

  • Ragdoll X · 8 months ago

    AI image generators don’t really lead to centralization - quite the contrary in fact. While there are your DALL-Es and ChatGPTs behind closed doors, there’s also Stable Diffusion and its many variants, along with various open-source Large Language Models and several other projects from hobbyist developers. I’ve seen a lot of people make and post their own AI-generated porn with Stable Diffusion, and some who make money out of it. So while some porn actors/actresses may lose their jobs because of AI, this technology is also creating opportunities for other people.

    And the same can be argued about any kind of automation, so how far should we go with this idea? Should mechanical looms be banned to bring back manual weaving jobs? Should automated filters on social media be removed to create more jobs for content moderators?

    I don’t think AI/automation is the problem. A world where most jobs are automated isn’t a bad thing - a world where money takes precedence over humans and people are punished if they’re out of work (i.e. capitalism) is.

    • @TwilightVulpine · 8 months ago

      Should automated filters on social media be removed to create more jobs for content moderators?

      Maybe not removed, but we absolutely need many more people moderating online platforms. So many of the problems with automated content moderation systems come down to the lack of humans reviewing content. Including this very situation, where the site let a lot of sexual abuse material through.

      I don’t think AI/automation is the problem. A world where most jobs are automated isn’t a bad thing. A world where money takes precedence over humans and people are punished if they’re out of work - i.e. capitalism - is.

      Yes, but advances in automation consistently come with promises of better lives for people that never materialize. For decades people have been saying we have the means to let everyone work fewer hours a day and fewer days a week; instead people get fired, even fewer people are employed, and those who remain are overworked beyond the limits that worker movements had won before.

      Will AI really help people, or will it just make things even harder for the people who do sex work willingly? Given how twisted this industry is, maybe a little of both; it could turn out to be a net positive, though that’s hard to judge. But other fields are probably only going to get the hardship.

      Let’s be honest, the whole point of automation is to do more work than what it replaces, so it never creates as many jobs as it takes away. Even worse, AI in particular is already primed to replace the same tech, service and artistic jobs that previous forms of automation freed us to take up. We will not get the same number of jobs from AI.

      What then? Back to sweatshops, to try to undercut the automation we can’t outperform? We can’t keep shrugging with “oh well, Capitalism still hasn’t changed ¯\_(ツ)_/¯”.

    • @thenightisdark · 8 months ago

      I will say that, unlike the horse-and-buggy makers or the barrel makers or the candlestick makers who have all lost their jobs, I do admit…

      None of those are as inherently human as sexuality is.

      Capitalism makes a great cell phone. Capitalism is terrible when it takes precedence over people.

      • @Cryophilia · 8 months ago

        My hope is that this will kill off the makeup-crusted dead-eyed fake moan human doll bullshit that is mainstream porn.

        AI can’t fake all the randomness and idiosyncrasy of two real people having real sex. Maybe that’s what human porn will coalesce around.