Deepfake Porn Is Out of Control
New research shows the number of deepfake videos is skyrocketing—and the world’s biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.

  • @[email protected]
    109
    1 year ago

    From my perspective, deep fakes will lead to a short but massive peak of harassment until everyone is aware of the technology and its capabilities. Once the technology reaches the mainstream and anyone can generate such content with ease, people will just stop caring. If these videos are everywhere, it’s easy to play one off as a fake. That might even help victims of actual revenge porn. Virtual nudity will become less of a big deal, probably even in real life.

    From my perspective, the bigger issue with deep fakes is news. We already have a huge problem with lies on social media, and even on TV and in newspapers, and once we can no longer trust what we see, it will be incredibly hard to build trust in any source.

    Fake videos of politicians being spread to harm their credibility, fake videos of war crimes to justify an attack. Or vice versa: if there’s an authentic video of a crime, the offenders will just deny its authenticity. But in contrast to Trump’s “fake news” claims today, it will be more or less impossible for normal people to fact-check anything.

    • Dudewitbow
      20
      1 year ago

      Although not related to porn, a lot of scam operations in India already use it as a defense. It’s extremely hard to get someone in the field in trouble because you need evidence for a raid, and it can’t be video or audio, because they claim the medium in question is a deepfake.

        • @[email protected]
          4
          1 year ago

          It’s just a poor-country thing. How are you gonna prioritise crime happening in other countries when you can barely keep up at home?

          It’s kind of like when third-world countries get criticized for poor women’s rights or LGBTQ rights while a third of the country lives in absolute poverty. The former things are important, but the latter causes suffering on a whole different scale.

  • @CatZoomies
    76
    1 year ago

    This is a sad article to read. I’m not a woman, nor am I a young adult growing up with all this technology that can be leveraged against me. Could you imagine being a junior-high or high-school student and having an anonymous classmate create deepfake porn of you from your yearbook photo? And the children in your class gossiping about you, sharing your porn video/photo online with their friends, and you enduring that harassment? The damage that too much pornography causes to our psychological development is already well documented; now imagine the consumer of this content being around the victim. That harassment can get so much worse.

    I can’t even begin to fathom what kind of psychological damage this will cause to the youth. I feel for women everywhere - this is a terrible thing people are doing with this technology. I can’t imagine raising a daughter in this environment and trying to help her navigate this problem when some asshole creates deepfake porn of her. My niece is currently getting bullied in school - what if her bullies use these tools against her? This just makes my blood boil.

    It’s bad enough that, since social media rose and captured the attention spans of kids and teenagers, there has been a well-documented correlation with an increase in youth suicide rates since 2009: https://www.health.com/youth-suicide-rate-increase-cdc-report-7551663 . Now there is a nonconsensual AI-generated porn era to navigate as well.

    These are dangerous times. This opens people up to attack, and regulation to increase the friction of accessing these tools is one of the next most important steps to take. Granted, outright bans never work (the persistent will always get their hands on the tools), but we need to put controls in place to limit access. Then we can remediate the root causes of these problems (e.g., proper systemic education, a modified sexual education in schools that addresses things like consent, etc.).

    EDIT:

    Wanted to also add, after I posted this, that a common argument I hear parroted by people is this:

    • People are gonna do this AI generation anyway. It’ll get to the point that you won’t be able to tell what’s real or not, so women can just deny it. You can’t prove it’s real anyway, so why bother?

    This is another way of saying “boys will be boys” and ignoring the problem. The problem is harassment and violence against women.

    • @[email protected]
      11
      1 year ago

      Where did all the replies to this post go? There was an entire discussion that is now gone, and nothing in the modlog.

      • @[email protected]
        9
        1 year ago

        After some testing, it might be that the parent commenter just deleted their comment, which nuked all the child comments. I can’t remember if this is what Reddit does. I think it just says “Deleted by creator” but keeps the children. Could certainly be wrong, though.

        • SatansMaggotyCumFart
          7
          1 year ago

          I’ve found that if someone deletes their comment, then everything below it disappears.

          • @[email protected]
            7
            1 year ago

            Yup, it appears that our entire comment chain got nuked. So it is now confirmed that if you delete the parent, then all children get removed as well.

            For anyone reading this message, the context is that we tested it: I replied to OP’s previous comment, OP responded to me, and then I deleted my comment to see if their comment also got deleted.

            • SatansMaggotyCumFart
              2
              1 year ago

              From my testing it only removes them from the thread view, but you should still be able to open them by clicking on the reply in your inbox.

    • Dark Arc
      -1
      1 year ago

      This is another way of saying “boys will be boys” and ignoring the problem.

      I don’t think that’s at all similar. “Boys will be boys” is “we know it’s bad, but we can’t stop them.”

      The argument is… is it really bad? After all, isn’t it the “scandal” that really causes the damage? It’s not like any harm is directly done to the person; someone could’ve already done this to me, and, well, I wouldn’t be any the wiser. It’s when they start sharing it, and society reacts as if it’s real and there’s something scandalous, that there’s a problem.

      If we stop considering it scandalous… the problem kind of goes away… It’s not much different than AI photoshopping a hat onto someone, which they may or may not approve of.

      This opens persons up for attack, and regulation to increase friction to access these tools is one of the next most important steps to take.

      I’ve never researched these tools or used them… But I’d wager that’s going to be next to impossible. If you think the war on drugs was bad… A war on a particular genre of software would be so much worse.

      Like a lot of things… I think this is a question of how we adapt to new technology, not how we stop it. If I actually believed this was stoppable, I might agree with you… But it actually seems more dangerous to try to make the tools hard to obtain than to just give people plausible deniability.

      You mentioned bullying; I’m definitely empathetic to that. I don’t know that this would really make things worse vs the “I heard Katie …” rumor crap that’s been going on for decades. Feminism has argued for taking the power away by removing the taboo around women having sex lives … and that seems equally relevant here.

      Either way, it really seems like a lot more research is needed.

      • @Eranziel
        6
        1 year ago

        “Just stop considering it scandalous” is a severe lack of imagination. Even if/when the stigma of “having a sex life” is gone, the great majority of people consider their sex life to be private. Video floating around that looks like you having sex is a very different thing to hearsay rumors.

        Keep in mind that the exact same techniques could be used to sabotage adult relationships, marriages, careers, just as easily as teenage bullying. This isn’t a problem society can shrug away by saying sex should be less stigmatized.

        • Dark Arc
          0
          1 year ago

          Keep in mind that the exact same techniques could be used to sabotage adult relationships, marriages, careers, just as easily as teenage bullying.

          And a “video” should ruin those things why?

          Literally everything you listed is because society is making a big stink of things that don’t matter.

          Why should your job care … even if it’s real?

          If somebody didn’t cheat, and there’s no reason to believe they did other than a … suspiciously careless video of someone who looks like them … why in the world should that end their relationship?

          Not to mention, AI isn’t going to get the details right. It’s going to get the gist right, but anyone who’s actually seen you naked is presumably going to be able to find some details that are off.

          Also in terms of privacy, your privacy wasn’t violated. Someone made a caricature of you.

          Video floating around that looks like you having sex is a very different thing to hearsay rumors.

          It’s really not. The only reason it seems different is that video has been trustworthy for the past century; now it’s not.

          I hope you folks downvoting me have some magic ace up your sleeves, but I see no way past this other than through it. Just like when the atom bomb was invented, it’s technology that exists now, and we have to deal with it. Unlike the atom bomb, it’s just a bunch of computer code, and at some point pretty much any idiot is going to be able to get their hands on a convincing version of it. Also unlike the atomic bomb, it can’t actually kill you.

  • @alienanimals
    27
    1 year ago

    AI and deepfakes aren’t going to stop. Schools need to get with the times rather than pretending like it’s the year 1960.

    Teachers should be able to deliver meaningful punishments to students. If someone gets caught passing these around, then that person should catch some flak. And none of that punishing both the victim and the bully, like most schools do.

    • @[email protected]
      -1
      1 year ago

      Schools won’t care. They never did and never will. This will just be a new tool for bullies to use.

      They should actually teach about it. Maybe even teach how to use it.

  • Heratiki
    8
    1 year ago

    How would you police this without the policing itself being abused?

    It’s pretty easy to spot deep fakes, even now. The type of porn being created as deep fakes is just too unbelievable when it comes to the actors and actresses. People aren’t deep-faking intimate, loving porn; it’s nearly always straight-up hardcore pornography being made when deep fakes are involved. I feel like every scenario described as so evil is just a straw-man argument. Hell, anyone who believes hardcore pornography is what actually happens in reality is a moron. The amount of bullshit incest porn on these same websites is just bonkers. That being said, I can see how it can affect some people.

    But guess what? Humans tend to look similar, so how do you stop it when you don’t know if something is real or fake? And how crazy easy will it be to create yet another advantage for those with power or financial success? Examples:

    A politician is seeing a prostitute and abusing his status to do so. The prostitute records a secret sex tape of him raping her and threatening to have her arrested if she doesn’t submit to what he wants. The video goes public; the politician claims it’s a deep fake, and the prostitute is arrested anyway. Or the reverse: a prostitute deep-fakes the video and threatens the politician. Reports of the politician glancing at a woman other than his wife had come out just before the deep fake, so the populace sides with the prostitute and the politician is arrested.

    Or how about this: a woman who looks just like Taylor Swift decides she wants to work in pornography. Her likeness is immediately noticed and becomes part of her popularity, though she isn’t billed as such. T Swizzle claims it’s a deep fake made to disparage her, and the porn actress is ruined, if not sued into oblivion.

    So many scenarios could go either way. You can’t ban the technology, because you’ll never be able to reliably know which is which. And just like with cryptography, banning it will not keep it away from those who would use it unlawfully.

    So what’s the solution? Get over the lunacy of the whole thing? What options do we really have? Since we have few, if any, options, all we’re doing is sending clicks to news sites that have nothing else to write about. I’m not saying it’s not a problem; I’m just not seeing a solution, and I don’t see a need to keep beating a dead horse.

  • @[email protected]
    1
    1 year ago

    Society’s views on sexuality will change before we will EVER get a serious handle on deepfakes. If you’re rich and can afford the lawyers, go ham and sue; otherwise, it’s time to just accept that humans are animals, and animals fuck.

    Whether or not someone is or is not in a porn video is less important than whether or not they can do whatever job or task they’ve been given.

    Religious puritanical morons and prudes need to stfu and get over it, and the victims need to cope with the reality that this is never going away. They can spend their entire lives and fortunes on ‘finding the one who did this’ or just move on and put their energy into something worthwhile.

    Even complaining about this is hysterically moronic. The ‘big threat’ is fake porn.

    Fixing the child-care system so that child abuse (emotional, physical, and sexual) gets reduced even 1% would be an immensely more worthwhile task than any pursuit against a technology that is open source and widely available. Not to mention that even if it were made illegal in your country, good luck actually enforcing a law like that without going 110% dystopian, with a locked-down internet that would make current Chinese life look like a kind big-brother system.

    • @echo64
      21
      1 year ago

      So, I think it’s good to identify why nonconsensual pornography is considered a bad thing, and why its being considered a bad thing is different from what you are saying here.

      Deepfake pornography is incredibly invasive for the people targeted (who are not just famous people). Your image is out there doing things you would be horrified to have on camera. It can destroy people’s health and cause huge problems, especially for people who are already being harassed by others.

      It’s not pearl-clutching. It’s a rather damaging technology that benefits no one.

      • @fubo
        1
        1 year ago

        By default, creating and publishing “deepfake porn” of a real person constitutes defamation against that person, as it carries the false statement “this person posed for this picture” which is likely to cause that person harm. Often, the intention is to cause harm.

        As such, we don’t need new laws here. Existing laws against defamation just need to be applied.

        • @sfgifz
          1
          1 year ago

          Let’s see you put your money where your mouth is after some deep fakes of your turd-eating fetishes get sent to your friends.

        • @[email protected]
          1
          1 year ago

          Look, folks, it’s the Simone Biles of mental gymnastics. You have some serious growing up to do if that’s your argument. Just because it’s potentially fake doesn’t make it any less of an invasion of privacy. So your argument is that everyone shouldn’t give a shit about privacy, especially their own?