Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It was the kind of anecdote that happened in other people’s lives, but wouldn’t happen in mine,” thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. Subject line: “Realistic?” “We wonder which photo would resemble you best,” she reads.

Attached were five photos of her.

In the original photos, posted on her social media, Julia poses clothed. Before her eyes are the same photos. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

    • @[email protected]
      link
      fedilink
      549 months ago

      It’s much easier to do now. You should be able to do several in a single minute and the barrier to entry of using the software is way lower than Photoshop. Legally though, these seem indistinguishable.

    • @fidodo · 24 points · 9 months ago (edited)

      They’re easier to create and more realistic. The prevalence and magnitude of an immoral act impact how it should be legislated. Personally, I don’t care if people make these and keep them to themselves, but as soon as you spread them, I think it’s immoral and harassment, and there should be laws to prevent it.

    • @[email protected]
      link
      fedilink
      179 months ago

      Probably should have sued those people too… People need to cut this shit out. You’re fucking with other people’s lives.

      • @PoliticalAgitator · -5 points · 9 months ago

        They’re not going to. There is an insane amount of entitlement around people’s jerk off material. Right here on Lemmy, I’ve seen someone (who denied being a child) call pornography a “human right” and groups of people insisting they should be able to openly trade images of child rape as long as they’re AI generated.

        • @Cryophilia · 13 points · 8 months ago

          Fuck you people who equate pornography with child porn. You know what you’re doing, you sick bastards.

          Pornography is not at all the same thing as child porn. Do not speak about them in the same way.

          • @PoliticalAgitator · -3 points · 8 months ago

            I didn’t, but don’t let that stop you throwing a tantrum and proving my point.

        • @Fedizen · 4 points · 8 months ago

          That sounds like an insane amount of entitlement from the one guy you found. Hopefully that entitles you to ignore everyone with even a fraction more nuance.

          • @PoliticalAgitator · 0 points · 8 months ago

            How dare I ignore the many subtle layers of nuance in “Using AI to create pornographic images of a woman and then sending them to her so she knows you’ve done it”.

        • Flying Squid · 2 points · 8 months ago

          and groups of people insisting they should be able to openly trade images of child rape as long as they’re AI generated.

          “Be able to” in what sense? Morally and ethically? No, absolutely not obviously. But what would the legal reason be to make it illegal since no actual children were involved? If I paint an explicit painting of a child being raped, is that illegal? I don’t think it would be. It would certainly give people good reason to be suspicious of me, but would it be illegal? And would an AI-generated image really be different?

          • @PoliticalAgitator · -1 point · 8 months ago

            But what would the legal reason be to make it illegal since no actual children were involved

            Prove it. Trawl through thousands and thousands of images and videos of child sexual assault and tell me which ones were AI generated and which were not. Prove the AI hadn’t been set up to produce CSAM matching a real child’s likeness. Prove it won’t normalize and promote the sexual assault of real children. Prove it wasn’t trained on images and videos of real children being raped.

            Legalising AI-generated child pornography is functionally identical to legalising all child pornography.

            • Flying Squid · 3 points · 8 months ago

              Legalizing or already legal? Because that’s my question. I don’t think it would be illegal, at least not in the U.S. I can’t speak for other countries, but here, proving a negative in court isn’t a thing.

    • @inspxtr · 3 points · 9 months ago

      I think porn generation (image, audio, and video) will eventually be very realistic and very easy to make with only a few clicks and some well-crafted prompts. Things would just be on a whole other level than what Photoshop used to be.

  • @[email protected]
    link
    fedilink
    539 months ago

    I said it before: banning this doesn’t work. Legislation will always play catch-up to ever-improving technology. The best we can do is flood the internet with AI porn. Of everyone, as much as possible. To the point nobody even cares anymore, because there is nudity of everyone. Normalize it before it ruins lives.

    • @Fedizen · 27 points · 8 months ago

      or lean into it; ban clothing

      • @AeonFelis · 6 points · 8 months ago

        Big fashion will never let it pass.

      • Flying Squid · 5 points · 8 months ago

        Global warming hasn’t made that an option around here yet. Give it 10 years.

      • GladiusB · 2 points · 8 months ago

        You haven’t seen some of us without clothes

        • @Fedizen · 1 point · 8 months ago

          not according to this AI generated image I just made of “everybody naked”

    • @Cryophilia · 16 points · 8 months ago

      The other option, and the more likely one, is an extremely broad law that is intended to account for future technologies but will actually be used to further erode civil liberties.

    • myxi · 3 points · 8 months ago (edited)

      Whatever I search on Pinterest, Google, or Bing nowadays, the images are mostly AI generated. I’m so used to them by now that I just don’t care anymore; if something looks cool, I praise it. Recently, hyper-realistic AI-generated videos have been popping up, and once there are enough datasets of free porn videos, which is most definitely coming in a few years, the porn industry is going to be filled with AI-generated porn videos as well.

      I think AI generated porn videos are going to be very realistic because there’s so much free porn.

  • @nexusband · 36 points · 9 months ago

    This is going to be a serious issue in the future: either society changes and these things become accepted, or these kinds of generative AI models have to be banned. But even that’s not going to be a real “security” against it…

    I also think we have to come up with digital watermarks that are easy to use…
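
    A minimal sketch of the naive version of that idea, in Python with Pillow and an HMAC (the key, the metadata field name, and the function names here are invented for illustration): embed a signed provenance tag in the image’s metadata and check it later. It also shows why this alone is no safeguard: the tag lives in metadata, which can be stripped without touching the pixels. Robust schemes watermark the pixels themselves or attach signed manifests (e.g. C2PA).

    ```python
    import hashlib
    import hmac

    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    # Hypothetical signing key; a real scheme would use per-publisher keys.
    SECRET_KEY = b"example-key-not-for-real-use"

    def tag_image(src_path: str, dst_path: str) -> None:
        """Save a copy of the image with an HMAC over its pixel bytes in a PNG text chunk."""
        img = Image.open(src_path)
        digest = hmac.new(SECRET_KEY, img.tobytes(), hashlib.sha256).hexdigest()
        meta = PngInfo()
        meta.add_text("provenance-hmac", digest)  # hypothetical field name
        # PNG is lossless, so the pixel bytes survive the round-trip intact.
        img.save(dst_path, "PNG", pnginfo=meta)

    def verify_image(path: str) -> bool:
        """Recompute the HMAC and compare it with the stored tag; a missing tag fails."""
        img = Image.open(path)
        claimed = getattr(img, "text", {}).get("provenance-hmac")
        if claimed is None:
            return False  # tag stripped or never present: exactly the weakness
        expected = hmac.new(SECRET_KEY, img.tobytes(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(claimed, expected)
    ```

    Anyone can delete that text chunk, so a failed check proves nothing about whether an image is fake; a watermark that is both easy to apply and hard to remove is the genuinely hard part.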

    • Justin · 25 points · 9 months ago

      Honestly, I see it as kinda freeing. Now people don’t have to worry about nudes leaking any more, since you can just say they’re fake. Somebody starts sending around deepfakes of me? OK, whatever, weirdo, it’s not real.

      • Dadd Volante · 29 points · 9 months ago

        I’m guessing it’s easier to feel that way if your name is Justin.

        If it was Justine, you might have issues.

        Weird how that works.

        • Justin · 17 points · 9 months ago

          Fair enough. Ideally it would be the same for women too, but we’re not there as a society yet.

          • @[email protected]
            link
            fedilink
            -25
            edit-2
            9 months ago

            Such an empty response. Do you know that women have to do things on dates out of fear of being killed? Literally, they have a rational fear of being killed by their male dates, and it’s a commonly known and accepted fear that many women relate to.

            Society moving forward is a nice idea; women feeling safe is a much better one, and attitudes like yours are part of the reason women generally do not feel safe. Deepfakes are not freeing at all.

            • @fustigation769curtain · 7 points · 9 months ago

              Literally, they have a rational fear of being killed by their male dates, and it’s a commonly known and accepted fear that many women relate to.

              No joke, stop dating shitty men.

              I get that’s too difficult for a lot of them, though.

              • @gothic_lemons · 1 point · 8 months ago

                How do you identify shitty men? They don’t wear labels. In fact they do their best to hide their shitty behavior at first.

                • @fustigation769curtain · 4 points · 8 months ago

                  Learn from your experiences, the experiences of others, and try to be a better judge of character. Don’t hang around bad crowds.

                  Unfortunately, most people can’t rise above peer pressure and think a group is always correct even if it’s comprised of shitty people.

                  What you don’t do is keep doing the same thing and expecting different results.

            • @zzx · 7 points · 9 months ago

              Bruh relax

            • @Cryophilia · 5 points · 8 months ago

              YOU ARE NOT ANGRY ENOUGH

              is this the new “conform”? “Be angry”?

      • Aniki 🌱🌿 · 11 points · 9 months ago

        Poisoning the well is how we free ourselves from the vastness of the digital landscape that encompasses us. Make all data worthless.

    • @fidodo · 20 points · 9 months ago

      I think there’s a big difference between creating them and spreading them, and punishing the spreading of nudes against someone’s will, real or fake, is a better third option. The free-speech implications of banning software capable of creating them are too broad and fuzzy, but putting harsh penalties on spreading them, on the grounds of harassment, would be clear-cut and effective. I don’t see a big difference between spreading deepfakes and spreading revenge porn, and we already have laws against spreading revenge porn.

    • @Lemming6969 · 3 points · 9 months ago

      With AI and digital art… what is real? What is a person? What is a cartoon, or a similar but not identical likeness? In some cases, what even is nudity? How old is an AI image? How can anything then be legal or illegal?

      • @Cryophilia · 1 point · 8 months ago

        IF CHEWBACCA LIVES ON ENDOR, YOU MUST ACQUIT

    • @[email protected]
      link
      fedilink
      -4
      edit-2
      9 months ago

      We gotta ban photo editing software too. Shit, we gotta ban computers entirely. Shit, now we have to ban electricity.

      • @[email protected]
        link
        fedilink
        -6
        edit-2
        9 months ago

        I’m so tired of this “Don’t blame the tool” bs argument used to divert responsibility.

        Blame the fucking tool and restrict it.

        • @fidodo · 20 points · 9 months ago

          Why not blame the spread? You can’t ban the tool: it’s easily accessible software that only requires easily accessible consumer hardware, and you can even semi-easily train your own models using easily accessible porn on the internet. So to ban it outright, you’d need to ban the general-purpose tool, all porn, and the knowledge to train image-generation models. If you mean banning the online apps that sell the service in the cloud, I can get behind that; it would raise the bar to creating them a little, but it is far from a solution.

          But we already have laws against revenge porn and internet harassment. I think the better and more feasible approach, without far-reaching free-speech implications, would be to simply put heavy penalties on spreading nude images of people against their will, whether those images are real or fake. It’s harassment, like revenge porn, and I don’t see how a realistic fake is different. If there is major punishment for spreading these images, I think that will discourage the vast majority of people from spreading them.

        • littleblue✨ · 4 points · 9 months ago

          Blame the fucking tool and restrict it.

          I mean. It’s worked so well with you so far, why not?

          • @nexusband · 1 point · 9 months ago

            Social media as a business model? Yes, absolutely.

              • @fidodo · 4 points · 9 months ago

                The companies that host and sell an online image-to-nude service, using a tuned version of that tool specifically designed to convert images into nudes, are definitely a business model.

                I agree it’s impractical, and opens dangerous free-speech problems, to try to ban or regulate the general-purpose software. But I don’t have a problem with regulating for-profit online image-generation services that have been advertising the ability to turn images into nudes, even on non-porn sites. Regulating those will at least raise the bar a bit and ensure there isn’t a for-profit motive where capitalism encourages it happening even more.

                We already have revenge porn laws that outlaw the spread of real nudes against someone’s will; I don’t see why the spread of fakes shouldn’t be outlawed similarly.

                • @[email protected]
                  link
                  fedilink
                  19 months ago

                  And I think if those companies can be identified as making the offending image, they should be held liable. IMO, you shouldn’t be able to use a photo without the permission of the person in it.

    • @j4k3 · -9 points · 9 months ago (edited)

      deleted by creator

      • @LethalSmack · 15 points · 9 months ago (edited)

        Where did it say anything about a Ministry of Truth deciding what can be posted online? Making it illegal and having a 3rd party decide if every post is allowed are two very different things

        If it’s illegal then there are ramifications for the platform, the user posting it, and the tool that created it.

        Content moderation is already a thing so it’s nothing new. Just one more thing on the list to check for when a post is reported

        • @Cryophilia · 0 points · 8 months ago

          Making it illegal and having a 3rd party decide if every post is allowed are two very different things

          Depends on the scale. If you’re a black man in the South in 1953, having a 3rd party decide whether you can do something means you can’t do that thing.

          I’m not speaking to this particular topic, just saying in general 3rd parties can be corrupted. It’s not a foolproof solution or even always a good idea.

          • @LethalSmack · 2 points · 8 months ago (edited)

            I agree. It’s a terrible idea for many reasons. The fact that we can’t trust something like that to run in good faith is among the top of those reasons.

            The comment I was responding to was saying this proposed law would strip our ability to speak our mind because it would create a new 3rd party group that would validate each post before allowing them online.

            I was pointing out that making specific content illegal is not the same as having every post scrutinized before it goes live.

        • @j4k3 · -2 points · 9 months ago (edited)

          deleted by creator

          • @LethalSmack · 5 points · 9 months ago

            Well, you’re about 20 years too late. It has already started

            See any of the Tor sites for examples of what is currently filtered out of the regular internet. It even gets your Google account permanently banned if you log in via the Tor Browser.

      • @nexusband · -2 points · 9 months ago

        Yeah, sorry - I disagree on every level with your take.

        I am also convinced that at least the LLMs will soon destroy themselves, due to the simple fact that “garbage in, garbage out”.

        • @j4k3 · -5 points · 9 months ago

          deleted by creator

    • @fustigation769curtain · -15 points · 9 months ago

      It’s not a serious issue at all.

      Of course, if you’re the kind of greedy/lazy person who wants to make money off of pictures of their body, you’re going to have to find a real job.

  • @sir_pronoun · -9 points · 9 months ago

    I seriously don’t get why society cares if there are photos of anyone’s private parts.

    • @[email protected]
      link
      fedilink
      369 months ago

      I think we as a society are too uptight about nudity, but that doesn’t mean that creating pictures of people without their consent, which make them feel uncomfortable, is in any way OK.

      • @[email protected]
        link
        fedilink
        -99 months ago

        What about photos of politicians in compromising situations? Should we have those without their consent?

      • @damnthefilibuster · 26 points · 9 months ago

        They are humiliated only because society has fed them the idea that what they’ve done (in this case not done but happened to them) is wrong. Internalizing shame meted out by society is the real psychological problem we need to fix.

        • @[email protected]
          link
          fedilink
          119 months ago

          Society does indeed play a big role, but if someone went around telling lies about you that everyone believed regardless of how much you denied it, that would take a toll on you.

        • @eatthecake · 7 points · 9 months ago

          Who are you to tell people how they ought to feel? The desire for privacy is perfectly normal, and you are the one trying to shame people for not wanting naked pictures of themselves everywhere.

        • @sir_pronoun · 7 points · 9 months ago

          That’s what I meant. Why should it be shameful? If it weren’t, those photos would lose so much of their harm.

    • natecheese · 16 points · 9 months ago

      I think the issue is that sexual imagery of the person is being created and shared without that person’s consent.

      It’s akin to taking nude photos of someone without their consent, or sharing nude photos with someone other than their intended audience.

      Even if there were no stigma attached to nudes, that doesn’t mean someone would want their nudes to exist or be shared.

    • @[email protected]
      link
      fedilink
      109 months ago

      I’ll continue this conversation in good faith only after you’ve shown us yours to prove your position.

    • @[email protected]
      link
      fedilink
      English
      -29 months ago

      Modern surveillance capitalism has made sharing of private data normalised. These days we are very used to sharing pretty much everything about ourselves, in addition to having no control over how that information is used. That is a bad thing.

      I suspect that these tools will, similarly, make nudity a bit more normalised in many societies across the world. That is probably a good thing overall.

      • @eatthecake · 3 points · 9 months ago

        What you mean to say is that non consensual nude pictures of women will be normalised and you’re ok with that. Sexual assault and domestic violence are also pretty common, you want to normalise those too?

        • @Cryophilia · 1 point · 8 months ago

          In my opinion, an eventual loss of prudishness is just a silver lining on the cloud of surveillance capitalism.

          • @eatthecake · 1 point · 8 months ago

            This is exactly the same as slut shaming just in the opposite direction. Prude is a word designed to shame women who refuse to be sex objects.

            • @Cryophilia · 1 point · 8 months ago

              Hilarious when you reactionary types try to pretend that giving women greater agency is somehow repressing women.

              • @eatthecake · 0 points · 8 months ago

                Forcing women to accept fake porn of themselves does not give women agency. It does the opposite.

                • @Cryophilia · -1 point · 8 months ago

                  Keep your screeching to church, please. We don’t need to hear it.

  • @werefreeatlast · -20 points · 9 months ago

    Someone at TikTok has the power to make nudes of everyone on the planet, except for five homeless guys from LA that you don’t want a nude of anyway. TikTok has the images of you (you idiot) and the hardware and software required to fake you to everyone you know.

    Welcome to China 2.0!

    • @[email protected]
      link
      fedilink
      99 months ago

      No, they don’t. Neither does Instagram; to my knowledge, they have two, posted by other people. Now, Grindr, on the other hand…

      • @werefreeatlast · 5 points · 9 months ago

        Oh I would expect Grindr to call this a feature…

        You liked Jeff, but Jer, for an extra $7.53 you can automatically see him naked to reveal the full package!

    • @graymess · 1 point · 8 months ago

      Pal, what the fuck are you talking about? TikTok and China are not mentioned anywhere in this article and nowhere on TikTok is there an option to generate anyone’s likeness, clothed or unclothed.