With the proliferation of AI-powered deepfakes at galactic speeds, nobody (especially women) will have a shred of privacy left in a few years. It does not matter whether it is a fake photo or not; nobody should be able to see "you" naked unless you allow it. But with the rise of tools that can run on consumer-level hardware, that seems like a losing battle. How can we police what a person can or cannot run on their personal computer? That is another can of worms better left unopened, since the idea of some agency being able to monitor what you do on your PC is another dystopia. Soon you can never be sure whether your neighbor or coworker has deepfaked you, and now every time he looks at you he sees you as a sexual object. That is a highly uncomfortable thought for sure.

Since we cannot possibly stop it, what is the best option moving forward? Normalizing it? Marginalizing it, since it is fake after all? Ignoring it? None of the options seems very good.

This goes way beyond the current framework of "revenge porn". With revenge porn, the case is simple: unlawful distribution without consent. But what about unlawful generation for personal use without consent? I cannot think of legal grounds that could make this a criminal offense, since then we would soon have to ban even drawing lewd doodles with a pencil at home.

  • @[email protected]
    5 points · 1 year ago

    I think these are two different scenarios. If the generated images are kept private it’s not too different from previous times. People could draw pictures, glue a head to a magazine page, photoshop things or just imagine stuff before. Sharing deep fakes is new though, I’d say this should be treated much like revenge porn as the damage is similar.

    • @Adequately_InsaneOP
      -2 points · edited · 1 year ago

      The main problem with, for example, gluing stuff to a magazine was that the result was not that great, so in the end it was not that enticing for the would-be faker. Plus it was obvious it wasn't you, so most people would just laugh it off. But deepfakes are a whole different thing: without proper labeling they can be passed off as the real you, and even if that person never distributes them, it can be unnerving for some people just to think that someone has nude photos of them that look like the real thing, made without their consent, and you never know who has those photos. That is why marginalizing or ignoring it will be a lot harder than with the old-school fakery.

      I for one will laugh it off. Wanna see a fake naked picture of me and rub one off? Be my guest, I couldn't care less. I might even be flattered. But then there is a whole other group that takes it somewhat seriously. That is why, when most social networks started, private profiles were not a thing. But soon a bunch of people started thinking "ewww, I shared a bunch of pictures of me online, and now what if someone rubs one off to my fully clothed pictures, how do I counter that, how can we stop those creeps?" Bam, and we got private profiles. But doing something like that now to stop AI generation seems kind of impossible, since the cat is already out of the bag, and there is a good chance that anyone with an online presence of any kind will have at least one mugshot of themselves available somewhere. And even if not, there are always yearbooks. Or something.

  • BrikoX
    4 points · 1 year ago

    All LLMs do is lower the bar for entry. Fake nudes were a thing long before LLMs; now, instead of needing photo or video editing skills, you can ask an LLM to do it for you.

    And the majority of people don't care about privacy until it affects them negatively on a personal level; they put their whole lives online for people to see and scrape that data. There is no stopping LLMs now. The time for that was 10+ years ago, but everyone dismissed the people who sounded the alarm as privacy nuts or conspiracy theorists…

  • smoothbrain coldtakes
    1 point · 1 year ago

    Deepfakes for porn are not the problem.

    Deepfakes of media, and the propaganda therein, are the real problem.

    Does it suck that a person can have their clothed photos turned into porn? Sure, but it's a much smaller-scale problem than the mass creation of propaganda that is being done with LLMs. In comparison, deepfakes for nudes are practically a non-issue.