Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

  • @afraid_of_zombies
    42 · 10 months ago

    Maybe we do live in the best possible world. Wow, wouldn’t it be great to get rid of this industry, so you could consume porn knowing there is zero percent chance it was made without someone’s consent?

    • hh93
      12 · 10 months ago

      Isn’t the main problem with those models how you can create porn of everyone without their consent with those tools, too?

      • stevedidWHAT
        15 · 10 months ago

        Sex trafficking vs virtual photoshop of your face…

        Nothing new, and it’s a huge improvement over the current status quo. Not everything needs to be a perfect solution

        • @N1cknamed
          -2 · edited · 7 months ago

          deleted by creator

      • @[email protected]
        5 · 10 months ago

        Yeah so what. It’s not as if somebody is “sold on the market” because there’s a nude picture of them. Photoshop is not a real threat to society. We gotta stop making moral imaginations more important than physical things.

      • @diffuselight
        13 · 10 months ago

        I just trained an LLM on the comment you put on the public internet. Do you feel violated enough to equate it to physical violation?

        • @[email protected]
          1 · 10 months ago

          Why would I? Folks who have had real nudes of them posted on the Internet haven’t felt “physical violation” but they’ve certainly been violated.

          If you had photos of me and trained a porn generating LLM on my photos and shared porn of me, in an identifiable way, I would consider that violation.

          But simply taking my words in that simple sentence isn’t identifiable, unique, or revealing. So no.

          Further, the original point was about the ethics of AI porn. You can’t get something from nothing.

        • Urist
          0 · 10 months ago

          How would you respond to photo realistic porn that looks like your mother, daughter, [insert person you care about here] especially if they found it distressing?

          How would you feel if it was posted on facebook? How would you feel if they had to deal with it at work? From coworkers? From clients?

          We are entering uncharted waters. You know why this is different from training a model on text, and your reply to @[email protected] is hostile and doesn’t acknowledge why people would be upset about AI porn featuring their likeness.

          • @diffuselight
            11 · edited · 10 months ago

            You are answering a question with a different question. LLMs don’t make pictures of your mom. And this particular question? One that has existed roughly as long as Photoshop has.

            It just gets easier every year, but it was already easy. For years now you could pay someone 15 bucks on Fiverr to do all of that.

            Nothing really new here.

            The technology is also easy. Matrix math. About as easy to ban as mp3 downloads, which never stopped anyone. It’s progress. You are a medieval knight asking to put the gunpowder back in the box, but it clearly cannot be put back - it is already illegal to make nonconsensual imagery, just as it is illegal to copy books. And yet printers and photocopiers exist.

            Let me be very clear - accepting the reality that the technology is out there, that it is basic, easy to replicate, and on a million computers now, is not disrespectful to victims of nonconsensual imagery.

            You may not want to hear it, but just like with encryption, the only other choice society has is full surveillance of every computer to prevent people from doing “bad things”. Everything you complain about is already illegal and has already been possible - it just gets cheaper every year. What you want is protection from technological progress, because society sucks at dealing with its consequences.

            To be perfectly blunt, you don’t even need to train a generative AI model for powerful deepfakes. You can use technology like Roop and ControlNet to synthesize any face onto any image from a single photograph. No training necessary.

            When you look at it that way, what point is there to try to legislate training with these arguments? None.

            • Urist
              -1 · 10 months ago

              I’m not making an argument to ban it. I’m just pointing out that you’re pretending a model trained on text someone wrote is comparable to a model that makes nonconsensual porn.

              I don’t think it can be banned; it’s just something that will need to be incorporated into revenge porn laws, if it isn’t already covered.

              I’m just pointing out your comment sucked.

              • @diffuselight
                7 · edited · 10 months ago

                It’s already covered under those laws. So what are you doing here that’s different from ChatGPT hallucinating?

                Those laws don’t spell out the tools (photoshop); they hinge on reproducing likeness.

                • Urist
                  -5 · 10 months ago

                  Oh good, someone who has read every revenge porn law, ever. I’m glad they work exactly the same, in every nation and state.

                  Anyway, I must be hallucinating, true, because you keep attacking what I’m saying instead of defending the comment you made earlier - the one I took issue with for being needlessly hostile.

          • stevedidWHAT
            7 · edited · 10 months ago

            I can do this right now with Photoshop, dude; what are you talking about? This just points to the need for more revenge porn laws.

            We don’t have to sit in the fire when we can crawl out. Are we still on fire? Yeah. Can we do something about that? Yeah!

            It seems like so many people these days want perfect solutions, but the reality is that sometimes we have to settle for incremental ones that shrink the problem as much as we can.

            • @polymer
              5 · edited · 10 months ago

              And incidentally, this need for revenge porn laws is itself a symptom with a separate cause. Technology has always moved forward with no relation to social advancement, and there is no realistic “genie forced back into the bottle” scenario here either.

              That being said, easier access to more powerful technology, with lackluster recognition of personal responsibility, doesn’t exactly bring happy prospects. lol…

              • stevedidWHAT
                1 · 10 months ago

                Agreed, personal responsibility went out the window a long time ago. Apathy reigns supreme.

          • @Donjuanme
            4 · 10 months ago

            Revenge porn/blackmail/exploitation will hopefully become much less obscene, not to the “let’s not prosecute this” levels, but maybe people can stop living in fear of their lives being ruined by malicious actors (unless that’s your kink, you do you).

            It will take/drive/demand a massive cultural shift, but if you asked me which world I would rather live in, and the options are one where people are abused and exploited, or one where people can visualize their perversions more easily (but content creators have a harder time making a living) I’ll take the second. Though I may have straw-manned a bit, it’s not something I’ve thought of outside of this forum thread.

          • @afraid_of_zombies
            2 · 10 months ago

            I wouldn’t be happy about it but me not being happy about something doesn’t mean I just get an override.

            I think the boat has sailed a bit on this one. You can’t really copyright your own image, and even if you were some famous person willing to fight the legal battles, you’d still be up against the fact that no one is making money off of it. You might be able to get a news source to take down that picture of you, but it is another thing entirely to make it so the camera company can’t even record you.

            But hey, I spent years saying we need to change the laws to forbid photography of people and property without consent, and everyone yelled at me that they have the right to use a telephoto lens to view whomever they wanted from blocks away.

            The creeps have inherited the earth.

        • @[email protected]
          -1 · 10 months ago

          Are you actually asking?

          The gist is that LLMs find similar “chunks” of content from their training set and assemble a response based on that similarity score (relative to your prompt).

          They know nothing they haven’t seen before; the novelty is that they create new things from parts of their training data.

          Obviously they are very impressive tools, but the concern is that you can easily take a model designed for porn, feed it pictures of someone you want to shame, and have it generate lifelike porn of a non-porn actor.

          That, and the line around “ethical” AI porn is blurry.
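          The “similarity score” idea above can be sketched very loosely in Python. This is a toy illustration under the commenter’s framing, not how real LLMs actually work - they use learned embeddings and attention rather than literal chunk lookup - and all names below are made up:

```python
from collections import Counter
from math import sqrt

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two strings."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)  # overlap of shared words
    norm = sqrt(sum(c * c for c in va.values())) * sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

def most_similar_chunk(prompt: str, chunks: list[str]) -> str:
    """Return the training chunk scoring highest against the prompt."""
    return max(chunks, key=lambda c: cosine_similarity(prompt, c))

chunks = ["cats sit on mats", "dogs chase balls", "the stock market fell"]
print(most_similar_chunk("where do cats sit", chunks))  # -> "cats sit on mats"
```

          Real systems swap the word-count vectors for high-dimensional learned embeddings, which is what lets them generalize beyond exact word overlap.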

          • tal
            2 · edited · 10 months ago

            They know nothing they haven’t seen before

            Strictly speaking, you arguably don’t either. Your knowledge of the world is based on your past experiences.

            You do have more-sophisticated models than current generative AIs do, though, to construct things out of aspects of the world that you have experienced before.

            The current crop are effectively more-sophisticated than simply pasting together content – try making an image and then adding “blue hair” or something, and you can get the same hair, recolored. Their ability to replicate artistic styles is based on commonalities in seen works, but you don’t wind up seeing chunks of material done by just that artist. But you’re right that they are considerably more limited than a human.

            Like, you have a concept of relative characteristics, and the current generative AIs do not. You can tell a human artist “make those breasts bigger”, and they can extrapolate from a model built on things they’ve seen before. The current crop of generative AIs cannot. But I expect that the first bigger-breast generative AI is going to attract users, based on a lot of what generative AIs are being used for now.

            There is also, as I understand it, some understanding of depth in images in some existing systems, but the current generative AIs don’t have a full 3d model of what they are rendering.

            But they’ll get more-sophisticated.

            I would imagine that there will be a combination of techniques. LLMs may be used, but I doubt that they will be pure LLMs.

        • @[email protected]
          0 · edited · 10 months ago

          Ok, you know it’s trained on existing imagery right?

          Sure the net new photos aren’t net new abuses, but whatever abuses went into the training set are literally represented in the product.

          To be clear, I’m not fully porn-shaming here, but I wanted to clarify that these tools are informed by something already existing and can’t be fully separated from their training data.

    • ax1900kr
      -40 · 10 months ago

      hmmm sweetie but what about the only fans prostitutes? Racist much?