After Nine blamed an ‘automation’ error in Photoshop for producing an edited image of Georgie Purcell, I set out to find out what the software would do to other politicians.

  • @[email protected]

    The use of photo manipulation tools to create non-consensual revealing/nude/porn images is incredibly fucked up. I remember seeing multiple stories about lawsuits from teenage girls having these fake images made of them that circulated in their schools. It’s a violation, and it’s categorically wrong.

    It sounds strange to say it, but pornography has always been the tip of the spear for technology. It backed VHS and helped kill Betamax. It was a very early adopter of the internet. OnlyFans.

    Non-consensual AI porn is the tip of the spear for what AI can do more broadly. Think about how much disinformation and bullshit it's going to introduce into the public square, and how it has absolutely zero ethics. You are going to see statements, interviews, etc. that aren't real. Pure fabrications, amplified by bot networks and useful idiots.

    This AI rollout has been like cars before seatbelts and lines on the road. New technology and pure chaos. Good luck looking to geriatric politicians for a cure. They already took money to look the other way. That's their real job.

    • CybranM

      It really is Pandora’s box, and we can’t do much to stop it. People will need to learn to live with fakes and misinformation, but we’ve already seen how poorly that’s turned out, and it’ll only get worse from here.

      • HopeOfTheGunblade

        Used to living with the fallout, more like. I don’t expect people will get better at media literacy.

    • @jacksilver

      This isn’t about nonconsensual images, it’s about bias in AI models. They used the extend-image feature on both photos, and because the model associates women with “sexy” it rendered her in bikini bottoms, while because it associates men with “business” it put them in suits.

      This is going to be an ongoing issue with how generative AI makes assumptions based on the prompt or input image: https://www.bloomberg.com/graphics/2023-generative-ai-bias/