I’m so glad I’m not growing up in this age of smartphones, social media, and bullshit generators. Life was hell enough in the 90s without all that noise.

  • @brucethemoose · 31 · 17 days ago

    TBH kids need a new culture/attitude towards digital media, where they basically assume anything they consume on their phones is likely bogus. Like a whole new layer of critical thinking.

    I think it’s already shifting this direction with the popularity of private chats at the expense of “public” social media.

    • Pennomi · 21 · 17 days ago

      Yep, at this point your real nudes could leak and you can just plausibly claim they’re faked.

      Something will change, but I’m not sure where society will decide to land on this topic.

          • @[email protected]
            link
            fedilink
            116 days ago

            Birthmarks don’t seem like the kind of thing AI would generate (unless asked), though…

            (And, as model collapse sets in and generated images become more and more generic and average, things like birthmarks will become more and more unlikely…)

            • @[email protected]
              link
              fedilink
              116 days ago

              AI is quite unpredictable… it’s sort of only useful because of how random it is. But my point is that the knowledge is either public or private - there’s no situation where you can’t either deny it or attribute it to public knowledge.

    • ObjectivityIncarnate · 15 · 17 days ago

      That’s definitely going to happen organically, especially since this is a genie that is definitely never going back in the bottle. It’s only going to become more convincing, more accessible, and more widespread, even for simply ‘self-contained’ use, especially by hormone-flooded teenagers.

      • @brucethemoose · 1 · 14 days ago

        Even if all AI development is outlawed, this minute, everywhere, the genie is already out of the bottle. Flux 1.dev is basically photorealistic for many situations, and there will always be someone hosting it in some sketchy jurisdiction.

    • @[email protected]
      link
      fedilink
      English
      117 days ago

      I kinda doubt anyone is getting “fooled” by these at this point, though that is a whole other layer of horrible hell in store for us…

      Right now, we’re dealing with the most basic questions:

      • Is it immoral (and/or should it be illegal) for people to be trading pornographic approximations of you?
      • Is it immoral (and/or should it be illegal) for people to privately make pornographic approximations of you?
      • Is it immoral (and/or should it be illegal) to distribute software which allows people to make pornographic approximations of others?
      • @[email protected]
        link
        fedilink
        117 days ago
        1. Illegal
        2. Immoral
        3. If built specifically for that? Illegal. If it’s a tool with proper safeguards that people still find a way around to make them anyway? No.