In this video I discuss how generative AI technology has grown far beyond the government's ability to effectively control it, and how current legislative measures could lead to innocent people being jailed.

  • mo_ztt ✅ · 1 year ago

    What the hell is this guy?

    “Here’s a case where people made and shared fake nudes of real underage girls, doing harm to the girls”

    “But what the hell, that’s kind of hard to stop. Oh also here’s this guy who went to prison for it because it’s already illegal.”

    “Really the obvious solution everyone’s missing is: If you’re a girl in the world, just keep images of yourself off the internet”

    “Problem solved. Right?”

    I’m only slightly exaggerating.

    • spez (OP) · 1 year ago

      He is a deepfake of Luke Smith.

    • spez (OP) · 1 year ago

      Also, I think the most governments could do is increase the friction of this process, by giving all AI-generated photos an ID to track later and probably by controlling open-source models, though that's harder to do. Most probably, old senators who don't know Gmail will pass unenforceable laws that won't do jackshit but get them votes.

      • mo_ztt ✅ · 1 year ago

        The point I’m trying to make is, you don’t even have to do that.

        There are already laws against revenge porn and realistic child porn. You don't have to "prevent" this stuff from happening; that is, as he accurately points out, more or less impossible. But if it happens, you can absolutely do an investigation, and if you can find out who did it, you can put them in jail. That sounds like a pretty good solution to me, and I'm still waiting to hear what his issue is with it.

        • spez (OP) · 1 year ago

          I don't have any problem with the points you discussed either. Can't speak for him, though.