• @[email protected]
    18 months ago

    What you’re giving as examples are legitimate uses for the data.

    If I write and sell a new book that’s just Harry Potter with names and terms switched around, I’ll definitely get in trouble.

    The problem is that the data CAN be used for stuff that violates copyright. And because of the nature of AI, it’s not even always clear to the user.

    AI can basically spit out a Harry Potter clone without you even knowing, because it’s trained on that data, and that’s a huge problem.

    • @A_Very_Big_Fan
      8 months ago

      Out of curiosity, I asked it to write a Harry Potter “part 8” fan fiction, and surprisingly it did. But I really don’t think that’s problematic. There’s already an insane amount of fan fiction out there without the names swapped that I can read, and that’s all fair use.

      I mean hell, there are people who actually get paid to draw fictional characters in sexual situations that I’m willing to bet very few of the original creators would want to exist lol. But as long as they don’t overstep the bounds of fair use, like trying to pass the work off as official or submitting it for publication, there’s no copyright violation.

      The important part is that it won’t just give me the actual book (though, funnily enough, it tried lol). If I meet a guy with a photographic memory and he reads my book, that’s not him stealing it or violating my copyright. But if he reproduces and distributes it, then we’d call it stealing or a copyright violation.

    • @A_Very_Big_Fan
      8 months ago

      I just realized I misread what you said, so my reply wasn’t entirely relevant, but I think the point still stands, so I guess I won’t delete it.

      But I asked both GPT-3.5 and GPT-4 to give me Harry Potter with the names and words changed, and they wouldn’t do that either. I can’t speak for all models, but I can at least say that the two owned by the people this thread is about won’t do it.