• @[email protected]
    link
    fedilink
    English
    13
    1 year ago

    I disagree. I think that there should be zero regulation of the datasets as long as the produced content is noticeably derivative, in the same way that humans can produce derivative works using other tools.

    • Hello Hotel
      link
      English
      1
      edit-2
      1 year ago

      Good in theory. The problem is that if your bot is given too much exposure to a specific piece of media, and the “creativity” value that adds random noise (and for some setups forces it to improvise) is too low, you get whatever impression the content made on the AI, like an imperfect photocopy (a non-expert’s explanation of “memorization”). Too high and you get random noise.
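
      The “creativity” knob described above corresponds roughly to sampling temperature. A minimal sketch with toy next-token scores (not any real model; the logits and the “memorized” token are invented for illustration):

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    """Sample a token index from logits after temperature scaling.

    Low temperature -> nearly deterministic: the bot keeps emitting its
    strongest "impression" of the training data (the photocopy effect).
    High temperature -> the distribution flattens toward uniform noise.
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)                      # subtract max for numeric stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

rng = random.Random(0)
logits = [4.0, 1.0, 0.5]  # toy scores; index 0 is the "memorized" token
cold = [sample_with_temperature(logits, 0.1, rng) for _ in range(100)]
hot = [sample_with_temperature(logits, 100.0, rng) for _ in range(100)]
print(cold.count(0))   # low temperature: almost always token 0
print(len(set(hot)))   # high temperature: samples spread across tokens
```

      At temperature 0.1 the scaled gap between token 0 and the rest is huge, so sampling collapses to the single most-reinforced output; at temperature 100 the probabilities are nearly equal and the output is effectively random, which is the trade-off the comment describes.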

      • @[email protected]
        link
        fedilink
        English
        2
        1 year ago

        if your bot is given too much exposure to a specific piece of media and when the “creativity” value that adds random noise (and for some setups forces it to improvise) is too low, you get whatever impression the content made on the AI, like an imperfect photocopy

        Then it’s a cheap copy, not noticeably derivative, and whoever is hosting the trained bot should probably take it down.

        Too high and you get random noise.

        Then the bot is trash. Legal and non-infringing, but trash.

        There is a happy medium where SD, MJ, and many other text-to-image generators currently exist. You can prompt in such a way (or exploit other vulnerabilities) to create “imperfect photocopies,” but you can also create cheap, infringing works with any number of digital and physical tools.

    • @adrian783
      link
      English
      -1
      1 year ago

      LLMs are not human, the process used to train LLMs is not human-like, and LLMs don’t have human needs or desires, or rights for that matter.

      Comparing them to humans has been a flawed analogy since day 1.

      • King
        link
        English
        3
        1 year ago

        LLM has no desires = no derivative works? Let an LLM handle your comments; they’ll make more sense.