OpenAI now tries to hide that ChatGPT was trained on copyrighted books, including J.K. Rowling’s Harry Potter series: A new research paper laid out ways in which AI developers should try to avoid revealing that LLMs have been trained on copyrighted material.

  • @assassin_aragorn
    1 year ago

    I think they probably have criteria for what’s used to train it, but they don’t keep a list of what material was used. I believe they’ve said in the past they don’t have that information.

    On another note about AI – these models fall apart after a few generations when they’re trained on AI-generated content. If developers have no way of discerning whether content is AI-generated, they’re sitting on a ticking time bomb: at some point the models will heavily degrade in quality because of it. The question, I guess, is what % of training material can be AI-generated before it causes problems.

    This does mean, however, that AI-generated material can never become a substantial % of all the content out there. Whenever there’s too much, the models will fall apart, and probably won’t recover until that content falls below a certain % again.
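The degradation described above (often called “model collapse”) can be illustrated with a toy simulation. This is not how LLMs are actually trained; it is a minimal sketch, assuming a stand-in “model” that just fits a normal distribution to its training data and a sampler that, like truncated LLM decoding, never emits rare tail values. Each generation trains only on the previous generation’s output, and the diversity of the data steadily shrinks:

```python
import random
import statistics

def fit(data):
    # "Train" a toy model: fit a normal distribution to the data.
    return statistics.mean(data), statistics.stdev(data)

def generate(model, n):
    # Sample from the model, but discard rare tail outputs, loosely
    # mimicking truncated sampling strategies. (Hypothetical mechanism
    # for illustration, not any specific model's decoder.)
    mu, sigma = model
    out = []
    while len(out) < n:
        x = random.gauss(mu, sigma)
        if abs(x - mu) < 2 * sigma:
            out.append(x)
    return out

random.seed(0)
human_data = [random.gauss(0.0, 1.0) for _ in range(2000)]  # "real" content

model = fit(human_data)
spreads = [model[1]]
for generation in range(20):
    # Each new model trains only on the previous model's output.
    synthetic = generate(model, 2000)
    model = fit(synthetic)
    spreads.append(model[1])

print(f"gen 0 spread: {spreads[0]:.3f}, gen 20 spread: {spreads[-1]:.3f}")
```

Because each generation loses a little of the distribution’s tails, the spread shrinks multiplicatively, so even a small per-generation loss compounds into severe degradation after a few iterations.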

    • @kava
      1 year ago

      but they don’t keep a list of what material was used. I believe they’ve said in the past they don’t have that information.

      I will look into this. I feel like that’s quite an oversight. Perhaps it’s easier to just tell the public otherwise because of legal questions like the ones we’re discussing. I would have kept everything in storage so they could re-train updated models or what have you with the same data.

      I think it’s an interesting point you bring up. There will be a sort of distinction in the corpus of human works: pre-~2023 and post-~2023. All work before that time will be more or less legitimate, and you can use it as training data. Afterwards, it will all be tainted.

      Honestly, the implications go further than that. For one, I don’t trust that there is a human behind any comment I see online anymore, especially in topics and areas that I feel are likely to be astroturfed, like politics.

      • @assassin_aragorn
        1 year ago

        Perhaps it’s easier to just tell the public otherwise because of the legal questions like we are discussing.

        Very possible. I think they don’t want to keep things in storage, because then they would indisputably need to pay for it.

        Agreed on the human element too. The Reddit protests were eye-opening for me because of the supposed “pro-Reddit/anti-mod” crowd that showed up as a vocal minority. They popped up out of nowhere, and in some cases they were verified to be AI bots.