• @A_Very_Big_Fan
    -3
    8 months ago

    Right, so I suppose George Lucas was stealing from all the movies that inspired his work when he made Star Wars. Or when Mel Brooks made Spaceballs, as a more blatant example.

    • gregorum
      4
      8 months ago

      Mel Brooks’s works are protected as parody under the fair use provisions of U.S. copyright law. Lucas never copied anything directly, but, if pressed, much of his work is “heavily inspired” by works in the public domain and/or could be argued to be transformative enough to qualify as fair use, although any claim of copyright infringement would be pretty difficult to make in the first place.

      • @A_Very_Big_Fan
        -2
        8 months ago

        And the same can be said about generative AI

        If it’s not redistributed copyrighted material, it’s not theft

        • gregorum
          3
          8 months ago

          And the same can be said about generative AI

          Not in any legally reasonable way, and certainly not by anyone who understands how AI (or, really, LLMs) works or what art is.

          If it’s not redistributed copyrighted material, it’s not theft

          But that’s exactly what OpenAI did: they took distributed, copyrighted works, used them as training data, and spat out results, some of which even contained word-for-word repetitions of the authors’ source material.

          AI, unlike a human, cannot create unique works of art. It can only produce an algorithmically derived mélange of its source data, recomposited in novel forms, but nothing resembling the truly unique creative process of a living human. Sadly, too many people simply lack the ability to comprehend the difference.

          • @A_Very_Big_Fan
            -3
            8 months ago

            it can only produce an algorithmically derived mélange of its source data, recomposited in novel forms

            Right, it produces derivative data. Not copyrighted material.

            By itself, without any safeguards, it absolutely could output copyrighted data (albeit probably not perfectly, but for copyright purposes that’s irrelevant as long as it serves as a substitute). And any algorithms that actually do that should be punished, but OpenAI’s models can’t do that.

            Hammers aren’t bad because they can be used for bludgeoning, and if we had a hammer that somehow detected it was being used for murder and evaporated, calling that hammer bad would be even more ridiculous.

            • gregorum
              2
              8 months ago

              Some safeguards have been added which curtail certain direct misbehavior, but it is still capable - by your own admission - of doing it. And it still profits from the unlicensed use of copyrighted works by using such material for its training data, because what it is producing is not a new and unique creative work; it is a composite of copyrighted work. That is not the same thing.

              And if you are comparing LLMs and hammers, you’re just proving how you fundamentally misunderstand what LLMs are and how they work. It’s a false equivalence.

              • @A_Very_Big_Fan
                -4
                8 months ago

                but it is still capable - by your own admission - of doing it

                And if you are comparing LLMs and hammers, you’re just proving how you fundamentally misunderstand what LLMs are and how they work

                And a regular hammer is capable of being used for murder, which makes calling a hammer that evaporates before it can be used for murder “unethical” ridiculous. You’re deliberately missing the point.

                And it still profits from the unlicensed use of copyrighted works by using such material for its training data

                I just don’t buy this reasoning. If I look at paintings of the Eiffel Tower and then sell my own painting of the building, I’m not violating the copyright of any of the original painters unless what I paint is so similar to one of theirs that it falls outside fair use.

                it is a composite of copyrighted work

                It’s Stable Diffusion, not a composite. But even if the outputs were composites, I’m allowed to shred a magazine and make a composite image of something else. It’s fair use until I use those pieces to reproduce someone else’s copyrighted image.

                • gregorum
                  2
                  8 months ago

                  Lol… I hope you didn’t sprain something with all those mental gymnastics. In the meantime, perhaps you should educate yourself a bit more on AI, LLMs, and maybe just a little bit on art.

                  • @A_Very_Big_Fan
                    -2
                    8 months ago

                    Coming from someone who claimed Stable Diffusion’s output was a composite image