• @hogmomma · 23 points · 1 year ago · edited

      Do nuclear reactors use coal?

      • @Dkarma · 3 points · 1 year ago

        Removed by mod

        • @orrk · 2 points · 1 year ago

          sorta does tho

          • @regbin_ · 2 points · 1 year ago

            It’s only stealing if you make it generate the copyrighted art and claim it as yours. Otherwise, it’s not any different than artists being inspired by existing art.

            • @orrk · 2 points · 1 year ago

              Literally no finding or law supports the claim you have made, but several cases have been ruled contrary to your statement. Sure, those didn’t pertain directly to AI, but they did pertain to the argument of an artist “being inspired by existing art.”

          • @Dkarma · -8 points · 1 year ago

            Tell me u know nothing about AI without telling me you know nothing about AI…lol

            • @orrk · 1 point · 1 year ago

              Yes, thanks for pointing out that you know nothing about “AI”, and by “AI” I assume you mean the iterative learning models.

      • @Jtotheb · -9 points · 1 year ago

        Does it matter what they claim they’re going to do differently in the future when they’re burning indefensible amounts of coal right now?

    • @regbin_ · 7 points · 1 year ago

      Let’s not bring that X/Twitter shit to Lemmy.

      • @[email protected] · 1 point · 1 year ago

        I’m genuinely floored this is the comment you were replying to. What does that even mean??!

        • @regbin_ · 3 points · 1 year ago

          Calling the training of generative AI models on artists’ work “stealing artwork.”

          • @[email protected] · -1 point · 1 year ago · edited

            Because it literally is. If you knew the exact terms to get the AI to recreate something in its training data, it could, 1:1. And if you ask it to create you something new, no matter what parameters you use it will look like a mess of garbage data. Generative AI is literally just art laundering, just like how language models are writing laundering. We tend to use humanizing language, but ultimately it’s a machine that uses a bunch of dials and levers to determine how much a work should resemble one piece from its training set at one point and how much it should resemble another piece elsewhere. There’s a reason why a lot of modern image bots have literal fucking watermarks all over their outputs: because the images were flat out stolen.

            The tech itself is pretty neat; you’re essentially making a virtual brain and having it do useful work. But ultimately all the capitalists running these tools see it as just another method to bring the public under their exclusive and totalitarian control. We could have had a cool roboartist putting out new and unique works, but instead we get people losing their jobs because an inept system hyped up by Silicon Valley fart huffers claimed it could do their work for free, and it only gets worse as these AIs use their own garbage outputs as training data.
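            The “dials and levers” picture above can be made concrete with a toy sketch. To be clear, this is only an illustration of the interpolation intuition the comment describes, not how diffusion models actually work; real models learn statistical patterns rather than storing and mixing whole images, and the “artworks” below are invented pixel lists:

```python
# Two tiny "artworks" as flat pixel lists (hypothetical training pieces).
work_a = [0.0, 0.2, 0.4, 0.6]
work_b = [1.0, 0.8, 0.6, 0.4]

def blend(a, b, weight):
    """Pixel-wise linear mix: weight=1.0 reproduces `a` exactly, 0.0 gives `b`."""
    return [weight * x + (1 - weight) * y for x, y in zip(a, b)]

# With the "dial" pinned to one side, the output is a 1:1 copy of a training piece.
print(blend(work_a, work_b, 1.0) == work_a)  # True
# Anywhere in between is just a mix of the two.
print(blend(work_a, work_b, 0.5))
```

            Whether this toy is a fair model of generative AI is exactly what the thread is arguing about.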

            • @regbin_ · 6 points · 1 year ago · edited

              If you knew the exact terms to get the AI to recreate something in its training data, it could, 1:1.

              That’s because you told it to. Don’t make it recreate existing art then.

              And if you ask it to create you something new, no matter what parameters you use it will look like a mess of garbage data.

              This is not always true. You can train it on a certain style and a photo of a random object, then have it generate an image of the random object in that style. It will “understand” the concept of a style and an object.
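              The style/object separation described above can be sketched as a toy. This is only an illustration of the idea that a model can represent a style and an object as separate learned concepts and combine them; the vectors and the additive mixing here are invented for the sketch, not real model internals:

```python
# Invented "concept vectors": one for a learned style, one for a learned object.
style_watercolor = [0.9, 0.5, 0.0]
object_teapot = [0.0, 0.25, 0.8]

def compose(style, obj):
    """Additively combine two independent concept vectors into one target."""
    return [s + o for s, o in zip(style, obj)]

# The composed target carries information from both concepts.
print(compose(style_watercolor, object_teapot))  # [0.9, 0.75, 0.8]
```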

              ultimately all the capitalists running these tools see it as just another method to bring the public under their exclusive and totalitarian control.

              Exactly why I’m not supporting the closed source paid services (Midjourney, ChatGPT, Bing Chat, DALL-E etc.) and instead advocate for open source projects like Stable Diffusion and LLaMA.

              • @[email protected] · -3 points · 1 year ago · edited

                That’s because you told it to. Don’t make it recreate existing art then.

                If you took a random concept and explained it to a person, they could, using their existing knowledge, draw it somewhat competently. That is because people are able to apply knowledge to make something new. If you told someone to recreate something that already exists, even if they’re a professional, they would never be able to recreate it no matter how much time and effort they put into it. AI can do the latter because it’s basically copying, and it can’t do the former because there’s nothing to copy from.

                • @regbin_ · 6 points · 1 year ago

                  If you took a random concept and explained it to a person, they could, using their existing knowledge, draw it somewhat competently. That is because people are able to apply knowledge to make something new.

                  Theoretically it can, but that would involve meticulous and proper labeling of each piece of training data. Currently, most training data is labeled automatically, and the labels aren’t descriptive or verbose enough. I believe the improvements in the latest version of DALL-E are due to OpenAI’s use of a more advanced image labeler.

                  • @[email protected] · 0 points · 1 year ago

                    Theoretically it can, but it would involve meticulous and proper labeling of each piece of training data.

                    OK so throw more Kenyans at it. Got it!

    • prole · 4 points · 1 year ago

      Coal? Did you miss the nuclear part?