Visual artists fight back against AI companies for repurposing their work
Three visual artists are suing artificial intelligence image generators to protect their copyrights and careers.

  • @FooBarrington
    1 year ago

    The issue isn’t that you’re being concise, it’s that you’re throwing around words that don’t have a clear definition and expecting your definition to be broadly shared. You keep referring to understanding, yet objective evidence of understanding is met only with “but it’s not creative”.

    • MentalEdge
      1 year ago

      Are you suggesting there is valid evidence modern ML models are capable of understanding?

      I don’t see how that could be true for any definition of the word.

      • @FooBarrington
        1 year ago

        As I’ve shared 3 times already: Yes, there is valid evidence that modern ML models are capable of understanding. Why do I have to repeat it a fourth time?

        I don’t see how that could be true for any definition of the word.

        Then explain to me how it isn’t true given the evidence:

        Language models show a surprising range of capabilities, but the source of their apparent competence is unclear. Do these networks just memorize a collection of surface statistics, or do they rely on internal representations of the process that generates the sequences they see? We investigate this question by applying a variant of the GPT model to the task of predicting legal moves in a simple board game, Othello. Although the network has no a priori knowledge of the game or its rules, we uncover evidence of an emergent nonlinear internal representation of the board state. Interventional experiments indicate this representation can be used to control the output of the network and create “latent saliency maps” that can help explain predictions in human terms.

        https://arxiv.org/abs/2210.13382

        I don’t see how an emergent nonlinear internal representation of the board state is anything besides “understanding” it.
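
        Here’s a rough sketch of the probing setup the paper describes: train a small MLP “probe” to read the board state out of the model’s hidden activations. The names and shapes are my own illustration, not the paper’s released code; high probe accuracy on held-out games is what counts as evidence of an internal board representation.

        ```python
        import torch
        import torch.nn as nn

        D_MODEL = 512    # width of the sequence model's hidden states (assumed)
        N_SQUARES = 64   # Othello board: 8 x 8 squares
        N_STATES = 3     # each square: empty, current player's, or opponent's

        class BoardProbe(nn.Module):
            """Nonlinear (2-layer MLP) probe: hidden state -> per-square board state."""
            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Linear(D_MODEL, 256),
                    nn.ReLU(),
                    nn.Linear(256, N_SQUARES * N_STATES),
                )

            def forward(self, h):  # h: (batch, D_MODEL)
                return self.net(h).view(-1, N_SQUARES, N_STATES)

        probe = BoardProbe()
        opt = torch.optim.Adam(probe.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()

        # Stand-in tensors: in the real experiment, `hidden_states` comes from
        # the GPT trained on Othello move sequences, and `board_labels` from a
        # game simulator tracking the true board after each move.
        hidden_states = torch.randn(32, D_MODEL)
        board_labels = torch.randint(0, N_STATES, (32, N_SQUARES))

        opt.zero_grad()
        logits = probe(hidden_states)  # (32, 64, 3)
        loss = loss_fn(logits.reshape(-1, N_STATES), board_labels.reshape(-1))
        loss.backward()
        opt.step()
        ```

        The paper’s interventional experiments then go a step further: edit the probed board representation inside the network and watch the predicted moves change accordingly.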

        • MentalEdge
          1 year ago

          Cool. But this is still stuff that has a “right” answer. Math. Math in the form of game rules, but still math.

          I have seen no evidence that MLs can comprehend the abstract: to know, or more accurately model, the human experience. It’s not even clear that, given a conscious entity, it is possible to communicate about being human to something non-human.

          I am amazed, but not surprised, that you can explain a “system” to an LLM. However, doing the same for a concept, or human emotion, is not something I think is possible.

          • @FooBarrington
            1 year ago

            Cool. But this is still stuff that has a “right” answer.

            What are you talking about? You wanted evidence that NNs can understand stuff, and I showed you evidence.

            Math. Math in the form of game rules, but still math.

            Yes, and math can represent whatever you want. It can represent language, it can represent physics, it can even represent a human brain. Don’t assume we are more than incredibly complicated machines. If you want to argue “it’s just math”, then show me that anything isn’t just math.
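
            To make “game rules are just math” concrete, here’s Othello’s legality rule written as a pure function over board state. This is a minimal sketch; the board encoding (0 empty, 1 current player, -1 opponent) is my own illustration.

            ```python
            # All eight directions a flanking line can run in.
            DIRECTIONS = [(-1, -1), (-1, 0), (-1, 1),
                          (0, -1),           (0, 1),
                          (1, -1),  (1, 0),  (1, 1)]

            def is_legal(board, row, col, player):
                """A move is legal iff it flanks at least one line of opponent discs."""
                if board[row][col] != 0:
                    return False
                for dr, dc in DIRECTIONS:
                    r, c = row + dr, col + dc
                    seen_opponent = False
                    while 0 <= r < 8 and 0 <= c < 8 and board[r][c] == -player:
                        seen_opponent = True
                        r, c = r + dr, c + dc
                    if seen_opponent and 0 <= r < 8 and 0 <= c < 8 and board[r][c] == player:
                        return True
                return False

            # Standard opening position: (2, 3) is legal for the current player.
            board = [[0] * 8 for _ in range(8)]
            board[3][3], board[4][4] = -1, -1   # opponent's discs
            board[3][4], board[4][3] = 1, 1     # current player's discs
            print(is_legal(board, 2, 3, 1))     # True: flanks (3, 3) downward
            ```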

            I have seen no evidence that MLs can comprehend the abstract: to know, or more accurately model, the human experience. It’s not even clear that, given a conscious entity, it is possible to communicate about being human to something non-human.

            See? And that’s the handwaving. You’re talking about “the human experience” as if that’s a thing with an actual definition. Why is “the human experience” relevant to whether NNs can understand things?

            I am amazed, but not surprised, that you can explain a “system” to an LLM. However, doing the same for a concept is not something I think is possible.

            And the next handwave: what is a concept? How is “the board in Othello” not a concept?

            • MentalEdge
              1 year ago

              Modern MLs are nowhere near complex enough to model reality to the extent required for genuine artistic expression.

              That you need me to say this in an essay instead of a sentence is your problem, not mine.

              • @FooBarrington
                1 year ago

                Modern MLs are nowhere near complex enough to model reality to the extent required for genuine artistic expression.

                You’d have to bring up actual evidence for this. Easiest would be to start by defining “genuine artistic expression”. But I have a feeling you’ll just resort to the next handwave…

                Thank you for confirming that your position doesn’t make any sense.

                • MentalEdge
                  1 year ago

                  Thank you for confirming that your position doesn’t make any sense.

                  Rude. Thanks for confirming my choice to minimize the effort I spend on you, I guess.