The narrative that OpenAI, Microsoft, and freshly minted White House “AI czar” David Sacks are now pushing to explain why DeepSeek was able to create a large language model that outpaces OpenAI’s while spending orders of magnitude less money and using older chips is that DeepSeek used OpenAI’s data unfairly and without compensation. Sound familiar?

Both Bloomberg and the Financial Times are reporting that Microsoft and OpenAI have been probing whether DeepSeek improperly trained R1, the model that is taking the AI world by storm, on the outputs of OpenAI models.

It is, as many have already pointed out, incredibly ironic that OpenAI, a company that has been obtaining large amounts of data from all of humankind largely in an “unauthorized manner,” and in some cases in violation of the terms of service of those it has been taking from, is now complaining about the very practices by which it has built its company.

OpenAI is currently being sued by the New York Times for training on its articles, and its argument is that this is perfectly fine under copyright law’s fair use protections.

“Training AI models using publicly available internet materials is fair use, as supported by long-standing and widely accepted precedents. We view this principle as fair to creators, necessary for innovators, and critical for US competitiveness,” OpenAI wrote in a blog post. In its motion to dismiss in court, OpenAI wrote “it has long been clear that the non-consumptive use of copyrighted material (like large language model training) is protected by fair use.”

If OpenAI argues that it is legal for the company to train on whatever it wants for whatever reason it wants, then it stands to reason that it doesn’t have much of a leg to stand on when competitors use common strategies from the world of machine learning to make their own models.
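
The “common strategy” at issue is commonly referred to as distillation: one model’s outputs become another model’s training data. As a rough illustration only, here is a minimal sketch of that idea in Python, assuming a hypothetical OpenAI-compatible chat endpoint; the URL, model name, and API key are placeholders, and none of this describes DeepSeek’s or OpenAI’s actual pipelines.

```python
# Hypothetical sketch of distillation-style data collection: ask a "teacher"
# model for completions, then save (prompt, answer) pairs as fine-tuning data
# for a smaller "student" model. Endpoint, model name, and key are placeholders.
import json

import requests

TEACHER_URL = "https://api.example.com/v1/chat/completions"  # hypothetical endpoint
API_KEY = "sk-placeholder"                                   # hypothetical key
TEACHER_MODEL = "teacher-model"                              # hypothetical model name

prompts = [
    "Explain the difference between a GPU and a CPU.",
    "Summarize the idea of fair use in one paragraph.",
]


def ask_teacher(prompt: str) -> str:
    """Request one completion from the teacher model's chat API
    (assumes an OpenAI-style response shape)."""
    resp = requests.post(
        TEACHER_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": TEACHER_MODEL,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


# Collect teacher outputs and write them in a generic JSONL fine-tuning format.
with open("distillation_data.jsonl", "w") as f:
    for prompt in prompts:
        record = {
            "messages": [
                {"role": "user", "content": prompt},
                {"role": "assistant", "content": ask_teacher(prompt)},
            ]
        }
        f.write(json.dumps(record) + "\n")
```

The resulting JSONL file would then feed an ordinary supervised fine-tuning run on a smaller “student” model.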

  • Ulrich

    That’s why “value” is in quotes. It’s not that it didn’t exist, it’s just that it’s purely speculative.

    Hell Nvidia’s stock plummeted as well, which makes no sense at all, considering Deepseek needs the same hardware as ChatGPT.

    Stock investing is just gambling on whatever public opinion happens to be, which is notoriously difficult because people are largely dumb and irrational.

    • Pasta Dental

      Hell Nvidia’s stock plummeted as well, which makes no sense at all, considering Deepseek needs the same hardware as ChatGPT.

      It’s the same hardware; the problem for Nvidia is that DeepSeek found a way to train its AI much more cheaply, using far fewer than the hundreds of thousands of Nvidia GPUs that OpenAI, Meta, xAI, Anthropic, etc. use.

      • Ulrich

        The way they found to train their AI more cheaply isn’t novel; they just stole it from OpenAI (not that I care). They still need GPUs to process the prompts and generate the responses.

    • @[email protected]

      Hell Nvidia’s stock plummeted as well, which makes no sense at all, considering Deepseek needs the same hardware as ChatGPT.

      Common wisdom said that these models need CUDA to run properly, and DeepSeek’s doesn’t.

      • @tabular

        CUDA being taken down a peg is the best part for me. Fuck proprietary APIs.

        • Fushuan [he/him]

          They replaced it with a lower-level, Nvidia-exclusive proprietary API, though.

          People are really misunderstanding what has happened.

          • @tabular

            That’s a damn shame.

      • Ulrich

        Sure but Nvidia still makes the GPUs needed to run them. And AMD is not really competitive in the commercial GPU market.

            • @Sanctus

              Someone should just make an AiPU. I’m tired of all GPUs being priced exorbitantly.

              • Billiam

                Okay, but then why would anyone make non-AiPUs if the tech is the same and they could sell the same amount at a higher cost?

                • @Sanctus

                  Because you could charge more for “AiPUs” than you already are for GPUs, since capitalists have brain rot. Maybe we just need to invest in that open source GPU project, if it’s still around.

                  • Billiam

                    That’s what I said.

                    If a GPU and a hypothetical AiPU are the same tech, but Nvidia could charge more for the AiPU, then why would they make and sell GPUs?

                    It’s the same reason they don’t clamp down on their pricing now: they don’t care whether you are able to buy a GPU; they care that Twitter or Tesla or OpenAI are buying them 10k at a time.

    • @[email protected]

      They need less powerful hardware, and less hardware overall, though; they acted like they needed more.

      • @[email protected]

        Chinese GPUs are not far behind in GFLOPS. Nvidia’s advantage is CUDA, drivers, and interconnect clusters.

        AFAIU, DeepSeek did use CUDA.

        In general, computing advances have rarely resulted in using half as many computers, though I could be wrong at the datacenter/hosting level once the technology matures.

        • Fushuan [he/him]

          Not CUDA, but a lower-level Nvidia proprietary API; your point still stands, though.

    • @[email protected]

      “Valuation,” I suppose. The “value” that we project onto something, whether or not that something has truly earned it.