• Echo Dot
    1 day ago

    Right, but my understanding is you still need OpenAI's models in order to have something to distill from. So presumably you still need the 500 trillion GPUs and 75% of the world's power-generating capacity.

    • @InputZero
      1 day ago

      The message that OpenAI, Nvidia, and the others that bet big on AI delivered was that no one else could run AI, because only they had the resources to do so. They claimed a physical monopoly; no one else would be able to compete. Enter DeepSeek, doing exactly what OpenAI and Nvidia said was impossible. Suddenly there is competition, and that scared investors, because their investments in AI are no longer guaranteed wins. It doesn't matter that it's derivative; it's competition.

      • Echo Dot
        1 day ago

        Yes, I know, but what I'm saying is they're just repackaging something that OpenAI did; you still need OpenAI making advances if you want R1 to ever get any brighter.

        They aren’t training on large data sets themselves, they are training on the output of AIs that are trained on large data sets.
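What's being described here is knowledge distillation: a student model is fit to a teacher's soft outputs rather than to the original labeled data. A toy sketch in Python with NumPy; every name, shape, and hyperparameter below is illustrative, not anyone's actual training setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical "teacher": fixed weights standing in for a large
# pretrained model that was trained on the big, expensive data set.
W_teacher = rng.normal(size=(4, 3))
X = rng.normal(size=(256, 4))            # unlabeled inputs only
soft_labels = softmax(X @ W_teacher)     # teacher's output distributions

# "Student": starts from scratch and never sees ground-truth labels,
# only the teacher's soft outputs.
W_student = np.zeros((4, 3))
lr = 0.5
for _ in range(200):
    probs = softmax(X @ W_student)
    # Gradient of cross-entropy between student output and soft labels.
    grad = X.T @ (probs - soft_labels) / len(X)
    W_student -= lr * grad

# Fraction of inputs where the student's top prediction matches the teacher's.
agreement = np.mean(softmax(X @ W_student).argmax(1) == soft_labels.argmax(1))
```

The point of the sketch: the student's quality is capped by the teacher's, which is exactly the "R1 can't get brighter without OpenAI advancing" argument above.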

        • @InputZero
          1 day ago

          Oh, I totally agree; I probably could have made my comment less argumentative. It's not truly revolutionary until someone produces an AI training method that doesn't consume the energy of a small nation to get results in a reasonable amount of time. And that's not even mentioning the fact that these large data sets already include everything, and that's still not enough. I'm glad there's a competitive project, even if I'm going to wait a while and let smarter people than me suss it out.