• @RedditWanderer
    53 • 1 day ago

    Watch them suddenly try to ban this Chinese code under the same stuff they didn’t want to go after TikTok for.

    • @brucethemoose
      17 • edited • 1 day ago

      Deepseek R1 runs with open source code from an American company, specifically Huggingface.

      They have their own secret-sauce inference code, sure, but they also documented it at a high level in the paper, so a US company can recreate it if it wants.

      There’s nothing they can do, short of a Hitler-esque “all open models are banned; you must use these select American APIs by law.” That would be like telling the US “everyone must use Bing and the Bing API for all search queries; anything else is illegal.”
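
      To make the “open source code from Huggingface” point concrete, here is a minimal sketch of pulling one of the published R1 distills through the open-source transformers library. The model id is the real published checkpoint; the `load` helper is a hypothetical wrapper that defers the (very heavy) weight download until called:

```python
MODEL_ID = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # published distill checkpoint

def load(model_id: str = MODEL_ID):
    """Load tokenizer + weights via the open-source transformers stack.

    Heavy: the first call downloads ~16 GB of weights from the Hub,
    so the import and download are deferred to call time.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # pick the checkpoint's native precision
        device_map="auto",    # spread across available GPUs/CPU
    )
    return tok, model
```

      Nothing in that path touches a Chinese-hosted service: the weights come from the Hub and the inference code is the American open-source stack.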

      • @RedditWanderer
        10 • 1 day ago

        Ah well, if it’s only Hitler-esque stuff then I guess we’re safe? /s

      • @[email protected]
        4 • 1 day ago

        Clearly that will get the US back to number 1, right? Just lock it all down; what could go wrong?

        • @brucethemoose
          1 • 1 day ago

          Or, *gasp*, advocate for open source development in the US?

          Unthinkable, right?

    • ✺roguetrick✺
      20 • 1 day ago

      At this point, regulatory capture is expected in the States. But we don’t have a corrupt government. Not at all.

      • Sabata
        25 • 1 day ago

        Oh no, don’t make me torrent my illegal and unregulated AI like a cool cyberpunk hacker.

    • @Grimy
      8 • edited • 1 day ago

      They are already talking about it.

      > U.S. officials are looking at the national security implications of the Chinese artificial intelligence app DeepSeek, White House press secretary Karoline Leavitt said on Tuesday, while President Donald Trump’s crypto czar said it was possible that intellectual property theft could have been at play.

      https://archive.ph/t37xU

      • @[email protected]
        8 • 1 day ago

        They might try, but if their goal was to destabilize Western dominance in LLMs, making it completely open source was the best way to do it.

        This isn’t like TikTok. They have a server that hosts it, but anyone can take their model and run it, and there are going to be a lot of US companies besides the big AI ones looking at it. Even the big AI companies will likely adapt these methods in place of the brute-force scaling they’ve spent so long on to get improvements.

        The thing is, it’s less about the actual model and more about the method. Training models like DeepSeek’s takes nowhere near the resources that US companies have been spending. It means there is no longer going to be just a small group hoarding the tech and charging absurd amounts for it.
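
        For scale: the DeepSeek-V3 technical report puts the final training run at about 2.788M H800 GPU-hours, and the ~$2 per GPU-hour rental rate is the report’s own assumption. The headline “trained for under $6M” figure is just that multiplication (excluding prior research and ablation runs):

```python
# Figures from the DeepSeek-V3 technical report (final run only);
# the $2/GPU-hour H800 rental rate is the report's own assumption.
gpu_hours = 2.788e6        # H800 GPU-hours for the final training run
usd_per_gpu_hour = 2.0     # assumed rental cost per GPU-hour
training_cost = gpu_hours * usd_per_gpu_hour
print(f"~${training_cost / 1e6:.1f}M")  # → ~$5.6M
```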

        Running the model can be no more taxing than playing a modern video game, except the load is not constant.

        The cat is out of the bag. They could theoretically ban the models released directly by the research team, but retrained variants are going to be hard to differentiate from from-scratch models. And the original model is all over the place and has had people hacking away at it.

        Blocking access to their hosted service right now would just be petty, but I do expect that from the current administration…

        • @brucethemoose
          2 • edited • 1 day ago

          > Running the model can be no more taxing than playing a modern video game, except the load is not constant.

          This is not true; DeepSeek R1 is huge. There’s a lot of confusion between the smaller distillations based on Qwen 2.5 (some of which can run on consumer GPUs) and the “full” DeepSeek R1 based on DeepSeek-V3.

          Your point mostly stands, but the “full” model is hundreds of gigabytes, and the paper mentioned something like a bank of 370 GPUs being optimal for hosting. It’s very efficient because only around 30B parameters are active per token, which is bonkers, but still.
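
          Rough numbers behind “hundreds of gigabytes”: the publicly stated figures for DeepSeek-V3/R1 are ~671B total parameters with ~37B activated per token (the “like 30B” above is the right ballpark). A back-of-envelope sketch, assuming FP8 weights at one byte per parameter and ignoring KV cache and activations:

```python
total_params = 671e9    # DeepSeek-V3/R1 total parameter count (public figure)
active_params = 37e9    # parameters activated per token (MoE routing)
bytes_per_param = 1     # FP8 weights; KV cache and activations ignored

weights_gb = total_params * bytes_per_param / 1e9
active_gb = active_params * bytes_per_param / 1e9
print(f"full weights: ~{weights_gb:.0f} GB, active per token: ~{active_gb:.0f} GB")
# → full weights: ~671 GB, active per token: ~37 GB
```

          So the whole model has to sit in (distributed) memory even though only a small slice does work on any given token, which is why it is cheap to serve per query but far beyond a single gaming GPU.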