Chinese artificial intelligence startup DeepSeek’s latest AI model sparked a $1 trillion rout in US and European technology stocks, as investors questioned bloated valuations for some of America’s biggest companies.

  • MudMan
    6 points · 27 days ago

    OK, hold on, so I went over to huggingface and took a look at this.

    DeepSeek is huge. Like Llama 3.3 huge. I haven’t done any benchmarking, which I’m guessing is out there, but surely it would take as much Nvidia muscle to run this at scale as ChatGPT, even if it was much, much cheaper to train, right?

    So is the rout based on the idea that the need for training hardware is much smaller than suspected even if the operation cost is the same… or is the stock market just clueless and dumb and they’re all running on vibes at all times anyway?
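
    For scale, here is some rough napkin math on the inference side. The parameter counts are the publicly reported figures (671B total for DeepSeek-V3/R1, of which roughly 37B are active per token thanks to the MoE design, versus 70B for Llama 3.3), and the sketch only counts the weights, ignoring KV cache and runtime overhead:

    ```python
    # Napkin math only: GPU memory needed just to hold the weights,
    # ignoring KV cache, activations, and framework overhead.
    # Parameter counts below are the publicly reported figures.

    def weight_gb(params_billion: float, bits_per_param: float) -> float:
        """Approximate memory (GB) for the weights at a given precision."""
        return params_billion * 1e9 * (bits_per_param / 8) / 1e9

    models = {
        "DeepSeek-V3/R1 (671B total, ~37B active, MoE)": 671,
        "Llama 3.3 70B": 70,
        "R1 distill on Qwen 32B": 32,
    }

    for name, size_b in models.items():
        print(f"{name}: ~{weight_gb(size_b, 8):.0f} GB @ FP8, "
              f"~{weight_gb(size_b, 4):.0f} GB @ 4-bit")
    ```

    So holding the full model still takes a multi-GPU node either way; the MoE design mainly cuts the compute per token, not the memory footprint.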

    • @OmegaLemmy@discuss.online
      6 points · 27 days ago

      I thought everyone knew stocks were all vibes by now. Private markets might improve with competition, but a public stock will always pick the flashiest option just for appeal, even if it’s shit, or they quite literally lose everything if it goes slightly wrong.

    • @jacksilver
      2 points · 27 days ago

      Everything I’ve seen from looking into it seems to imply it’s on par with other (LLM-only) models for training cost and performance.

      I feel like I’m missing something here or that the market is “correcting” for other reasons.

    • sunzu2
      0 points · 27 days ago

      DeepSeek is based on either Llama or Qwen, but can be put on top of any model?

      I tested Qwen, which sucked dick IMHO

      Now DeepSeek Qwen is the best thing I’ve tried locally
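
      For anyone who wants to try the same thing, here is a minimal sketch of loading one of the R1-distilled Qwen checkpoints locally with Hugging Face transformers. The repo id and generation settings are assumptions to adapt to your own hardware:

      ```python
      # Minimal sketch: run an R1-distilled Qwen checkpoint locally.
      # The repo id is an assumption; swap in whichever distill size fits your GPU.
      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # assumed repo id

      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(
          model_id,
          torch_dtype=torch.bfloat16,   # or quantize to 4-bit via bitsandbytes
          device_map="auto",            # spread across available GPUs/CPU
      )

      messages = [{"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}]
      inputs = tokenizer.apply_chat_template(
          messages, add_generation_prompt=True, return_tensors="pt"
      ).to(model.device)

      output = model.generate(inputs, max_new_tokens=256)
      print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
      ```

      Unlike the full 671B model, the distills fit on a single consumer GPU, which is what makes them practical to run locally.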