In the week since a Chinese AI model called DeepSeek became a household name, a dizzying number of narratives have gained steam, with varying degrees of accuracy […] perhaps most notably, that DeepSeek’s new, more efficient approach means AI might not need to guzzle the massive amounts of energy that it currently does.

The latter notion is misleading, and new numbers shared with MIT Technology Review help show why. These early figures—based on the performance of one of DeepSeek's smaller models on a small number of prompts—suggest it could be more energy-intensive when generating responses than the equivalent-size model from Meta. The issue might be that the energy it saves in training is offset by its more intensive techniques for answering questions, and by the long answers those techniques produce.

Add the fact that other tech firms, inspired by DeepSeek’s approach, may now start building their own similar low-cost reasoning models, and the outlook for energy consumption is already looking a lot less rosy.

  • @[email protected]

    How about we do this: everyone type their questions, and when it’s windy and all those wind turbines start generating power, all the questions get answered.

    • @MutilationWave

      I like it. It’s like praying to the storm god.

      • @[email protected]

Assuming DeepSeek can actually be run locally, you would just need a laptop, a dynamo, and the Poetic Edda to use as the installation prompt.

  • Vaggumon

    This stinks of desperation to change the narrative.

    • @[email protected]OP

I mean, no, not really? "New AI is not as energy-efficient as first advertised" is just a special case of "AI is not as advertised," i.e., the least surprising turn of events.