Writing a 100-word email using ChatGPT (GPT-4, the latest model) consumes one 500 ml bottle of water. It also uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
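Taking the headline figures at face value, the iPhone comparison is internally consistent; a quick check, assuming a Pro Max battery holds roughly 20 Wh (that capacity is an assumption, not from the post):

```python
# Energy figure is from the claim above; the battery capacity is an
# assumption (~20 Wh is in the right range for a recent iPhone Pro Max).
EMAIL_ENERGY_WH = 140     # claimed energy for one 100-word email
IPHONE_BATTERY_WH = 20    # assumed usable battery capacity, in Wh

charges = EMAIL_ENERGY_WH / IPHONE_BATTERY_WH
print(f"{charges:.0f} full charges")
```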

  • @[email protected]
    8
    1 month ago

    I’ve run an LLM on my desktop GPU and gotten decent results, albeit not nearly as good as what ChatGPT will get you.

    Probably used less than 0.1 Wh per response.

    • @Monsieurmouche
      1
      1 month ago

      Is this for inferencing only? Do you include training?

      • @[email protected]
        2
        1 month ago

        Inference only. I’m looking into doing some fine tuning. Training from scratch is another story.

      • @[email protected]
        2
        1 month ago

        Training is a one-time thing. The more it gets used, the less energy per query it will take.

        • @Monsieurmouche
          1
          30 days ago

          Good point. But considering the frequent retraining, the environmental impact can only be spread over a finite number of queries.

          • @[email protected]
            1
            30 days ago

            They have already reached diminishing returns on training, so it will become much less frequent soon. Retraining on the same data is useless unless there is a better method. I think the resources consumed per query should only include those actually used for inference. The rest can be dismissed as bad-faith argumentation.
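The back-and-forth above reduces to simple arithmetic: a per-query figure is the inference energy (roughly GPU power draw times generation time) plus the one-time training energy amortized over every query the model ever serves. A minimal sketch, with every number below assumed purely for illustration:

```python
# All figures are illustrative assumptions, not measurements.
GPU_POWER_W = 150            # assumed average GPU draw while generating
RESPONSE_TIME_S = 2          # assumed seconds per response on a desktop GPU
TRAINING_ENERGY_WH = 1e9     # assumed one-time training energy for the model

def inference_wh():
    """Energy for a single response: watts * seconds -> watt-hours."""
    return GPU_POWER_W * RESPONSE_TIME_S / 3600

def energy_per_query(total_queries):
    """Inference energy plus the training cost amortized over all queries."""
    return inference_wh() + TRAINING_ENERGY_WH / total_queries

# The amortized training share shrinks as the model serves more queries.
print(f"inference only: {inference_wh():.3f} Wh")
print(f"at 1M queries:  {energy_per_query(1_000_000):.1f} Wh")
print(f"at 10B queries: {energy_per_query(10_000_000_000):.3f} Wh")
```

This is the whole disagreement in two lines: whether the `TRAINING_ENERGY_WH / total_queries` term belongs in the per-query figure, and how large `total_queries` gets before the next retraining resets it.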