Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes 1 × 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
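The headline implies roughly 20 Wh per phone charge (140 Wh ÷ 7). A quick sanity check of that arithmetic; the ~20 Wh battery capacity it implies is just what the claim works out to, not an official spec:

```python
# Sanity-check the headline arithmetic: 140 Wh split across 7 full charges.
EMAIL_ENERGY_WH = 140   # claimed energy for one 100-word email
CHARGES_CLAIMED = 7     # claimed number of full phone charges

wh_per_charge = EMAIL_ENERGY_WH / CHARGES_CLAIMED
print(f"Implied battery capacity: {wh_per_charge:.0f} Wh per charge")  # → 20 Wh
```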

  • @[email protected]
    8 points · 1 month ago (edited)

    Datacenter LLM tranches are 7–8 H100s per user at full load, which is around 4 kW.

    Multiply that by generation time and you get your energy used. Say it takes 62 seconds to write an essay (a highly conservative figure).

    That’s 68.8 Wh, so you’re right.

    Source: I’m an AI enthusiast
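The arithmetic above can be reproduced directly (a sketch assuming a sustained 4 kW draw for the full 62 seconds):

```python
# Energy (Wh) = power (W) × time (s) / 3600 (seconds per hour).
POWER_W = 4_000     # ~7-8 H100s at full load, per the comment above
GEN_TIME_S = 62     # assumed generation time for the essay

energy_wh = POWER_W * GEN_TIME_S / 3600
print(f"{energy_wh:.1f} Wh")  # → 68.9 Wh, in line with the ~68.8 Wh figure
```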

    • @bandwidthcrisis
      5 points · 1 month ago

      Well that’s of the same order of magnitude as the quoted figure. I was suggesting that it sounded vastly larger than it should be.

    • JWBananas
      1 point · 1 month ago

      Does that account for cooling? Storage? Networking? Non-H100 compute and memory?

      • @[email protected]
        1 point · 1 month ago

        Nope. Just GPU board power draw. 60 seconds is also pretty long with how fast these enterprise cards are but I’m assuming they’re using a giant 450B or 1270B model.

    • @[email protected]
      0 points · 1 month ago

      kW is a unit of instantaneous power; kW/s makes no sense. Note how multiplying that by seconds would cancel time out and return you power again instead of energy. You got there in the end, though.
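The unit cancellation being pointed out can be made explicit by tracking exponents on base units (a toy sketch: exponent +1 for a unit in the numerator, -1 for one in the denominator):

```python
# Track units as {symbol: exponent}; multiplying quantities adds exponents.
def multiply_units(a: dict, b: dict) -> dict:
    out = {}
    for sym in set(a) | set(b):
        exp = a.get(sym, 0) + b.get(sym, 0)
        if exp:
            out[sym] = exp
    return out

kW = {"kW": 1}                  # power
kW_per_s = {"kW": 1, "s": -1}   # the mistaken unit
seconds = {"s": 1}

print(multiply_units(kW_per_s, seconds))  # → {'kW': 1}: power again, not energy
print(multiply_units(kW, seconds))        # kW and s both survive: kW·s, an energy
```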