• @[email protected]
    -5 points · edited · 3 days ago

    LLM inference can be batched, reducing the cost per request. If you have too few customers, you can’t fill the optimal batch size.

    That said, the optimal batch size on today’s hardware is not big (<100). I would be very very surprised if they couldn’t fill it for any few-seconds window.
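    To make the batching point concrete, here’s a toy sketch (Python, with invented placeholder costs, not anyone’s real numbers): the fixed cost of a forward pass is shared by every request in the batch, so cost per request falls steeply up to a modest batch size and then flattens out.

    ```python
    # Toy model of batched LLM inference cost. All numbers are invented
    # placeholders for illustration, not real GPU or OpenAI figures.

    def cost_per_request(batch_size: int,
                         fixed_cost_per_pass: float = 0.010,     # assumed cost of one batched forward pass ($)
                         marginal_cost_per_item: float = 0.0005  # assumed extra cost per request in the batch ($)
                         ) -> float:
        """The fixed cost of a forward pass is amortised over every request in the batch."""
        return fixed_cost_per_pass / batch_size + marginal_cost_per_item

    for n in (1, 8, 32, 64, 128):
        print(f"batch={n:4d}  cost/request=${cost_per_request(n):.5f}")
    # Savings are steep up to a few dozen requests per batch, then flatten out --
    # which is the "optimal batch size is not big" point.
    ```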

    • flere-imsaho
      4 points · 2 days ago

      i would swear that in an earlier version of this message the optimal batch size was estimated to be as large as twenty.

    • David Gerard
      10 points · 3 days ago

      this sounds like an attempt to demand others disprove the assertion that they’re losing money, in a discussion of an article about Sam saying they’re losing money

      • @[email protected]
        -5 points · 3 days ago

        What? I’m not doubting what he said. Just surprised. Look at this. I really hope Sam IPOs his company so I can short it.

          • @[email protected]
            -3 points · edited · 2 days ago

            Can someone explain why I am being downvoted and attacked in this thread? I swear I am not sealioning. Genuinely confused.

            @[email protected] asked how request frequency might impact cost per request. Batched inference is one reason it does (ask anyone in the self-hosted LLM community). I noted that this reason only matters at very small scale, probably much smaller than the scale OpenAI is operating at.

            @[email protected] why did you say I am demanding someone disprove the assertion? Are you misunderstanding “I would be very very surprised if they couldn’t fill [the optimal batch size] for any few-seconds window” to mean “I would be very very surprised if they are not profitable”?

            The tweet I linked shows that good LLMs can be much cheaper. I am saying that OpenAI is very inefficient and thus economically “cooked”, as the post title has it. How does this make me FYGM? @[email protected]

            • @[email protected]
              9 points · 2 days ago

              > Can someone explain why I am being downvoted and attacked in this thread? I swear I am not sealioning. Genuinely confused.

              my god! let me fix that