• @normalexit
    link
    15 hours ago

    The cost is a function of running an LLM at scale. You can run small models on consumer hardware, but the real contenders require massive amounts of memory and compute across GPU clusters (plus electricity and water for cooling).
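    A quick back-of-envelope sketch of why scale drives the cost. Every number here is an illustrative assumption (GPU rental rate, cluster size, throughput), not a measured figure:

    ```python
    # Rough serving cost per million output tokens.
    # All figures below are assumed for illustration only.
    gpu_hourly_cost = 2.50    # assumed $/hr to rent one datacenter GPU
    gpus_per_replica = 8      # assumed GPUs needed to hold a large model
    tokens_per_second = 400   # assumed aggregate throughput of one replica

    cost_per_token = (gpu_hourly_cost * gpus_per_replica) / (tokens_per_second * 3600)
    cost_per_million = cost_per_token * 1e6
    print(f"~${cost_per_million:.2f} per million tokens")
    ```

    Even with these made-up numbers, the point holds: a small model on one consumer GPU collapses most of those terms, while a frontier model multiplies all of them.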

    OpenAI is reportedly losing money even on its $200/mo ChatGPT Pro subscription plan.