• Domi
    57 days ago

    Hosting a model of that size requires ~800GB of VRAM. Even if they released their models, it wouldn’t make them obsolete, since most people and many companies couldn’t host them anyway.
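A quick sanity check on the ~800GB figure. Assumptions are mine, not from the thread: a model with roughly 400B parameters served at 16-bit precision, counting weights only (activations and KV cache would add more).

```python
# Back-of-the-envelope VRAM estimate for serving a large model.
# Assumption (not from the thread): ~400B parameters at 16-bit
# precision; the weights alone need params * 2 bytes.

def weights_vram_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """VRAM needed just to hold the weights, in GB."""
    return params_billion * 1e9 * bytes_per_param / 1e9

print(weights_vram_gb(400))  # 800.0 -- matches the ~800GB figure above
```

Halving the precision (8-bit quantization) halves the footprint, which is why quantized hosting is the usual workaround, at some quality cost.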

    • @[email protected]
      6 days ago

      Anyone can now provide that service. Why pay OpenAI when you can pay a different provider that is cheaper, or better aligned with your needs, ethics, or legal requirements?

      • Domi
        16 days ago

        Anyone who has $300,000 per instance, the know-how to set it up, the means to support it, and can outbid OpenAI, yes.
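A rough sketch of where a ~$300,000-per-instance figure could come from, tying it back to the ~800GB VRAM estimate earlier in the thread. The per-GPU price and 80GB capacity (H100-class hardware) are my assumptions, not from the thread, and real prices vary widely.

```python
# Assumptions (mine): 80GB GPUs at roughly $30,000 each.

def gpus_needed(total_vram_gb: int, vram_per_gpu_gb: int = 80) -> int:
    """Minimum number of GPUs whose combined VRAM covers the total."""
    return -(-total_vram_gb // vram_per_gpu_gb)  # ceiling division

gpus = gpus_needed(800)   # 10 GPUs to hold ~800GB of weights
cost = gpus * 30_000      # ~$300,000 in GPUs alone
print(gpus, cost)         # 10 300000
```

That covers hardware only; power, networking, and operations come on top, which supports the point about this not happening at large scale.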

        I don’t see that happening on a large scale, just like I don’t see tons of DeepSeek instances being hosted cheaper than the original any time soon.

        If they really are afraid of that, they can always license the models in a way that forbids reselling.