This is something I’ve been speculating about for a while. The cost of running these complex systems (OpenAI’s models aren’t just LLMs) is subsidized so heavily that we don’t really know the true cost of running them.
This is a huge risk for any business building on them, because the price of these services has to go up significantly in the long term.
Ed Zitron calculated from the publicly available numbers that OpenAI was spending $2.35 for every $1 of ChatGPT it sells.
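Just to spell out what that ratio would imply, a rough back-of-the-envelope sketch (taking Zitron’s $2.35-per-$1 figure as given, not an audited number):

```python
# Back-of-the-envelope using Zitron's estimated ratio (assumed figure, not from OpenAI's books)
revenue = 1.00               # $1 of ChatGPT revenue
cost = 2.35                  # estimated spend to deliver that $1
loss = cost - revenue        # ~$1.35 lost per $1 sold
print(f"loss per $1 of revenue: ${loss:.2f} ({-loss / revenue:.0%} margin)")
```

In other words, on that basis every extra dollar of ChatGPT revenue widens the loss rather than closing it, unless costs come down or prices go up.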
Is that for all operations, or literally just to run the paid services? Because if that includes the free tier, marketing, and R&D, then they have a lot of options to cut costs.
Given what AWS etc. charge for their LLM APIs, it feels like the entire industry is subsidizing LLM compute to stay competitive. But I could be wrong there.
Was it Altman who tweeted that they were near the singularity? I assumed it was a way to raise money. It felt more like “Fuck! We need more money to burn.”
They were only “near AGI” before their most recent funding rounds closed; after that they were “a few thousand days” away.