Investors are barely breaking even, as the venture is making hardly any profit amid a chip shortage, divided interests, and more.

… OpenAI has already seen a $540 million loss since debuting ChatGPT.

… OpenAI spends approximately $700,000 per day to keep the tool running.


⚠️ First off, apologies: I didn’t cross-check this, so take it with a grain of salt.


This piece of news, if true, somewhat explains why OpenAI has been coming up with weird schemes for making money, like entering the content moderation space.

On a similar note, I wonder if this has been a key driver (behind the scenes) of the recent investment in open source AI initiatives (Haidra comes to mind?). Perhaps some corporations that don’t have enough money to fund their own dedicated research groups are looking to benefit from an open source model?

  • @[email protected]
    link
    fedilink
    English
    6311 months ago

    Too much is made of the shrinking user base. I’m sure they’ll come back with a vengeance come the start of the school year in the northern hemisphere.

    Also, maybe a tool like this shouldn’t be privately funded? Most of the technology is based on university-funded research we all paid for. mRNA vaccine research was similarly funded with public money, mostly at universities, and now we have to pay a private company to sell it back to us. How is that efficient? AI should be common property.

    • Uranium3006 · 27 points · 11 months ago

      Honestly I’d rather have open source AI I can run locally. Even for something like GPT-4, an enterprise-scale operation could afford the hardware.

        • Uranium3006 · 3 points · 11 months ago

          I’m aware of local AI, but you need a really beefy GPU to run even the smaller models, and the good shit is good because it’s big.
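
          For context, a minimal sketch of what running one of those smaller models locally typically looks like with the Hugging Face transformers library (the checkpoint name and settings are illustrative assumptions, not a recommendation):

          ```python
          # Minimal local-inference sketch: load a ~7B-parameter chat model in half precision.
          # Even at fp16 this needs roughly 14 GB of VRAM, which is why a beefy GPU is required.
          import torch
          from transformers import AutoModelForCausalLM, AutoTokenizer

          model_id = "meta-llama/Llama-2-7b-chat-hf"  # illustrative checkpoint

          tokenizer = AutoTokenizer.from_pretrained(model_id)
          model = AutoModelForCausalLM.from_pretrained(
              model_id,
              torch_dtype=torch.float16,  # half precision to roughly halve memory use
              device_map="auto",          # needs the accelerate package; places layers on the GPU
          )

          prompt = "Explain why large language models need so much GPU memory."
          inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
          output = model.generate(**inputs, max_new_tokens=100)
          print(tokenizer.decode(output[0], skip_special_tokens=True))
          ```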

    • @[email protected]
      link
      fedilink
      English
      5
      edit-2
      11 months ago

      If it’s made from all of us it should be free for all of us.

      I’m fine with these researchers going out and scraping the social networks to train models; it’s incredibly advantageous to society in general. But there has to be crystal-clear transparency, and it has to be limitlessly free to all who want it.

      It’s the only way that any of this won’t result in another massive boundary between the 1% and us pod-living grunts. It’s already a decisively powerful technology when harnessed adversarially; that power is reduced when everyone has access to it as well.

  • @j4k3 · 55 points · 11 months ago

    OpenAI died the moment Meta’s Llama model weights were replicated completely open source. The outcome is guaranteed. It does not matter how much better the enormous proprietary model can be, people will never be okay with the level of intrusive data mining required for OpenAI or Google’s business model. Personal AI tech must be open source and transparent with offline execution. AI is the framework of a new digital economy, not the product.

    • TheEntity · 85 points · 11 months ago

      “people will never be okay with the level of intrusive data mining required for OpenAI or Google’s business model”

      Where do you meet these people? I need more of such people in my life.

      • Riker_Maneuver · 27 points · 11 months ago

        Yeah, I was about to say, 99% of people are either unaware or do not care. Don’t take Lemmy’s privacy opinions as representative of the general population.

      • @[email protected]
        link
        fedilink
        711 months ago

        There’s currently a meeting in Germany with about 4000 of them.

        But it’s not the prevailing mindset in the general population.

    • @[email protected]
      link
      fedilink
      English
      1611 months ago

      “AI is the framework of a new digital economy, not the product.”

      That is one interesting sentence. Thanks.

    • krellor · 4 points · 11 months ago

      I don’t think it’s so much that the Meta model was replicated as that they fully open-sourced it with a license for research and commercial use.

      I actually think market demand for fully offline AI will be fairly small. The largest potential customers might be governments that require fully offline hosting, and there is a small group of companies servicing that niche. But even government customers who require that their data be segmented are simply having enclaves set up by the big cloud platforms, which guarantee that input data isn’t fed into the training process and doesn’t leave the customer environment.

      I fully support folks who sustain open source AI frameworks, but in terms of commercial customers that will drive industry trends with dollars, I expect there will be demand for hosted solutions that use proprietary models.

      • @[email protected]
        link
        fedilink
        311 months ago

        Yeah, but not models that are trained on data that raises copyright concerns, which is currently the case.

  • Maximilious · 32 points · 11 months ago

    Have they asked the engine how they can generate more revenue?

    • Tigbitties · 15 points · 11 months ago

      $700k/day. That’s roughly $255 million per year. There are 360 million paid seats of Microsoft 365, so they’d only have to raise prices by about $0.71 per seat per year to cover the cost. No one would notice.
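
      The arithmetic, for anyone who wants to check it (the $700k/day figure is the article’s; the 360 million seat count is this commenter’s number):

      ```python
      # Back-of-the-envelope: spread ChatGPT's reported daily cost across Microsoft 365 seats.
      daily_cost = 700_000            # reported operating cost, USD per day (from the article)
      annual_cost = daily_cost * 365  # ≈ $255.5 million per year
      paid_seats = 360_000_000        # commenter's figure for paid Microsoft 365 seats

      print(f"Annual cost: ${annual_cost / 1e6:.1f} million")       # 255.5
      print(f"Per seat per year: ${annual_cost / paid_seats:.2f}")  # 0.71
      ```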

    • The Prompt:
      How can OpenAI generate more revenue?

      The Response:
      OpenAI, the organization behind the development of AI models like GPT-3, has a few strategies to generate revenue:

      1. Licensing and Subscription Models: OpenAI can license its AI models and technologies to businesses, developers, and organizations for various applications. This could involve offering different pricing tiers based on usage, features, and support.

      2. API Access: OpenAI can offer access to its AI models through APIs (Application Programming Interfaces). Developers and companies can integrate these APIs into their products, applications, or services, paying based on the amount of usage (see the sketch after this list).

      3. Customized Solutions: OpenAI can collaborate with specific industries or companies to develop customized AI solutions tailored to their needs. These solutions can range from customer support bots to content creation tools.

      4. Premium Features: OpenAI could offer advanced features, improved performance, or specialized functionality as part of premium packages, enticing users to upgrade their subscriptions.

      5. Partnerships and Collaborations: OpenAI can collaborate with other tech companies, research institutions, or industries to jointly develop and commercialize AI applications.

      6. Consulting and Services: OpenAI could provide consulting services, training, and expertise in AI technology to help organizations integrate AI into their operations effectively.

      7. Training and Workshops: OpenAI can offer training programs, workshops, and educational resources to help individuals and businesses better understand and utilize AI.

      8. Data Services: OpenAI can offer datasets, annotations, and data-related services to researchers, businesses, and developers working on machine learning projects.

      9. Vertical-Specific Solutions: Developing AI solutions tailored for specific industries, such as healthcare, finance, or marketing, and offering them as industry-specific packages.

      10. Community Contributions: OpenAI can build a community around its AI models and offer a platform for users to share and monetize their own creations, benefiting from the broader ecosystem.

      It’s important to note that OpenAI’s approach to generating revenue will likely be influenced by factors such as market demand, ethical considerations, user feedback, and the evolving landscape of AI technologies and regulations. OpenAI aims to strike a balance between generating revenue and advancing its mission of ensuring AI benefits all of humanity.
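
      To make item 2 concrete, here is a minimal, hypothetical sketch of that usage-metered API model using OpenAI’s Python client (assumes v1+ of the openai package; the model name and prompt are placeholders):

      ```python
      # Minimal sketch of metered API access: callers pay per token processed.
      from openai import OpenAI

      client = OpenAI()  # reads the OPENAI_API_KEY environment variable

      response = client.chat.completions.create(
          model="gpt-3.5-turbo",  # illustrative model name
          messages=[{"role": "user", "content": "Summarize this support ticket: ..."}],
      )

      print(response.choices[0].message.content)
      print("Tokens billed:", response.usage.total_tokens)  # usage drives the bill
      ```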

  • @[email protected]
    link
    fedilink
    2711 months ago

    I really don’t understand how this is possible when Microsoft has invested billions into OpenAI

    • @atticus88th · 19 points · 11 months ago

      All it takes is a couple of dudes buying a couple of yachts, some private planes, maybe another home or two. And poof, it’s gone.

    • @[email protected]
      link
      fedilink
      411 months ago

      Burning through billions of investors’ money isn’t the same as being profitable. The Silicon Valley gravy train is over, and investors are actually demanding to start seeing returns on their investments.

  • donuts · 27 points · 11 months ago

    AI as a business is already running on fumes, and it’s going to become even more expensive once intellectual property law catches up to them. We can only hope that the AI bubble bursting doesn’t take the entire market economy down with it…

      • @GlitzyArmrest · 8 points · 11 months ago

        Didn’t the tech bubble burst already?

      • donuts · 2 points · 11 months ago

        I mean, I get you, but personally I don’t really like the idea of millions of innocent people losing their homes and most of their savings because some fucking dweebs decided to put all of our collective wealth in legally dubious automatic junk “content” generators. I’ve lived through enough crashes to know that it’s never the big guys that get fucked when everything goes tits up, it’s us, our parents, our grandparents, etc.

        • @[email protected]
          link
          fedilink
          English
          111 months ago

          Yeah status quo is the only reason to not throw caution to the wind and burn the whole thing down. It’s why nothing will ever get better.

    • 👁️👄👁️ · 2 points · 11 months ago

      Well, it doesn’t help that ChatGPT is unoptimized as fuck, with something like 175B parameters for 3.5 and somewhere in the trillions for 4.

  • @[email protected]
    link
    fedilink
    14
    edit-2
    11 months ago

    Sorry to say, I would take this with a grain of salt. Not making a profit is part of the business model of these pioneering companies. Google, Amazon, Uber, etc. were in the negative for many years and absorbed the losses in order to become the dominant brands, to the point where users become dependent on them. At that point they start charging exorbitantly and forcibly adding unneeded features that exert more control over their users, but there’s nothing users can do except pay, for the simple fact that they can’t do without them.

  • @[email protected]
    link
    fedilink
    1311 months ago

    MS will buy it all at the fire sale for cheap and then integrate it deeply into Bing, Windows, etc.

  • roguetrick · 10 points · 11 months ago

    High interest rates, baby. I noted this was happening when people were complaining about lowered quality because they were using less resource-intensive operations.

  • 👁️👄👁️ · 10 points · 11 months ago

    They also didn’t design ChatGPT to be power efficient at all, so that’s bloating up their operating costs a ton.

  • @[email protected]
    link
    fedilink
    English
    811 months ago

    It’s kinda hilarious how many people are swallowing this article without any salt whatsoever.

  • @[email protected]
    link
    fedilink
    English
    711 months ago

    If Twitter ran at a loss for over a decade, so can OpenAI. Worst case scenario, they get completely absorbed by MS and have the bill footed by them, kind of like what happened with YouTube.