OpenAI has nothing. There are no new tech breakthroughs, and it’s still hemorrhaging cash by the billion. Something they could brand “GPT-5” was due mid-2024. How to placate the investors …
Nah, the funding is starting to dry up already. Big players including Microsoft are cancelling plans. Of course the scams keep going for a while, but the peak is in the past.
I don’t think the bubble will pop as long as “Ai” keeps advancing at the pace it is. LLMs, text-to-photo, and text-to-video models are getting exponentially better year over year. I really don’t think the hype train is going to slow down until the rate of progress slows as well, and so far there aren’t any indications that it’s going to.
Wild guess here that I’m sure you and others will disagree with: even when the bubble “pops”, it won’t really be a pop but more of a downturn that doesn’t actually hurt any of the big players significantly.
Only time will tell; it will certainly be interesting to watch as an outsider no matter what.
Edit: As many have pointed out, using the word exponential was incorrect. I still stand by my view that the bubble isn’t going to pop anytime soon, and that these models are getting significantly better year over year. I would argue that text-to-video models have actually had exponential improvements, at least in the past year or two, but for the other categories, you’re right, not so much.
This article is literally about the fact that progress has stagnated…
There are clearly fundamental issues with the approach they have been using for LLMs. There is no new data left; the entire internet has been scraped already. All that’s left is incremental improvements in the way they process the data.
OpenAI is stagnating, and has been for at least a few months now. The AI industry as a whole has only continued to accelerate, especially with the new blood that is DeepSeek coming into play.
In my experience they’ve significantly tailed off over the past year; exponential growth would mean the amount they improve per unit of time keeps increasing over time. What has gotten better is our ability to run the same level of model on cheaper hardware with less power, again just in my limited experience. (Also, an increasing rate of improvement is not the definition of exponential growth, just a property of it. Polynomial growth has the same property.)
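The parenthetical point above, that growing year-over-year increments are a property of polynomial as well as exponential growth, can be sketched numerically. The numbers here are toy values, not real benchmark data; what distinguishes exponential growth is a constant *ratio* between consecutive values, not merely growing differences:

```python
# Toy illustration: both quadratic (polynomial) and exponential curves
# have strictly increasing yearly increments, but only the exponential
# curve has a constant year-over-year growth ratio.

years = range(1, 7)
quadratic = [t ** 2 for t in years]    # polynomial growth: 1, 4, 9, 16, 25, 36
exponential = [2 ** t for t in years]  # exponential growth: 2, 4, 8, 16, 32, 64

quad_inc = [b - a for a, b in zip(quadratic, quadratic[1:])]      # 3, 5, 7, 9, 11
expo_inc = [b - a for a, b in zip(exponential, exponential[1:])]  # 2, 4, 8, 16, 32

# Both sequences of increments are strictly increasing...
assert all(b > a for a, b in zip(quad_inc, quad_inc[1:]))
assert all(b > a for a, b in zip(expo_inc, expo_inc[1:]))

# ...but only the exponential curve keeps a constant growth ratio.
quad_ratios = [b / a for a, b in zip(quadratic, quadratic[1:])]
expo_ratios = [b / a for a, b in zip(exponential, exponential[1:])]
print(quad_ratios)  # shrinks toward 1: 4.0, 2.25, 1.78..., 1.56..., 1.44
print(expo_ratios)  # constant: 2.0, 2.0, 2.0, 2.0, 2.0
```

So "it got better by more than last year" is consistent with polynomial growth; a constant multiplicative improvement rate is what would actually justify the word "exponential".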
Has cloud LLM quality really improved exponentially in the past 12-18 months?
I use a mix of local and cloud LLMs, and to be honest my use cases are relatively simple, so it’s difficult for me to say.
There is also the issue of running out of real training data.
Not necessarily disagreeing with you (just look at how long crypto has lasted, and it has never been able to go beyond degenerate financial speculation and criminal activity).
The bubble is bursting; I hope Nvidia implodes in a never-before-seen spectacular fashion.
You are being overzealous, the bubble ain’t gonna pop for a long while imo
The investment bubble will pop the way it did for the Internet back in 2000. It doesn’t mean the Internet went away or even that it stopped growing.
it is 2025 how are you still saying this shit
The Information’s Dealmaker newsletter is talking today about downrounds in AI venture funding
LLMs, text to photo, and text to video models are getting logarithmically better year over year.
I hear ethereum is going to solve all of bitcoin’s problems
Wait until we get the Strategic Ape Reserve, then you’ll see!
irl flinched