"The economics are likely to be grim. Sky high valuations are largely based on a fantasy.”

  • @Valmond
    1512 days ago

    Oh no.

    Anyways…

  • hendrik
    12 days ago

    To add a bit more background:

    We’ve already had two major AI winters: https://en.wikipedia.org/wiki/AI_winter

    More related articles:

    My opinion: We’re facing a lot of issues: energy demands, finite training data, and the likelihood that the current architecture of AI models will hit a ceiling. We already pump in lots of compute for ever-diminishing returns, and I’m pretty sure that approach won’t scale towards AGI, as outlined in the article.

    But it doesn’t need to keep growing exponentially to be useful. AI is hyped to no end, and it’s a real revenue driver for companies. I’d say the bubble is over-inflated, and some people are bound to get disappointed. In my eyes it’s very likely that it won’t keep growing at the pace of the last two years.

    And ultimately we’d need to come up with some new inventions if we want AGI. As far as I know that’s still utter sci-fi: nobody knows how to revolutionize AI so it suddenly becomes 100x more intelligent, and it’s unlikely that our current approach will get us there. On the other hand, nobody has ruled out that a cleverer approach could do it. I’d lower my expectations. There has been a lot of hype and a lot of unfounded claims; things take their time, and the normal way things go is gradual improvement. But it’s not a “grim” perspective either (as the author put it).

    And I agree there is still “quite a bit of growth” left in the AI market. Especially once we get hardware options beyond just the latest Nvidia graphics cards, that should make things more affordable and more widely adopted.

    […] a trend that we are seeing in that marketplace towards smaller models as the large foundation models are becoming quite expensive to build, train, and iterate on […]

    I think that’s a good thing. Ultimately this is about making things more efficient, and maybe realizing you don’t need the same big model for every task. It surely democratizes things and allows people with consumer-priced hardware to participate in AI.
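    As a rough illustration of that last point (not from the article), here is a minimal sketch of running a small open model on an ordinary consumer machine with the Hugging Face transformers library. The model name, prompt, and settings are only placeholders; any small open model works the same way.

    ```python
    # Minimal sketch: run a small open LLM on consumer hardware.
    # Assumes `pip install torch transformers accelerate`; the model name is
    # just an example of a small (~1B parameter) model, not a recommendation.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # illustrative small model

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.float16,  # half precision fits in a few GB of RAM/VRAM
        device_map="auto",          # uses a GPU if present, otherwise the CPU
    )

    prompt = "Summarize the argument against ever-larger foundation models."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=80)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```

    The specific model doesn’t matter; the point is that a laptop-class machine can run this, which is exactly the kind of participation a shift toward smaller models enables.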