so it’s GANAM now (from GAFAM or GAMAM)

    • @[email protected]
      8 points · 9 months ago

      LLMs are a bubble.

      But the uses of massively parallel math are still in their infancy: scientific compute, machine learning, all kinds of simulations. Nvidia has been setting itself up for all of it with CUDA for years. At least until we get better options to physically replicate neurons (primarily how interconnected they are in a brain), GPUs, and CUDA specifically, are how most AI is going to happen. And as the power increases, the ability to do increasingly complex physics simulations of increasingly complex phenomena is going to become more and more relevant. Right now it’s stuff like protein folding, fluid dynamics, whatever. But there’s way more coming, and all of it is going to use GPUs.
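The "massively parallel math" the comment describes means applying the same arithmetic across huge arrays at once. As a rough sketch (the toy N-body setup is illustrative, not from the thread), NumPy can show the shape of that computation on the CPU; GPU libraries such as CuPy expose a very similar array API backed by CUDA:

```python
import numpy as np

def gravity_step(pos, vel, mass, dt=0.01, eps=1e-3):
    """One softened-gravity update applied to every particle at once.

    Each particle's new state depends only on the same bulk arithmetic
    over the whole array -- the data-parallel pattern GPUs accelerate.
    """
    # Pairwise displacement vectors via broadcasting: shape (n, n, 3).
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    # Softened squared distances; eps keeps the self-term finite.
    dist2 = (diff ** 2).sum(axis=-1) + eps
    inv_d3 = dist2 ** -1.5
    # Acceleration on each particle from all others, summed in one pass.
    acc = (diff * (mass[np.newaxis, :, np.newaxis]
                   * inv_d3[:, :, np.newaxis])).sum(axis=1)
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel

rng = np.random.default_rng(0)
n = 256
pos = rng.standard_normal((n, 3))
vel = np.zeros((n, 3))
mass = np.ones(n)
pos, vel = gravity_step(pos, vel, mass)
```

Since every particle's update is independent of the order the others are computed in, the same code maps naturally onto thousands of GPU threads, which is the point the comment is making about CUDA.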

  • @[email protected]
    7 points · 9 months ago

    Can infinite growth be real? Ask a supposed market expert and a kid and you will get different answers, and only one is correct.

  • Dave
    4 points · 9 months ago

    Is the answer no? It’s no, isn’t it?

    • @Telodzrum
      1 point · 9 months ago

      Yeah, of course it is. It’s not a trend but an outlier; that’s how these things work.

  • @filister
    4 points · 9 months ago

    Dotcom bubble part 2