• @[email protected]
    link
    fedilink
    English
    3
    5 months ago

    We’re in the “computers take up entire rooms in a university to do basic calculations” stage of modern AI development. It will improve, but only if we let it develop.

    • @[email protected]
      link
      fedilink
      English
      2
      edit-2
      5 months ago

      Moore’s law died a long time ago, and AI models aren’t getting any more power efficient from what I can tell.

      • @[email protected]
        link
        fedilink
        English
        3
        5 months ago

        Then you haven’t been paying attention. There have been huge strides in the field of small open language models, which can do inference with low enough power consumption to run locally on a phone.
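
        A rough back-of-envelope sketch of why phone-scale inference is plausible (the 1B parameter count and bit-widths here are illustrative assumptions, not any specific model):

        ```python
        # Weight memory for a small LLM at common quantization levels.
        # Hypothetical 1B-parameter model; numbers are illustrative only.
        params = 1_000_000_000

        for bits in (16, 8, 4):
            gb = params * bits / 8 / 1e9  # bits -> bytes -> GB
            print(f"{bits}-bit weights: {gb:.2f} GB")
        # At 4-bit quantization, ~0.5 GB of weights fits easily in a
        # modern phone's RAM, which is why local mobile inference works.
        ```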

      • @someacnt_
        link
        English
        1
        5 months ago

        Yeah, and improvements will require paradigm changes. I don’t see that from GPT.

          • @someacnt_
            link
            English
            1
            5 months ago

            Are there LLMs with different paradigms?

            • @[email protected]
              link
              fedilink
              English
              2
              5 months ago

              GPT is not a paradigm; it’s a specific model family developed by OpenAI. You’re thinking of the transformer architecture. Check out a project like RWKV if you want to see a different approach.
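
              A toy contrast of the two ideas (purely illustrative; these are not the actual attention or RWKV formulas): a transformer-style step looks back over the whole growing context, while an RWKV-style step folds the past into a fixed-size running state.

              ```python
              # Toy per-token cost contrast, using a simple running average vs.
              # an exponentially decayed state. Illustrative only.
              tokens = [0.5, -1.0, 2.0, 0.25]

              # "Attention-like": each step revisits all previous tokens -> O(n) per token.
              context = []
              for t in tokens:
                  context.append(t)
                  out = sum(context) / len(context)  # cost grows with context length

              # "RWKV-like": one fixed-size state updated per token -> O(1) per token.
              decay = 0.9  # assumed decay constant, chosen arbitrarily
              state = 0.0
              for t in tokens:
                  state = decay * state + (1 - decay) * t
              ```

              The constant-size state is what lets RWKV-style models run inference without the quadratic context cost of standard attention.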