Abacus.ai:

We recently released Smaug-72B-v0.1, which has taken first place on HuggingFace's Open LLM Leaderboard. It is the first open-source model to achieve an average score of more than 80.

  • FenrirIII
    2 points · 1 year ago

    OOTL: What is an LLM and what does it do?

    • @saltesc
      22 points · 1 year ago

      Large Language Model AI. Like ChatGPT.

      • FaceDeer
        0 points · 1 year ago

        And at 72 billion parameters it’s something you can run on a beefy but not special-purpose graphics card.

        • @glimse
          6 points · 1 year ago

          Based on the other comments, it seems like this needs 4x as much RAM as any consumer card has.

          • FaceDeer
            4 points · 1 year ago

            It hasn’t been quantized, then. I’ve run 70B models on my consumer graphics card at a reasonably good tokens-per-second rate.
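
            A minimal sketch of the memory math behind the quantization point above (illustrative only: the 72B parameter count comes from the announcement, and the per-weight sizes are the standard ones for fp16, 8-bit, and 4-bit formats):

            ```python
            # Back-of-the-envelope VRAM needed just to hold the weights of a
            # 72B-parameter model at different precisions. Ignores the KV cache,
            # activations, and runtime overhead, so real requirements are higher.

            PARAMS = 72e9  # Smaug-72B parameter count

            for name, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
                gigabytes = PARAMS * bits / 8 / 1e9
                print(f"{name:>5}: ~{gigabytes:.0f} GB of weights")

            # fp16 : ~144 GB -> far beyond any single consumer card (24 GB is the usual ceiling)
            # 8-bit:  ~72 GB
            # 4-bit:  ~36 GB -> reachable with aggressive quantization plus partial CPU offload
            ```

            This is why the unquantized model is out of reach for consumer cards, and why quantized 70B-class models become runnable locally.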

          • DarkThoughts
            2 points · 1 year ago

            I’m curious how local generation will go with potential dedicated AI extensions that use things like tensor cores and their own memory, instead of hijacking parts of consumer GPUs for this.