• @LesserAbe
    1 • 5 months ago

    That’s cool. Am I reading right that this wouldn’t run on consumer-grade hardware, though?

    • @TechNerdWizard42
      4 • 5 months ago

      I believe you’d need roughly 500GB of RAM at minimum to run it at full context length. There is chatter that a 125k context alone used about 40GB.

      I know I can load the 70B models on my laptop at lower-bit quantizations, but they consume about 140GB of RAM.
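
      For a rough sense of where numbers like that come from, here’s a back-of-envelope estimate of just the KV cache (my own sketch, assuming Llama 3 8B’s published config of 32 layers, 8 KV heads via GQA, and head dim 128, with fp16 values):

      ```python
      def kv_cache_gb(context_len, n_layers=32, n_kv_heads=8, head_dim=128, bytes_per_val=2):
          # 2x for keys and values, stored per layer, per KV head, per token.
          per_token_bytes = 2 * n_layers * n_kv_heads * head_dim * bytes_per_val
          return context_len * per_token_bytes / 1e9

      print(kv_cache_gb(125_000))    # ~16 GB
      print(kv_cache_gb(1_000_000))  # ~131 GB
      ```

      By that math the cache alone is ~16GB at 125k and ~131GB at 1M, so figures like 40GB and 500GB presumably also count the weights (~16GB at fp16), activations, and framework overhead.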

    • @[email protected]
      3 • 5 months ago

      It is llama3-8B, so it’s not out of the question, but I’m not sure how much memory you would really need to go to a 1M context window. They use ring attention to achieve the high context window, which I’m unfamiliar with, but it seems to greatly lower the memory requirements.
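
      For anyone else unfamiliar with it, here’s a minimal single-process toy of the core idea (my own sketch, not the model authors’ implementation; causal masking and multi-head details are omitted): the sequence is split into chunks, each “device” keeps one query chunk plus one KV chunk, and the KV chunks rotate around a ring while each device folds incoming blocks into its output with an online softmax. No device ever materializes the full attention matrix or holds the whole KV cache, which is where the memory savings come from.

      ```python
      import numpy as np

      def ring_attention(q_chunks, k_chunks, v_chunks):
          """Toy ring attention: n 'devices', each holding one chunk."""
          n, d = len(q_chunks), q_chunks[0].shape[-1]
          # Per-device online-softmax accumulators.
          num = [np.zeros_like(q) for q in q_chunks]               # weighted V sums
          den = [np.zeros(q.shape[0]) for q in q_chunks]           # softmax denominators
          mx  = [np.full(q.shape[0], -np.inf) for q in q_chunks]   # running row maxima
          k_ring, v_ring = list(k_chunks), list(v_chunks)
          for _ in range(n):  # after n rotations every device has seen every KV block
              for i in range(n):
                  s = q_chunks[i] @ k_ring[i].T / np.sqrt(d)       # chunk-local scores only
                  m_new = np.maximum(mx[i], s.max(axis=-1))
                  rescale = np.exp(mx[i] - m_new)                  # re-normalize old partials
                  p = np.exp(s - m_new[:, None])
                  num[i] = num[i] * rescale[:, None] + p @ v_ring[i]
                  den[i] = den[i] * rescale + p.sum(axis=-1)
                  mx[i] = m_new
              # Pass KV blocks one step around the ring (a device-to-device send in the real thing).
              k_ring = k_ring[1:] + k_ring[:1]
              v_ring = v_ring[1:] + v_ring[:1]
          return np.concatenate([num[i] / den[i][:, None] for i in range(n)])

      # Sanity check against ordinary full attention.
      rng = np.random.default_rng(0)
      q, k, v = (rng.standard_normal((16, 8)) for _ in range(3))
      out = ring_attention(np.split(q, 4), np.split(k, 4), np.split(v, 4))
      s = q @ k.T / np.sqrt(8)
      w = np.exp(s - s.max(-1, keepdims=True))
      assert np.allclose(out, (w / w.sum(-1, keepdims=True)) @ v)
      ```

      So per device, memory scales with the chunk size rather than the full sequence, which is how a 1M window gets spread across hardware.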