• John Richard
    link
    English
    -4
    24 hours ago

    But will it have enough RAM

    • @RageAgainstTheRich
      link
      English
      10
      24 hours ago

      It says 16 GB of VRAM in the first line of the article. My 8 GB kills me. It’s a beast of a card, but as soon as I go over the VRAM limit, it slows to a crawl.
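
      As a back-of-the-envelope illustration of why it crawls (a Python sketch with ballpark, assumed numbers; the bandwidths and the spilled model size are hypothetical):

      ```python
      # Token generation is roughly memory-bandwidth bound: each token reads
      # the weights about once. Weights that spill out of VRAM must cross PCIe.
      model_gb = 10.0   # quantized weights (hypothetical size)
      vram_gb = 8.0     # card's VRAM
      gpu_bw = 500.0    # GDDR6 bandwidth, GB/s (ballpark)
      pcie_bw = 32.0    # PCIe 4.0 x16 bandwidth, GB/s (ballpark)

      in_vram = min(model_gb, vram_gb)
      spilled = max(model_gb - vram_gb, 0.0)

      sec_per_token = in_vram / gpu_bw + spilled / pcie_bw
      print(f"~{1 / sec_per_token:.1f} tokens/s with {spilled:.0f} GB spilled")
      print(f"~{gpu_bw / model_gb:.1f} tokens/s if it all fit in VRAM")
      ```

      Even a 2 GB spill drops this from ~50 to ~13 tokens/s, because the spilled slice moves at PCIe speed instead of VRAM speed.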

      • John Richard
        link
        English
        -5
        23 hours ago

        Their top-tier 7900 XTX had 24 GB. Most AI models need at least 24 but preferably 32. Guess they don’t need to try when NVIDIA isn’t trying either, despite it not being very expensive to do so.

        • Sickday
          link
          fedilink
          9
          22 hours ago

          Most AI models need at least 24 but preferably 32.

          Where are you getting this information from? Most models under 16B params will run just fine with less than 24 GB of VRAM. This GitHub discussion thread for open-webui (a frontend for Ollama) has a decent reference for VRAM requirements.
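
          For a rough sanity check, here’s a tiny Python sketch of the usual weights-plus-overhead estimate (the constants are ballpark assumptions on my part, not figures from that thread):

          ```python
          # Approximate VRAM for a quantized model: weights plus a flat
          # allowance for KV cache and runtime buffers (both assumed values).
          def vram_gb(params_b: float, bits_per_weight: int = 4,
                      overhead_gb: float = 1.5) -> float:
              weights_gb = params_b * bits_per_weight / 8  # 1B params @ 8-bit = 1 GB
              return weights_gb + overhead_gb

          for p in (7, 13, 16, 70):
              print(f"{p:>3}B params @ 4-bit: ~{vram_gb(p):.1f} GB")
          # 7B ~5.0, 13B ~8.0, 16B ~9.5, 70B ~36.5 GB
          ```

          So sub-16B models quantized to 4-bit sit comfortably under 24 GB; it’s the 70B-class models where 24 GB stops being enough.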

          • John Richard
            link
            English
            -2
            edit-2
            15 hours ago

            I should have been more specific: the home models that actually compete with paid ones in both accuracy and speed. Please don’t be one of those people who exaggerate and pretend it works just as well with much less. It simply doesn’t.

    • @[email protected]
      link
      fedilink
      English
      1
      15 hours ago

      Maybe with CXL or Infinity Fabric that won’t matter as much.

      Don’t know what their support looks like on the consumer side, though.