• Lucy :3

    If you have a decent GPU or CPU, you can just set up ollama with ollama-cuda/ollama-rocm and run llama3.1 or llama3.1-uncensored.
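    A minimal sketch of that setup (package names `ollama-cuda`/`ollama-rocm` are from the comment above and assume an Arch-based distro; the service name and model tag assume a standard ollama install):

    ```shell
    # Install ollama with NVIDIA support (use ollama-rocm instead for AMD GPUs)
    sudo pacman -S ollama-cuda

    # Start the ollama server and keep it running across reboots
    sudo systemctl enable --now ollama

    # Pull the model weights, then open an interactive chat
    ollama pull llama3.1
    ollama run llama3.1
    ```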

    • @[email protected]

      I have a Ryzen 5 laptop; it's not really decent enough for that workload. And I'm not crazy about AI.

      • Lucy :3

        I bet even my Pi Zero W could run such a model*

        * with 1 character per hour or so

        • @[email protected]

          Interesting. Well, it's something to look into, but I'd like a place to communicate with like-minded people.