• @j4k3 · 11 months ago

    Llama-2, because you can run it on your own hardware. For a big GPU on a rented instance: Falcon 70b. OpenAI and Google can take turns playing proprietary asshat jack-in-the-box.

    • @procrastinator · 11 months ago

      How expensive would it be to run on a rented server?

      • ChickenBoo · 11 months ago (edited)

        For the bigger ones you could do it for under $0.50/hr. I run Llama 2 13B at 8-bit on my 3090 with no problem, and a 3090 can be rented for around $0.20/hr (rough sketch of the 8-bit setup below).

        https://vast.ai/#pricing

        Some of the lowest pricing I’ve seen.
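
        Roughly what the 8-bit setup looks like, as a minimal sketch assuming the Hugging Face transformers + bitsandbytes stack (the thread doesn’t say which loader is being used, and the meta-llama repo requires accepting the license first):

        ```python
        # Minimal sketch: Llama 2 13B in 8-bit on a single 24 GB card (e.g. a 3090).
        # Assumes transformers, accelerate and bitsandbytes are installed and that you
        # have been granted access to the meta-llama weights on Hugging Face.
        from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

        model_id = "meta-llama/Llama-2-13b-chat-hf"

        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModelForCausalLM.from_pretrained(
            model_id,
            quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # 8-bit weights, roughly 13-14 GB VRAM
            device_map="auto",  # let accelerate place the layers on the GPU
        )

        prompt = "Explain 8-bit quantization in one sentence."
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        outputs = model.generate(**inputs, max_new_tokens=80)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))
        ```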

        • @[email protected] · 11 months ago

          How do you get into doing this? Like, are you a data scientist? AI Engineer? Hobbyist?

          I want to be able to use tools like this, and I want a formal education in it, but I’m not sure what path to take. (I don’t necessarily want a career in it; my current career is pretty fantastic, but I think understanding tools like this is extremely important to understanding the modern world. That’s my main motivation.)