Just why. This was like the third time it reminded me that I shouldn’t have the phone I have.

I really don’t understand the public desire to make these AIs like this.

  • El Barto
    English
    4
    1 year ago

    Can we run our own version of these LLMs?

    • @[email protected]
      English
      3
      1 year ago

      Check out llama.cpp sometime. It's FOSS and you can run it without too-crazy system requirements. There are different parameter sizes you can use: I think the 13-billion-parameter model only uses 8 GB of RAM, and the 25B one uses 16 GB. It's definitely nowhere near as good as GPT, but it's still fun, and hopefully it will improve in the future.
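
      For anyone who wants to try it, a rough sketch of getting llama.cpp going looks like this. The model path below is a placeholder (you supply your own model file), and the exact binary and flag names have shifted between releases, so check the repo's README for your version:

      ```shell
      # Build llama.cpp from source (requires git and a C/C++ toolchain).
      git clone https://github.com/ggerganov/llama.cpp
      cd llama.cpp
      make

      # Run a prompt against a locally downloaded model.
      # The model filename here is hypothetical -- point -m at whatever
      # quantized model file you actually have on disk.
      ./main -m ./models/llama-13b.q4.gguf -p "Hello, world" -n 128
      ```

      Smaller quantized models are what keep the RAM needs modest; the bigger the parameter count, the more memory you'll want.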

      • El Barto
        English
        3
        1 year ago

        It worked. Thanks!