• @[email protected]
    17 months ago

    You can, but things as good as ChatGPT can’t be run on local hardware yet. My main obstacle is language support other than English.

    • @[email protected]
      27 months ago

      They’re getting pretty close. You only need 10GB VRAM to run Hermes Llama2 13B. That’s within the reach of consumers.
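      The 10GB figure lines up with a quick back-of-the-envelope estimate. A minimal sketch (the 1.2 overhead factor for KV cache and activations is an assumption, not a measured value):

      ```python
      def vram_gb(n_params_b, bits_per_weight, overhead=1.2):
          """Rough VRAM estimate for inference: weight storage plus a
          fudge factor for KV cache and activations (overhead=1.2 is
          an assumed ballpark, not a benchmark)."""
          bytes_weights = n_params_b * 1e9 * bits_per_weight / 8
          return bytes_weights * overhead / 1e9

      # A 13B model at 4-bit quantization (GPTQ/GGML-style):
      print(round(vram_gb(13, 4), 1))   # → 7.8  (fits in 10GB)
      # Unquantized fp16 for comparison:
      print(round(vram_gb(13, 16), 1))  # → 31.2 (out of consumer reach)
      ```

      This is why quantized builds are what make 13B models consumer-viable: the same weights at fp16 need roughly four times the memory.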

      • @[email protected]
        17 months ago

        Nice to see! I’m not following the scene as closely anymore (last time I played around with it was with Wizard Mega 30B). It’s definitely a big improvement, but as much as I hate to do this, I’ll stick with ChatGPT for the time being. It’s just better on more niche questions and does some things plain better (GPT-4 can (mostly) do maths without hallucinating).