• (des)mosthenesOP
    link
    9
    9 months ago

    it works surprisingly well on apple arm architecture; responses are typically near-instant

  • @[email protected]
    link
    fedilink
    7
    edit-2
    9 months ago

    Where did this come from? Did OpenAI finally release the source code? And where does the training data come from? Is the training data public domain/appropriately licensed?

    • (des)mosthenesOP
      link
      1
      9 months ago

      that’s all on the site; openai offers an api for access to their models

    • @[email protected]
      link
      fedilink
      13
      9 months ago

      No, a GPU would be ideal but not everyone has one, especially one with enough VRAM. I have an AMD card with 12gb of VRAM and I can run 7B - 13B models but even the 7B models (which seems to be the lowest that is still good) use a little more than 8gb of VRAM and most people probably have an Nvidia card with 8gb or less. 13B models get very close to using the full 12gb.
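      The VRAM figures in the comment above follow from a back-of-the-envelope rule: memory is roughly parameter count times bytes per weight, plus overhead for activations and the KV cache. A minimal sketch (the 20% overhead factor and the per-weight byte counts are illustrative assumptions, not from the thread; real usage varies by runtime and quantization):

      ```python
      # Rough VRAM estimate for running an LLM locally.
      # Assumption: memory ~= parameters x bytes-per-weight, plus ~20%
      # overhead for activations / KV cache. Actual usage varies.

      def estimate_vram_gb(params_billions: float, bytes_per_weight: float,
                           overhead: float = 0.2) -> float:
          """Approximate VRAM in GB for a model of the given size."""
          weights_gb = params_billions * bytes_per_weight  # 1B params at 1 byte ~ 1 GB
          return round(weights_gb * (1 + overhead), 1)

      # 7B model at ~1 byte/weight (8-bit): roughly the "little more than 8gb" above
      print(estimate_vram_gb(7, 1.0))    # ~8.4 GB
      # 13B model at ~0.75 bytes/weight: close to filling a 12 GB card
      print(estimate_vram_gb(13, 0.75))  # ~11.7 GB
      ```

      This is why more aggressive quantization (fewer bytes per weight) is the usual way to squeeze a 13B model onto a 12 GB card.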

    • @DoYouNot
      link
      10
      9 months ago

      Not everyone has a dedicated GPU, I would guess. GPUs are good at doing tensor calculations, but they’re not the only way.
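      The point above is that tensor math is just multiply-and-add, which any CPU can do; a GPU only does it with far more parallelism. A toy pure-Python matrix multiply as illustration:

      ```python
      # Toy CPU "tensor calculation": matrix multiply with plain loops.
      # A GPU does the same arithmetic, just massively in parallel.

      def matmul(a, b):
          """Multiply two matrices given as lists of rows."""
          rows, inner, cols = len(a), len(b), len(b[0])
          return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
                  for i in range(rows)]

      print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
      ```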

    • Sabata11792
      link
      fedilink
      29 months ago

      It’s better if you have a good GPU, ideally a card from the last few years. It can run on CPU, but it’s much slower.

  • @PlutoniumAcid
    link
    7
    9 months ago

    There’s a long list of models to choose from. How do you pick one? What are the differences, benefits/drawbacks?

    • (des)mosthenesOP
      link
      1
      9 months ago

      it’s gonna take experimentation; there’s a list of all models in the app or on the site and maybe a little googling. you can still use openai too. mistral is solid overall though; good for programming