• @bassomitron
    9 · 10 months ago

    Would that actually be decent? Even 6B models feel way too rudimentary after experiencing 33B+ models and/or ChatGPT. I haven’t tried those really scaled-down and optimized models, though!

    • @acec
      2 · 10 months ago

      Decent enough for a model 50 times smaller than ChatGPT. I use orca_mini_3b.

    • @[email protected]
      2 · 10 months ago

      They’re decent for text-completion purposes, e.g. generating some corpspeak for an email, or generating some Wikipedia-style text. You have to know how to write good prompts; don’t try to treat it like ChatGPT.

      For example, if I want to know about the history of Puerto Rico, I would put:

      “The history of Puerto Rico starts in about 480 BC when”
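The idea above — phrasing the request as the beginning of the answer rather than as a question — can be sketched in a few lines of Python. The helper function and the model path below are illustrative assumptions, not part of the original comment; the commented-out call uses the llama-cpp-python API, which can load small GGUF models like orca_mini_3b locally.

```python
def completion_prompt(topic: str) -> str:
    """Phrase the request as the start of the answer.

    Small base models are text completers, not instruction
    followers: they continue whatever text you give them, so the
    prompt should read like the opening of the desired output.
    """
    return f"The {topic} starts in about"

# Completion-style prompt, as in the comment above:
prompt = completion_prompt("history of Puerto Rico")

# A chat-style prompt like "Tell me about the history of Puerto Rico"
# tends to produce poor results with small base models.

# Hypothetical usage with llama-cpp-python (requires a local model file;
# the path here is a placeholder assumption):
# from llama_cpp import Llama
# llm = Llama(model_path="orca-mini-3b.gguf")
# out = llm(prompt, max_tokens=128)
```

The model then continues the sentence, which is why the trailing fragment (“starts in about … when”) matters: it steers the completion toward the factual narrative you want.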