Would you like to see some plugins that integrate with local/self-hosted AI instead of sending your data to ChatGPT? Or do you not care about privacy there as long as the results are good?

You might be interested in GPT4All (https://gpt4all.io/index.html), which is easy to download as a desktop GUI. Simply download a model (like Nous Hermes, about 7.5 GB) and run it even without a GPU, right on your CPU (albeit somewhat slowly).

It’s amazing what’s already possible with local AI instead of relying on large-scale, expensive, corporate-dependent AIs such as ChatGPT.

  • @[email protected]OP
    2 • 1 year ago

    “…hopeful that we won’t have to rely entirely on commercial GPTs in the future.”

    This is very much the case. People have been calling for a Stable Diffusion equivalent for LLMs — what Stable Diffusion was to DALL·E 2 — and since Llama leaked a few months back, the improvements in open-source LLMs have been impressive. It took only three weeks for open-source Llama adaptations to catch up to Google’s Bard!

    Check out this well-written internal Google memo: https://www.semianalysis.com/p/google-we-have-no-moat-and-neither

    I find these developments impressive and encouraging. I’m looking forward to seeing how far local, private, open-source models will have come just a year from now!

    • @[email protected]
      1 • 1 year ago

      Yeah, I’m quite excited to try out GPT4All or something similar as soon as time allows, integrated as a plugin or not. To me it’s especially impressive that small models are inching up on the big ones just by being trained on better (often GPT-generated) data. Seeing them run on modern and even older consumer hardware is mind-boggling. All of this seemed so far away before ChatGPT, and it still seemed quite far away even with ChatGPT.