Would you like to see some plugins that integrate with local/self-hosted AI instead of sending your data to ChatGPT? Or do you not care about privacy there as long as the results are good?

You might be interested in GPT4All (https://gpt4all.io/index.html), which is easy to install as a desktop GUI. Simply download a model (like Nous Hermes, about 7.5 GB) and run it even without a GPU, right on your CPU (albeit somewhat slowly).

It’s amazing what’s already possible with local AI instead of relying on large-scale, expensive, corporate-dependent AIs such as ChatGPT.

  • @[email protected]
    3 points · 1 year ago

    I am generally interested in giving an LLM more context about my data and hobby project code, but I’d never give someone else such deep access. GPT4All sounds great, and it makes me hopeful that we won’t have to rely entirely on commercial GPTs in the future. It’s to AI what Linux and FreeBSD are to operating systems.

    But that still leaves the question of what to do with it. I see two main purposes:

    • Asking the GPT questions about your material
    • Writing more content for your vault

    Both seem useful at first, but I don’t think they’re really necessary. A good fuzzy search like Obsidian’s, combined with a good vault and note structure, makes the first point pretty irrelevant.

    Also, writing more content is really two things:

    • Text generation/completion
    • Research

    I think a plugin might be a nice, user-friendly UI for the first point, but research is much better done in a chat-like environment. And for that I don’t need an integration, as I probably have a web browser open anyway.

    • @[email protected]OP
      2 points · 1 year ago

      hopeful that we won’t have to rely entirely on commercial GPTs in the future.

      This is very much the case. People have been calling for a Stable Diffusion equivalent of DALL·E 2 for LLMs, and since LLaMA was leaked a few months back, the improvements in open-source LLMs have been impressive. It took just three weeks for the open-source LLaMA adaptations to catch up to Google’s Bard AI!

      Check out this well-written internal Google memo: https://www.semianalysis.com/p/google-we-have-no-moat-and-neither

      I find these developments impressive and encouraging. I’m looking forward to seeing how far local, private, open-source models will have come just a year from now!

      • @[email protected]
        1 point · 1 year ago

        Yeah, I am quite excited to try out GPT4All or something similar as soon as time allows, integrated as a plugin or not. To me it’s especially impressive that small models are inching up on the big ones just by being trained on better, GPT-generated data. Seeing it run on modern and even older consumer hardware is mind-boggling. All of this was so far away before ChatGPT, and still seemed quite far away even with ChatGPT.

    • gelberhut
      2 points · 1 year ago

      For me, a significant part of the value of note-taking is the fact that I write the text of the note myself. That process also helps me remember things better. An AI-generated note is mostly useless to me.

      Extracting data… probably, but it’s hard to imagine anything more impressive than search on steroids.