It’s coming along nicely, I hope I’ll be able to release it in the next few days.

Screenshot:

How It Works:

I am a bot that generates summaries of Lemmy comments and posts.

  • Just mention me in a comment or post, and I will generate a summary for you.
  • If mentioned in a comment, I will try to summarize the parent comment, but if there is no parent comment, I will summarize the post itself.
  • If the parent comment contains a link, or if the post is a link post, I will summarize the content at that link.
  • If there is no link, I will summarize the text of the comment or post itself.

Extra Info in Comments:

Prompt Injection:

Of course it’s really easy (but mostly harmless) to break it using prompt injection:

It will only be available in communities that explicitly allow it. I hope it will be useful, I’m generally very satisfied with the quality of the summaries.

  • @rr7
    link
    English
    51 year ago

    I love it! 👑👑

  • @[email protected]
    link
    fedilink
    English
    31 year ago

    I’m always curious about using GPT like that. Does it cost money to send requests like this into GPT?

    • 𝕊𝕚𝕤𝕪𝕡𝕙𝕖𝕒𝕟OPM
      link
      fedilink
      English
      41 year ago

      It does unfortunately, see here:

      https://openai.com/pricing

      I limited it to 100 summaries / day, which adds up to about $20 (USD) per month if the input is 3000 tokens long and the answer is 1000.

      Using it for personal things (I buildt a personal assistant chatbot for myself) is very cheap. But if you use it in anything public, it can get expensive quickly.

      • @[email protected]
        link
        fedilink
        English
        51 year ago

        Have you considered using a self-hosted instance of GPT4All? It’s not as powerful, but for something like summarizing an article it could be plenty - And importantly, much, much cheaper.

        • 𝕊𝕚𝕤𝕪𝕡𝕙𝕖𝕒𝕟OPM
          link
          fedilink
          English
          11 year ago

          I haven’t yet looked into it, but the screencast on its website looks really promising! I have a lot on my plate right now so I think I’ll release it first with the GPT-3.5 integration, but I’ll definitely try GPT4All later!

          • @[email protected]
            link
            fedilink
            English
            21 year ago

            It’s just an open source LLM that you can run on your own hardware; I haven’t looked into it a ton tbh - but if I saw $20/month for 100 requests/day I’d be immediately looking for a way to run it on my own hardware

            • @[email protected]
              link
              fedilink
              English
              21 year ago

              Its kinda debatable, while yes I wouldnt wanna pay that either, ive been following the local working llms, Gpt4all stroke me as not bad but not all that special or amazing (compared to 2021 there all magic though) the naming seems a very little bit misleading with gpt4 as the world most advanced known model. All the models on tbe huggingface page i send work can work locally but at best there gpt-3 level competitors.

  • @[email protected]
    link
    fedilink
    English
    21 year ago

    This is a great idea! What if there’s a post about, say, a movie review, then it includes a link to the movie’s imdb or letterboxd. Would it summarized the link instead of the review?

    • 𝕊𝕚𝕤𝕪𝕡𝕙𝕖𝕒𝕟OPM
      link
      fedilink
      English
      31 year ago

      It’s a Node.js app because the Lemmy-bot library is for Node.

      I will definitely open source it, but the code is currently in a disgusting state, so I need to clean it up first.