cross-posted from: https://programming.dev/post/177822

It’s coming along nicely; I hope I’ll be able to release it in the next few days.

Screenshot:

How It Works:

I am a bot that generates summaries of Lemmy comments and posts.

  • Just mention me in a comment or post, and I will generate a summary for you.
  • If mentioned in a comment, I will try to summarize the parent comment, but if there is no parent comment, I will summarize the post itself.
  • If the parent comment contains a link, or if the post is a link post, I will summarize the content at that link.
  • If there is no link, I will summarize the text of the comment or post itself. (A rough sketch of this selection logic is shown below.)
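
Here is a rough, illustrative sketch of that selection logic in Python. The `Post` and `Mention` types and the URL regex are simplified stand-ins, not the bot’s actual code:

```python
from dataclasses import dataclass
from typing import Optional, Tuple
import re

# Simplified, hypothetical data model: stand-ins for the real Lemmy objects.
@dataclass
class Post:
    body: str
    url: Optional[str] = None                    # set when the post is a link post

@dataclass
class Mention:
    post: Post
    parent_comment_body: Optional[str] = None    # None if there is no parent comment

URL_RE = re.compile(r"https?://\S+")

def choose_summary_target(mention: Mention) -> Tuple[str, str]:
    """Return ("link", url) or ("text", body) according to the rules above."""
    # 1. If the mention replies to a comment, prefer that parent comment.
    if mention.parent_comment_body is not None:
        body = mention.parent_comment_body
        match = URL_RE.search(body)
        # Summarize the linked content if the parent contains a link, else its text.
        return ("link", match.group(0)) if match else ("text", body)
    # 2. Otherwise fall back to the post: its URL for link posts, its text otherwise.
    if mention.post.url is not None:
        return ("link", mention.post.url)
    return ("text", mention.post.body)
```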

Extra Info in Comments:

Prompt Injection:

Of course it’s really easy (but mostly harmless) to break it using prompt injection:

It will only be available in communities that explicitly allow it. I hope it will be useful; I’m generally very satisfied with the quality of the summaries.

  • Wander@yiffit.net

    I hope it does not get too expensive. You might want to have a look at locally hosted models.

    • 𝕊𝕚𝕤𝕪𝕡𝕙𝕖𝕒𝕟OP

      Unfortunately, the locally hosted models I’ve seen so far are way behind GPT-3.5. I would love to use one (though the compute costs might get pretty expensive), but the only realistic way to implement this currently is via the OpenAI API.

      EDIT: there is also a 100-summaries-per-day limit that I built into it, to keep myself from becoming homeless because of a bot
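
      A minimal sketch of how such a daily cap could look (illustrative only, assuming a single-process bot with an in-memory counter, not the bot’s actual code):

      ```python
      from datetime import date

      DAILY_LIMIT = 100   # matches the limit mentioned above

      class DailyQuota:
          """In-memory daily cap that resets when the date changes."""

          def __init__(self, limit: int = DAILY_LIMIT):
              self.limit = limit
              self.day = date.today()
              self.used = 0

          def try_consume(self) -> bool:
              """Return True if another summary may be generated today."""
              today = date.today()
              if today != self.day:        # new day: reset the counter
                  self.day = today
                  self.used = 0
              if self.used >= self.limit:
                  return False
              self.used += 1
              return True

      # Usage: check the quota before every OpenAI API call.
      quota = DailyQuota()
      if quota.try_consume():
          ...  # call the API and post the summary
      else:
          ...  # skip, or reply that the daily limit has been reached
      ```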

      • Wander@yiffit.net

        By the way, in case it helps, I read that OpenAI does not use content submitted via API for training. Please look it up to verify, but maybe that can ease the concerns of some users.

        Also, have a look at these hosted models; they should be way cheaper than OpenAI. I think this company is related to Stability AI, the people behind Stable Diffusion, and also Open Assistant.

        https://goose.ai/docs/models

        There is also openassistant, but they don’t have an API yet. https://projects.laion.ai/Open-Assistant

        • 𝕊𝕚𝕤𝕪𝕡𝕙𝕖𝕒𝕟OP

          Yes, they have promised explicitly not to use API data for training.

          Thank you, I’ll take a look at these models, I hope I can find something a bit cheaper but still high-quality.

    • rastilin@kbin.social

      I looked into using locally hosted models for some personal projects, but they’re absolutely awful. ChatGPT 3.5, and especially 4, are miles above what local models can do at the moment. It’s not just a different game; the local models are in the corner, not playing at all. Even Falcon Instruct, currently the best-rated model on HuggingFace, isn’t close.

      • Biscuit@kbin.social

        ChatGPT 3.5 is free with a login, but there is no free API access for bots. The API costs about $0.002 per 1k tokens, so something like 600 TL;DRs per dollar.
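
        As a rough sanity check on that math, assuming an average of about 830 tokens per summary job (prompt plus completion; the token count is an assumption):

        ```python
        # Back-of-the-envelope cost check for gpt-3.5-turbo at ~$0.002 per 1k tokens.
        price_per_1k_tokens = 0.002     # dollars
        tokens_per_summary = 830        # assumed average, prompt + completion

        cost_per_summary = tokens_per_summary / 1000 * price_per_1k_tokens
        summaries_per_dollar = 1 / cost_per_summary
        print(f"~{summaries_per_dollar:.0f} summaries per dollar")   # ~602
        ```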