• massi1008@lemmy.world · +10 · 4 days ago

    Privacy & Local Storage

    All your chats and data are stored locally on your device, ensuring complete privacy and control.

    That assumes those providers don’t store your chats (which they certainly do). Also, no mention of locally hosted models?

    If you just want to connect to LLMs, then SillyTavern can also do that, for free.

    • yukaiiOP · +7/−6 · 5 days ago

      Basically you can talk to different LLM models through their APIs, but all your conversations are saved locally on your computer, so you don’t have to pay a subscription fee and you have all your chats in one place!
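      The model the OP describes — send each request to a provider’s API with your own key, keep the transcript in a local file — can be sketched roughly like this. The file name, model id, and placeholder key are illustrative assumptions, not the app’s actual implementation:

      ```python
      import json
      import urllib.request
      from pathlib import Path

      # Hypothetical local store; the app's real on-disk format is unknown.
      CHAT_FILE = Path("chats.json")

      def save_message(role: str, content: str) -> list:
          """Append one message to the local chat log and return the full history."""
          history = json.loads(CHAT_FILE.read_text()) if CHAT_FILE.exists() else []
          history.append({"role": role, "content": content})
          CHAT_FILE.write_text(json.dumps(history, indent=2))
          return history

      def build_request(history: list, api_key: str) -> urllib.request.Request:
          """Build (but do not send) an OpenAI-style chat-completion request."""
          payload = {"model": "gpt-4o-mini", "messages": history}  # example model id
          return urllib.request.Request(
              "https://api.openai.com/v1/chat/completions",
              data=json.dumps(payload).encode(),
              headers={"Authorization": f"Bearer {api_key}",
                       "Content-Type": "application/json"},
          )

      history = save_message("user", "Hello!")
      req = build_request(history, api_key="sk-placeholder")  # sending it is the pay-as-you-go part
      ```

      Nothing recurring is billed here: the subscription is replaced by per-request provider charges, and the history never leaves `chats.json` unless a request is actually sent.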

    • yukaiiOP · +4/−11 · edited · 5 days ago

      No, the application isn’t open source, primarily because I plan to actively support and maintain it for many years to come. But it’s a one-time payment so you don’t have to pay anything after that!

        • yukaiiOP · +5/−8 · 5 days ago

          How would I go about doing that though? If I give the users an open source license on purchase, compilation and reselling will be pretty easy to do and I don’t want that to happen for something I worked very hard on.

  • truxnell@aussie.zone · +6 · 5 days ago

    Looks a lot like https://github.com/open-webui/open-webui

    I’m using it with OpenRouter; it helps me claw back a little privacy.

    • yukaiiOP · +4/−4 · 5 days ago

      Currently only Windows is supported, but because it’s Flutter under the hood, I will be expanding to other platforms shortly!

  • enemenemu@lemm.ee · +5/−1 · 5 days ago

    I love the GTK UI. I wish it were a frontend for a local AI, but I’m not complaining :) I love it still. Kudos!

    • yukaiiOP · +4/−3 · 5 days ago

      Thank you! It could actually be a frontend for local AI. Currently only OpenAI, Anthropic, and Google are supported, but I will be adding support for LM Studio, Ollama, etc. in the future as well!
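      For context on why that addition is plausible: Ollama serves a local HTTP API (by default on port 11434), so a client like this one could talk to a local model the same way it talks to a hosted provider. A minimal sketch, assuming a default Ollama install; the model name is an example, and the call degrades gracefully when no server is running:

      ```python
      import json
      import urllib.request

      def local_chat(messages, model="llama3", host="http://localhost:11434"):
          """POST a chat request to a locally running Ollama server.

          Returns the assistant's reply, or None if no server is listening.
          """
          payload = {"model": model, "messages": messages, "stream": False}
          req = urllib.request.Request(
              f"{host}/api/chat",
              data=json.dumps(payload).encode(),
              headers={"Content-Type": "application/json"},
          )
          try:
              with urllib.request.urlopen(req, timeout=5) as resp:
                  return json.load(resp)["message"]["content"]
          except OSError:
              return None  # Ollama isn't running locally

      reply = local_chat([{"role": "user", "content": "Hi"}])
      ```

      Because requests stay on localhost, this path would address the earlier privacy objection: neither the prompts nor the chat history ever reach a third-party provider.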

  • deadcatbounce@reddthat.com · +5/−4 · 5 days ago

    I applaud this program, exactly what we’ve all been looking for.

    Looks like many of the comments identifying it have been removed.

  • yukaiiOP · +6/−9 · 5 days ago

    The goal was to build a “native” app that has a nice UI and can connect to all the different LLM providers. So basically you don’t have to pay a subscription fee and only pay as you go to the different providers.

    The difference from existing apps like Typing Mind is that it mimics the UI of ChatGPT as closely as possible, so you don’t feel like you’re using something inferior. What do you guys think?