Hello, I am currently using codename goose as an AI client to proofread and help me with coding. I have it set up to use Google's Gemini; however, I find myself quickly running out of tokens with large files. I was wondering if there is any easy way to self-host an AI with similar capabilities that still has access to read and write my files. I've tried both ollama and Jan, but neither has access to my files. Any recommendations?
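
One route (a sketch, not a verified recipe): goose supports local providers, and Ollama serves an API on `localhost:11434` by default, so you can keep goose as the client (with its file read/write tools) and only swap the backend. The exact environment variable names and model tag below are assumptions — check `goose configure` on your version for the real keys:

```shell
# Pull a coding-oriented model and start the local server
# (Ollama listens on http://localhost:11434 by default).
ollama pull qwen2.5-coder:7b
ollama serve

# Point goose at the local Ollama endpoint instead of Gemini.
# Variable names here are illustrative assumptions; verify them
# against your goose version's provider configuration.
export GOOSE_PROVIDER=ollama
export GOOSE_MODEL=qwen2.5-coder:7b
export OLLAMA_HOST=http://localhost:11434
goose session
```

The point is that file access lives in the client (goose), not the model, so Ollama and Jan not touching your files is expected — they are just inference servers.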

        • youreusingitwrongOP
          2 days ago

Just a 1080, though it handles 7B models just fine; it could probably also work with a 14B.

          • webghost0101@sopuli.xyz
            1 day ago

In all honesty, I doubt a 7B model will give you very coherent/useful results. 14B won't either.

I can run DeepSeek 30B on a 4070 Ti Super and I am very not impressed. I can run more, but it's too slow; 14B is the optimal speed/size balance.

I am used to Claude Opus (Pro), though, which is one of the best.

You are 100% allowed to prove me wrong. In fact, I hope you do and build something small and brilliant, but I personally recommend adjusting expectations and upgrading that card.