Wondering which services to test, either on a 16 GB RAM “AI-capable” ARM64 board or on a laptop with a modern RTX GPU. Only looking for open-source options, but curious to hear what people say. Cheers!

    • state_electrician@discuss.tchncs.de · 2 days ago

      Yeah. I have a mini PC with an AMD GPU, so even if I were to buy a big GPU I couldn’t use it. That frustrates me, because I’d love to play around with some models locally. I refuse to use anything hosted by other people.

      • moomoomoo309 · 2 days ago

        Your M.2 slot can probably take an M.2-to-PCIe adapter, and you can use a GPU with that. Ollama supports AMD GPUs just fine nowadays (well, as well as it can; ROCm is still very hit or miss).

          • moomoomoo309 · 15 hours ago

            Check ROCm’s supported-cards list. Oh, and after you install ROCm, restart your computer; I made that mistake when I was doing it and couldn’t figure out why it wasn’t working.
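
            If your card isn’t on the official list, one workaround people use is spoofing a nearby gfx target with the `HSA_OVERRIDE_GFX_VERSION` environment variable. A minimal sketch for a systemd-managed ollama install; the drop-in path is an example, and the `10.3.0` value is an example for RDNA2-class cards, so adjust for your hardware:

            ```ini
            # /etc/systemd/system/ollama.service.d/override.conf (example path)
            [Service]
            Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
            ```

            Then reload systemd and restart ollama, and check its logs to see whether ROCm actually picked up the GPU. No guarantees; this is very card-dependent.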

    • kiol@lemmy.world (OP) · 3 days ago

      Well, let me know your suggestions if you wish. I took the plunge and am willing to test on your behalf, assuming I can.