Wondering about services to test on either a 16 GB RAM “AI capable” ARM64 board or on a laptop with a modern RTX GPU. Only looking for open-source options, but curious to hear what people say. Cheers!

  • moomoomoo309 · 2 days ago

    Your M.2 port can probably fit an M.2-to-PCIe adapter, and you can run a GPU off that. Ollama supports AMD GPUs just fine nowadays (well, as well as it can; ROCm is still very hit or miss).
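    Once the card is in, here's a minimal Python sketch for sanity-checking that ollama is up and answering, against its default local API (this assumes ollama is running on the stock port 11434; the model name is just a placeholder for whatever you've actually pulled):

    ```python
    import json
    import urllib.request

    # Ollama's default local endpoint; adjust if you've set OLLAMA_HOST.
    URL = "http://localhost:11434/api/generate"

    # "llama3" is a placeholder -- substitute a model you've pulled.
    payload = {"model": "llama3", "prompt": "Say hello in five words.", "stream": False}

    req = urllib.request.Request(
        URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])
    ```

    While that's running, `ollama ps` will show whether the model actually loaded onto the GPU or silently fell back to CPU.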

      • moomoomoo309 · 15 hours ago

        Check ROCm’s list of supported cards. Oh, and after you install ROCm, restart your computer. I made that mistake when I was setting it up and couldn’t figure out why it wasn’t working.
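        A quick way to see what ROCm actually detects after that reboot is to pull the gfx targets out of rocminfo and compare them against AMD's support matrix. A minimal Python sketch (assumes rocminfo is on your PATH, which a normal ROCm install should handle):

        ```python
        import shutil
        import subprocess

        # rocminfo ships with ROCm; if it's missing, the install probably didn't finish.
        if shutil.which("rocminfo") is None:
            raise SystemExit("rocminfo not found -- is ROCm installed and on PATH?")

        out = subprocess.run(["rocminfo"], capture_output=True, text=True, check=True).stdout

        # GPU agents report their architecture as a "gfx" target, e.g. gfx1030.
        targets = sorted({tok for line in out.splitlines()
                          for tok in line.split() if tok.startswith("gfx")})

        if targets:
            print("Detected gfx targets:", ", ".join(targets))
        else:
            print("No gfx targets found -- ROCm isn't seeing the GPU (did you reboot?)")
        ```

        If your card is close to, but not on, the supported list, the HSA_OVERRIDE_GFX_VERSION environment variable sometimes gets ollama going anyway, but that's firmly in “hit or miss” territory.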