• Scoopta · 3 months ago

    Ollama is also a cool way of running multiple models locally
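
    For anyone curious, a minimal sketch of what that looks like in practice: asking two locally pulled models the same question through Ollama's local REST API from Python. This assumes Ollama is serving on its default port 11434, and llama3/mistral are just example model names, not a recommendation:

        import requests

        # Query each locally pulled model in turn via Ollama's HTTP API
        # (default endpoint is http://localhost:11434).
        for model in ["llama3", "mistral"]:  # example model names
            resp = requests.post(
                "http://localhost:11434/api/generate",
                json={"model": model, "prompt": "Why is the sky blue?", "stream": False},
            )
            print(model, "->", resp.json()["response"])

    Switching models is just a matter of changing the "model" field; Ollama loads and unloads them for you.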

    • Retro_unlimited@lemmy.world · 3 months ago

      That might be the other one I run; I forget, because it's on my server in a virtual machine (RTX 3080 passthrough), but I haven't used it in a long time.