I was curious, do you run Stable Diffusion locally? On someone else’s server? What kind of computer do you need to run SD locally?

  • TheForvalaka
    5 · 1 year ago

    I run it locally. I prefer having the most control I can over the install, what extensions I want to use, etc.

    The most important thing for running it, in my opinion, is VRAM. The more the better; get as much as you can.

    • @[email protected]
      2 · 1 year ago

      I run locally too. I have a 10GB 3080.

      I haven’t had VRAM issues; could you elaborate on your statement?

      I know that with local llama I’ve been limited to 13B models.

      • TheForvalaka
        2 · 1 year ago

        Stable Diffusion loves VRAM. The larger and more complex the images you’re trying to produce, the more it’ll eat.

        My line of thinking is that if you have a slower GPU it’ll generate slower, sure, but if you run out of VRAM it’ll straight up fail and shout at you.

        I’m not an expert in this field though, so grain of salt, YMMV, all that.

      • @[email protected]
        1 · 1 year ago

        I know that with local llama I’ve been limited to 13B models.

        You can run llama.cpp on the CPU at reasonable speeds, making full use of normal system RAM to run much larger models.
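The arithmetic behind this is straightforward. A rough sketch (illustrative numbers only: real GGML/GGUF files add per-block quantization scales and some runtime overhead, so actual sizes are a bit larger) of how quantization shrinks weight storage, and why a 4-bit 13B model squeezes under 10GB while larger models still fit comfortably in ordinary system RAM:

```python
# Back-of-the-envelope memory estimate for LLM weights at different
# quantization levels. These are idealized numbers: real quantized
# model files (GGML/GGUF) carry extra per-block metadata, and inference
# also needs memory for the KV cache and activations.
def weight_memory_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight storage in GiB for a model of the given size."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

for params in (13, 33, 65):
    for bits in (16, 8, 4):
        gib = weight_memory_gb(params, bits)
        print(f"{params}B @ {bits:>2}-bit: ~{gib:5.1f} GiB")
```

By this estimate a 13B model at 16-bit needs roughly 24 GiB (hopeless on a 10GB card), but at 4-bit only about 6 GiB, and a 64GB desktop can hold even a 65B model at 4-bit entirely in system RAM for CPU inference.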

        As for 10GB in SD: I run out of VRAM fairly often when overdoing it. For example, 1024x768 with multiple ControlNets and some other extras is pretty much guaranteed to overflow it, so I have to reduce the resolution when using ControlNet. Dreambooth training didn’t work at all for me due to lack of VRAM (there might be workarounds, but at least the defaults weren’t usable).

        10GB is still very much usable with SD, but one has to be aware of the limitations. The new SDXL will also increase the VRAM requirements a good bit.
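For a sense of why resolution hits VRAM so hard, here is a rough scaling sketch. It assumes only two widely known facts about Stable Diffusion: the model works in a latent space downscaled 8x from the image, and the U-Net's self-attention builds attention maps whose size grows with the square of the number of latent positions (exact memory use depends on the implementation, attention optimizations, and how many ControlNets are stacked on top):

```python
# Rough sketch of how memory pressure scales with resolution in
# Stable Diffusion. Assumptions: 8x latent downscaling, and
# self-attention cost quadratic in the number of latent positions.
def latent_positions(width: int, height: int, downscale: int = 8) -> int:
    """Number of spatial positions in the latent for a given image size."""
    return (width // downscale) * (height // downscale)

base = latent_positions(512, 512)   # 64 * 64 = 4096 positions
big = latent_positions(1024, 768)   # 128 * 96 = 12288 positions

print(f"Activations (linear in positions):    {big / base:.1f}x")
print(f"Attention maps (quadratic):           {(big / base) ** 2:.1f}x")
```

So going from 512x512 to 1024x768 roughly triples activation memory and, for naive attention, multiplies attention-map memory by about nine; each ControlNet then adds its own copy of encoder activations on top, which is consistent with that combination overflowing a 10GB card.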