What are the hardware requirements to run SDXL?

In particular, how much VRAM is required?

This is assuming A1111 and not using --lowvram or --medvram.
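Since the question is specifically about VRAM, here is a minimal sketch for checking what your own card reports, assuming PyTorch with CUDA is installed. This is just an illustration and not part of A1111 or any SDXL tooling.

```python
import torch

if torch.cuda.is_available():
    # Query the first GPU's properties and report its total memory
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    print(f"{props.name}: {total_gb:.1f} GB total VRAM")
else:
    print("No CUDA-capable GPU detected.")
```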

  • Altima NEO
    1 year ago

    I can run it on my 3080 10 GB card, but it's ridiculously slow. I HAVE to use --medvram or I get out-of-memory errors and NaN errors. And I mean ridiculously slow: loading the model takes a few minutes, and generating an image requires me to minimize the browser window or Stable Diffusion just stalls. Switching to the refiner isn't even an option because it takes so long to switch between models.

    This is on a 5930K, 32 GB RAM, and a 3080 10 GB, trying to generate 1024x1024 images.

    However, with ComfyUI it runs just fine: the PC doesn't struggle, and it generates images in about 40 seconds at 50 steps on the base model plus 10 on the refiner.
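    For reference, here is a rough sketch of the same base-plus-refiner flow using the Hugging Face diffusers library. This is not what the poster used (they were on A1111 and ComfyUI); the model IDs, the enable_model_cpu_offload() call as a VRAM-saving measure, and the 0.8 handoff point are illustrative assumptions, not the poster's exact settings.

```python
import torch
from diffusers import DiffusionPipeline

# Load the SDXL base model in fp16 (assumed model ID on the Hugging Face Hub)
base = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
)
# Offload idle submodules to CPU so the pipeline can fit in ~10 GB of VRAM
# (similar in spirit to A1111's --medvram)
base.enable_model_cpu_offload()

# Load the refiner, sharing the second text encoder and VAE with the base model
refiner = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-refiner-1.0",
    text_encoder_2=base.text_encoder_2,
    vae=base.vae,
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
)
refiner.enable_model_cpu_offload()

prompt = "a photo of an astronaut riding a horse"  # placeholder prompt

# The base model handles roughly the first 80% of the denoising steps
# and hands off latents instead of a decoded image
latents = base(
    prompt=prompt,
    num_inference_steps=50,
    denoising_end=0.8,
    output_type="latent",
).images

# The refiner finishes the remaining steps and decodes a 1024x1024 image
image = refiner(
    prompt=prompt,
    num_inference_steps=50,
    denoising_start=0.8,
    image=latents,
).images[0]

image.save("sdxl_1024.png")
```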