Ziggurat to AI Generated [email protected] • 1 day ago
Ladies and gentleman, this is your catpain (fedia.io)
@[email protected] • 8 hours ago
You need a lot of VRAM and a large visual model for higher complexity. Lower VRAM means you run models that only do one thing consistently/well. See: FLUX

@bradd • 7 hours ago
I have 2x 24 GB 3090s, but IIRC ComfyUI doesn't support multiple GPUs. Does that seem too low?
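A quick way to sanity-check the hardware question above: a minimal PyTorch sketch (ComfyUI already depends on PyTorch) that lists each GPU and its total VRAM. The loop and output format are illustrative, not anything built into ComfyUI; the point is that per-card VRAM is what a single loaded model sees, not the combined total of both cards.

```python
# Minimal sketch, assuming PyTorch is installed (ComfyUI requires it anyway).
# Prints each CUDA GPU and its total VRAM so you can compare a model's memory
# needs against a single card, since one workflow loads onto one device.
import torch

if not torch.cuda.is_available():
    print("No CUDA GPUs detected.")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB VRAM")
```

To run ComfyUI on just one of the two 3090s, the standard CUDA environment variable works for any application, e.g. something like `CUDA_VISIBLE_DEVICES=0 python main.py` from the ComfyUI directory; whether ComfyUI can split a single model across both cards is a separate question.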