• @[email protected]
    18 hours ago

    You need a lot of VRAM and a large visual model for higher complexity.

    Lower VRAM means you run models that only do one thing consistently/well.

    See: FLUX

    • @bradd
      17 hours ago

      I have 2x 24 GB 3090s, but IIRC ComfyUI doesn’t support multiple GPUs. Does that seem too low?
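
Side note for anyone checking their own setup: ComfyUI runs on PyTorch, so you can quickly list how much VRAM each visible card exposes. This is a minimal sketch, not anything from ComfyUI itself; `list_vram_gb` is a made-up helper name.

```python
try:
    import torch
    HAVE_TORCH = True
except ImportError:
    HAVE_TORCH = False

def list_vram_gb():
    # Hypothetical helper: returns (device name, total VRAM in GiB)
    # for each visible CUDA GPU, or an empty list when PyTorch or
    # CUDA is unavailable.
    if not HAVE_TORCH or not torch.cuda.is_available():
        return []
    return [
        (torch.cuda.get_device_name(i),
         torch.cuda.get_device_properties(i).total_memory / 1024**3)
        for i in range(torch.cuda.device_count())
    ]

for name, gb in list_vram_gb():
    print(f"{name}: {gb:.1f} GiB")
```

Note that this only reports per-card totals; it doesn’t make two 24 GB cards behave like one 48 GB pool, which is the multi-GPU limitation being discussed.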