• @imaging2162
    22 months ago

    How was this done? It’s pretty good!

    • @j4k3 (OP)
      22 months ago

      It is just a simple prompt with Flux in ComfyUI; it is an open model running on my own hardware. It is slow because the model is so large (Flux Dev GGUF Q4). There are example workflows in the ComfyUI documentation, and the model manager add-on for the base ComfyUI setup has all the required models in its download menu.
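
      If you want to script it instead of clicking through the GUI, ComfyUI also exposes a small HTTP API: you export a workflow with "Save (API Format)" and POST it to the local server. A minimal sketch in Python is below; the filename, the prompt text, and the node id "6" are placeholders for illustration, not my actual workflow.

      ```python
      # Queue an exported Flux workflow against a local ComfyUI server.
      # Assumes ComfyUI is running on its default port (8188) and that
      # "flux_workflow_api.json" was exported via "Save (API Format)".
      import json
      import urllib.request

      COMFY_URL = "http://127.0.0.1:8188"

      with open("flux_workflow_api.json") as f:
          workflow = json.load(f)

      # Overwrite the positive prompt of a CLIPTextEncode node; the id "6"
      # is a placeholder -- check your own exported JSON for the right id.
      workflow["6"]["inputs"]["text"] = "a watercolor lighthouse at dawn"

      req = urllib.request.Request(
          f"{COMFY_URL}/prompt",
          data=json.dumps({"prompt": workflow}).encode("utf-8"),
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          print(json.load(resp))  # returns a prompt_id you can poll via /history
      ```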

      At present it only runs on the GPU, and with 16 GB of VRAM it takes around 2+ minutes per image. It would be awesome if the workload could be split with the CPU to generate faster, but that is not implemented yet, so you basically need 16+ GB of VRAM to run it on your own hardware.

      There is a smaller model version (Flux Schnell), but its results are not comparable in quality, and there is a larger model (Flux Pro) that is online only. Flux is made by Black Forest Labs; it is the model X-AI's Grok uses for image generation, which is why it gets associated with Musk. The weights for Flux Dev are openly available, and that is what I care about for now.
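
      For tighter VRAM budgets there is at least one way to trade speed for memory outside of ComfyUI: the Hugging Face diffusers library can run the same Flux Dev weights and offload parts of the pipeline to the CPU when they are not in use. This is not my ComfyUI setup, just a sketch of the alternative; it assumes you have accepted the FLUX.1-dev license on Hugging Face, and the prompt, resolution, and step count are arbitrary.

      ```python
      import torch
      from diffusers import FluxPipeline

      # Load the Flux Dev weights in bfloat16. Still heavy, but the offload call
      # below keeps only the active component on the GPU, lowering peak VRAM use
      # at the cost of generation speed.
      pipe = FluxPipeline.from_pretrained(
          "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
      )
      pipe.enable_model_cpu_offload()
      # If that still runs out of memory, offload at a finer granularity
      # (much slower): pipe.enable_sequential_cpu_offload()

      image = pipe(
          "a watercolor lighthouse at dawn",
          height=1024,
          width=1024,
          guidance_scale=3.5,
          num_inference_steps=28,
          max_sequence_length=512,
      ).images[0]
      image.save("flux_dev_out.png")
      ```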