• @Zarxrax
    22 • 5 months ago

    Well, hopefully this will at least force stability’s hand in some way and get them to at least make an official statement instead of just remaining silent.

    • @brucethemoose
      9 • 5 months ago

      TBH they don’t want anything to do with CivitAI, so maybe this is their way of sabotaging SD3 for it?

        • @j4k3
          4 • 5 months ago

          It has a lot of potential if the T5 can be made conversational. After diving into a custom DPM adaptive sampler, I've found there is a lot more specificity required. I believe the vast majority of people are not using the model with the correct workflow. Applying the old model workflows to SD3 produces garbage results. The two CLIP models and the T5 need separate prompts, and the negative prompt needs an inverted channel with a slight delay before reintegration. I also think the smaller quantized version of the T5 is likely the primary problem overall. Any transformer text model that small, and then quantized that aggressively, is going to be problematic.
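To illustrate the "separate prompts" point: SD3 conditions on three text encoders (CLIP-L, CLIP-G, T5-XXL), and their outputs are combined rather than interchangeable, so feeding all three the same SD1.5-style tag soup wastes them. A toy numpy sketch of the conditioning layout (the dimensions match the SD3 report: 768, 1280, 4096; the encoders here are random stand-ins, not real models):

```python
import numpy as np

def fake_encode(prompt, seq_len, dim):
    # Stand-in for a real text encoder: prompt -> (seq_len, dim) features.
    seed = abs(hash(prompt)) % (2**32)
    return np.random.default_rng(seed).normal(size=(seq_len, dim))

# Each encoder gets its own prompt style: CLIP likes tags, T5 likes prose.
clip_l = fake_encode("photo, 35mm, natural light", 77, 768)
clip_g = fake_encode("a woman lying on grass", 77, 1280)
t5     = fake_encode("A woman lies on her back in a sunny meadow.", 256, 4096)

# SD3 concatenates the two CLIP token embeddings channel-wise, zero-pads
# the result to the T5 width, then stacks along the sequence axis.
clip_cat = np.concatenate([clip_l, clip_g], axis=-1)    # (77, 2048)
clip_pad = np.pad(clip_cat, ((0, 0), (0, 4096 - 2048))) # (77, 4096)
context  = np.concatenate([clip_pad, t5], axis=0)       # (333, 4096)
print(context.shape)
```

The shapes are the point: the T5 contributes far more context tokens at far higher width, which is why crippling it with a heavily quantized checkpoint hurts so much.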

          The license is garbage. The company is toxic. But the tool is more complex than most of the community seems to understand. I can generate a woman lying on grass in many intentional and iterative ways.

          • @brucethemoose
            2 • 5 months ago

            Yeah, and it’s just fp8 truncation, right? Not actual “smart” quantization? That’s a big hit even for huge decoder-only LLMs.
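The difference matters: a plain fp8 cast just discards mantissa bits, giving every weight up to ~12% relative error, while "smart" schemes (per-block scales, outlier handling) adapt to the data. A toy numpy sketch of mantissa truncation (a hypothetical helper for illustration, not the actual casting code used for the T5 checkpoint):

```python
import numpy as np

def truncate_mantissa(x, mantissa_bits):
    """Drop low-order mantissa bits of float32 values (toy fp8-style cast).

    Real fp8 formats (e4m3/e5m2) also narrow the exponent range and round
    to nearest; this sketch only truncates the mantissa, which is where
    most of the precision loss comes from."""
    x = np.asarray(x, dtype=np.float32)
    drop = 23 - mantissa_bits                       # float32 mantissa = 23 bits
    mask = np.uint32((0xFFFFFFFF >> drop) << drop)  # keep only the top bits
    return (x.view(np.uint32) & mask).view(np.float32)

# e4m3 keeps 3 mantissa bits: relative error on each weight can reach 2**-3.
w = np.random.default_rng(0).normal(size=4096).astype(np.float32)
w8 = truncate_mantissa(w, 3)
rel_err = np.abs(w8 - w) / np.abs(w)
print(f"max relative error: {rel_err.max():.4f}")  # strictly below 1/8
```

With only 3 mantissa bits the error is uniform in relative terms, hitting small and large weights alike, which is exactly where calibrated quantization would instead spend its bits.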