• @Valmond
    22 months ago

    No problem, but I mean if you’re just tinkering around you could get by with even less memory, as long as the model stays in it and you sample small pieces in small batches.
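The "small pieces in small batches" idea can be sketched with gradient accumulation: run several small micro-batches, sum their gradients, and step once, so activation memory stays low while the effective batch size is unchanged. This is a minimal PyTorch sketch with a toy model; all sizes and names are illustrative, not from the thread.

```python
import torch
from torch import nn

# Toy stand-in for a model that must stay resident in VRAM;
# sizes here are purely illustrative.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 1))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

x = torch.randn(256, 64)  # toy data
y = torch.randn(256, 1)

micro_batch = 8   # small batches keep activation memory low
accum_steps = 4   # accumulate grads to emulate an effective batch of 32

w_before = model[0].weight.detach().clone()
opt.zero_grad()
for step in range(accum_steps):
    xb = x[step * micro_batch:(step + 1) * micro_batch]
    yb = y[step * micro_batch:(step + 1) * micro_batch]
    # Scale the loss so the accumulated gradient matches one big batch.
    loss = loss_fn(model(xb), yb) / accum_steps
    loss.backward()  # gradients add up across micro-batches
opt.step()           # one optimizer step for the whole effective batch
```

Peak activation memory scales with the micro-batch size, not the effective batch size, which is why this trick lets training squeeze into a smaller card.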

    We all had P-series GPUs and we had to upgrade because the model we were training didn’t fit in 16GB (they probably had too much money), so I don’t remember which card the 24GB one was.

    • @SomeGuy69
      2 months ago

      For just tinkering around, one could use SD1.5 with a 4GB VRAM GPU and stop after a few minutes. I’ve spent quite some time on AI image generation, on average 4 hours per day for over a year now. New models, especially video generation models, will need more VRAM, but since I don’t do this commercially, I can’t just pay 30k for a GPU.
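A quick back-of-envelope check of why SD1.5 fits on a 4GB card: the published parameter counts of its components (UNet ~860M, CLIP text encoder ~123M, VAE ~83M) at fp16 come to roughly 2 GB of weights, leaving headroom for activations. The exact counts below are the commonly reported sizes, rounded.

```python
# Back-of-envelope VRAM estimate for SD1.5 weights at fp16 inference.
BYTES_PER_PARAM_FP16 = 2

components = {
    "unet": 860_000_000,          # ~860M params
    "text_encoder": 123_000_000,  # CLIP ViT-L/14 text model, ~123M
    "vae": 83_000_000,            # ~83M
}

weights_gb = sum(components.values()) * BYTES_PER_PARAM_FP16 / 1024**3
print(f"fp16 weights: ~{weights_gb:.1f} GB")  # roughly 2 GB of weights
```

Activations and intermediate latents add to this, which is why 4GB is workable for SD1.5 but newer (and especially video) models push past it.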