First Post (but have loved playing along)

  • @ForgottenUsernameOP

    Consider the average toaster: roughly 1100W, with toast taking 1-4 minutes to cook (for the purposes of this we'll call it 2 minutes).
    Plugging that into kWh = (watts × hours) ÷ 1000, toasting 1 slice of bread works out to roughly 0.037 kWh of electricity.

    Now, I’m running a 7900 XTX (OC) whose peak power draw is 800W (300W less than that toaster), and it legit takes 5-10 seconds to generate an image. Realistically I might do a couple of runs (some small, then one big one) and use about 30 seconds of peak compute time. That works out to roughly 0.0067 kWh of electricity.
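    If you want to check the arithmetic yourself, here's a quick back-of-the-envelope sketch in Python, using only the assumed figures above (1100W toaster for 2 minutes, 800W peak GPU draw for ~30 seconds):

    ```python
    # Energy in kWh = (watts × hours) ÷ 1000
    def kwh(watts, seconds):
        return watts * (seconds / 3600) / 1000

    toaster = kwh(1100, 2 * 60)  # one 2-minute toasting run
    gpu = kwh(800, 30)           # ~30 s of peak draw while generating images

    print(f"Toaster: {toaster:.4f} kWh")      # ~0.0367 kWh
    print(f"GPU:     {gpu:.4f} kWh")          # ~0.0067 kWh
    print(f"Ratio:   {toaster / gpu:.1f}x")   # roughly 5.5x more for the toast
    ```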

    Toasting bread quite literally draws way more electricity than it takes for me to generate one AI image.

    [image attachment]

    So are you out there hassling people cooking their morning toast for their criminally high power usage?

    Also, some further context for you: I don’t use Stable Diffusion XL (listed in your article), as old-school 512x512 is more than enough for my needs (as demonstrated in this post^^). Your second article is paywalled (not great to share if people can’t access it), but it appears to cover data center use, which, as described above, is not what I’m doing here.

    edit, spelling