Well…I mean…that’s kind of bound to happen when you draw 600W into a device that size I suppose. I feel like they’ve had this issue with every *090 card, whether it be cables or otherwise.
God damn… 600W?!
Is this thing supposed to double as a space heater?
What do people do in the summer? That’s got to cost real money every month to run it and cool it. 20 bucks?
Running 600W for 12 hours a day at $0.10 per kWh costs $0.72 a day, or $21.60 a month. Heat pumps can move about 3 times as much heat as the electricity they consume, so cooling adds roughly another $7.20 (quick sketch of the math below).
All electronics double as space heaters, there’s only a minuscule amount of electricity that’s not converted to heat.
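If anyone wants to poke at the numbers, here’s a rough sketch of that math (Python; the 12 h/day duty cycle, $0.10/kWh price, 30-day month, and COP ≈ 3 for the heat pump are just the assumptions from above, so adjust to taste):

```python
# Rough monthly-cost sketch for a 600W GPU, using the assumptions above:
# 12 hours/day under load, $0.10 per kWh, and a heat pump with COP ~3
# (it moves ~3x as much heat as the electricity it consumes).

GPU_WATTS = 600
HOURS_PER_DAY = 12
DAYS_PER_MONTH = 30
PRICE_PER_KWH = 0.10   # cheap by global standards; total scales linearly with price
COOLING_COP = 3

kwh_per_month = GPU_WATTS / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH   # 216 kWh
run_cost = kwh_per_month * PRICE_PER_KWH                            # $21.60
cooling_cost = run_cost / COOLING_COP                               # $7.20

print(f"Run:   ${run_cost:.2f}/month")
print(f"Cool:  ${cooling_cost:.2f}/month")
print(f"Total: ${run_cost + cooling_cost:.2f}/month")               # $28.80
```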
10¢ a kWh is fucking cheap in a global context. 3x that is not uncommon.
https://www.statista.com/statistics/263492/electricity-prices-in-selected-countries/
Thank you for the good math!
Yup. Pretty dumb.
My 10+ year old GTX 780 would pull 300W at full tilt, and it has only a ridiculously small fraction of the compute power. The Radeon HD 6990 would pull 400W+… high-end GPUs have been fairly power hungry for literally more than a decade.
GTX 780 released in 2013?
RTX 3090 was 350W?
RTX 4090 was 450W?
So if by decades you mean this generation… then sure.
Haha yeah, I mistyped the years, it was supposed to be 10+ and not 20+… nevertheless, these cards have been pulling at least 300-400W for the past 15 years.
As it so happens, around a decade ago there was a period when they tried to make graphics cards more energy efficient rather than just more powerful; the GTX 1050 Ti, for example, which came out in 2016, had a TDP of 75W.
Of course, people had to actually “sacrifice” themselves by not having 200 fps @ 4K in order to use a lower TDP card.
(Curiously, your 300W GTX 780 had all of 21% more performance than my 75W GTX 1050 Ti.)
Recently I upgraded my graphics card and again chose based on, among other things, TDP. My new one (whose model I don’t remember right now) has a TDP of 120W; I looked really hard, and you can’t find anything decent with a 75W TDP anymore. Of course, it will never give me the top-of-the-range gaming performance I could get from something 4x the price and power consumption, but since it’s mostly running Terraria at the moment, “top of the range” graphics performance would be an incredible waste for it anyway.
When I was looking around for that upgrade there were lots of higher performance cards around the 250W TDP mark.
All this to say that people choosing 300W+ cards can only blame themselves for having deprioritized power consumption so heavily in their choice: often to the point of running a space heater with jet-engine-level fan noise, all for an extra performance bump they couldn’t actually pick out in a blind test. (Rough perf-per-watt math below.)
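For a rough sense of that efficiency gap, here’s a quick perf-per-watt sketch using only the figures quoted above (the 21% performance delta and the 300W / 75W power numbers; relative performance is normalized to the 1050 Ti, so the exact units don’t matter):

```python
# Perf-per-watt sketch using the figures quoted above: the GTX 780 at ~21%
# more performance and 300W, vs the GTX 1050 Ti at 75W (baseline = 1.0).
cards = {
    "GTX 780 (300W)":    {"rel_perf": 1.21, "watts": 300},
    "GTX 1050 Ti (75W)": {"rel_perf": 1.00, "watts": 75},
}

for name, c in cards.items():
    print(f"{name}: {c['rel_perf'] / c['watts'] * 1000:.1f} perf units per kW")

# GTX 780:     ~4.0 perf units per kW
# GTX 1050 Ti: ~13.3 perf units per kW
# i.e. roughly 3.3x the efficiency: 21% more performance for 4x the power.
```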
Yeah, pulling nearly 600W through a connector designed for a 600W maximum just seems like a terrible idea. Where’s the margin for error?
Yeah, my 3090 K|ngP|n pulls over 500W easily, but that’s over three dedicated 8-pin PCIe cables. Power delivery was something I took seriously when getting that card installed, as well as cooling; I made sure my 1300W PSU had plenty of dedicated PCIe ports.
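For what it’s worth, a back-of-the-envelope sketch of that margin question. The spec ratings below are the usual 150W per 8-pin PCIe connector and 600W for 12VHPWR; the per-pin current figures are my assumptions (roughly what the common connector terminals are rated for), so treat the exact factors as ballpark:

```python
# Back-of-the-envelope safety-margin comparison for the two cabling setups in
# this thread. Spec ratings: 150W per 8-pin PCIe connector, 600W for 12VHPWR.
# Per-pin current capability is an assumption (~8A for the 8-pin's terminals,
# ~9.5A for 12VHPWR's smaller pins); the connectors carry 12V on 3 and 6
# pin pairs respectively.

def safety_factor(physical_w: float, spec_w: float) -> float:
    """How much the connector can physically carry vs. what the spec asks of it."""
    return physical_w / spec_w

eight_pin_pcie = safety_factor(3 * 8.0 * 12, 150)    # ~288W physical vs 150W spec
twelve_vhpwr   = safety_factor(6 * 9.5 * 12, 600)    # ~684W physical vs 600W spec

print(f"8-pin PCIe safety factor: ~{eight_pin_pcie:.1f}x")  # ~1.9x
print(f"12VHPWR safety factor:    ~{twelve_vhpwr:.1f}x")    # ~1.1x

# So ~500W spread over three 8-pin cables leaves a lot of physical headroom,
# while ~600W through a single 12VHPWR connector runs it right at its limit.
```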