Is there any GPU that stands out to you guys as a good value, or do you believe that everybody should skip them?
I’m liking the 5070 Ti with 16GB on a 256-bit bus and 896 GB/s of memory bandwidth for $750 USD. The 5080 for $1,000 USD has 16GB on a 256-bit bus at 960 GB/s. I don’t see the value in the extra $250.
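A quick back-of-the-envelope check on those quoted specs (the numbers below are just the prices and bandwidth figures stated above, nothing official) shows how the bandwidth-per-dollar comparison shakes out:

```python
# Bandwidth-per-dollar comparison using the specs quoted above.
cards = {
    "5070 Ti": {"price_usd": 750, "bandwidth_gb_s": 896},
    "5080": {"price_usd": 1000, "bandwidth_gb_s": 960},
}

for name, spec in cards.items():
    ratio = spec["bandwidth_gb_s"] / spec["price_usd"]
    print(f"{name}: {ratio:.2f} GB/s per dollar")
```

The 5070 Ti comes out around 1.19 GB/s per dollar versus roughly 0.96 for the 5080, which is the gap behind the "I don't see the value" point.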
They both have 2x 9th-gen NVENC encoders. The 5080 has 2x 6th-gen decoders, while the 5070 Ti has 1x 6th-gen decoder. I can use that for OBS recording while watching other videos.
I’m planning on getting the 5090 FE provided scalpers don’t ruin it (which is the most likely scenario, by the way). I have a 3080 and have recently upgraded to a 4K OLED; going from 1440p to 4K has dealt a serious blow to my FPS, and I’d like to play some of the newer games at high settings and high FPS, so I’m due for an upgrade.
I have no problem with DLSS in the games I’ve played so far. Whatever artifacts it may have, they “disappear” during gameplay. You are not counting pixels while you are playing. I know this is not the solution we want, but we need to be realistic here: there is a reason why CGI can’t be rendered in real time. We are certainly still far away from that type of quality, yet the games we play are rendered in real time. We cannot afford the size and price of CGI workstations, so we have to rely on these “gimmicks” to make up for it.
I understand your reasoning about DLSS. I don’t agree, but all is well. For 4K, you will need a 6090, and then a 7090, etc.
Why the FE over an AIB card?
I don’t think a 6090 or any future card will be enough for 4K without some sort of DLSS or frame-generation shenanigans, because by the time those cards release, graphics will have “evolved” to a point where they again are no longer enough. The eternal obsolescence cycle…
I prefer the FE because it’s the only two-slot card that will be available at launch, and I don’t need the extra fans and size. FE cards are normally on par with, if not better than, their AIB counterparts, as long as you stick to air-cooled models.