“Jensen sir, 50 series is too hot”
“Easy fix with my massive unparalleled intellect. Just turn off the sensor”
If you needed any more proof that Nvidia is continuing to enshittify their monopoly and milk consumers, here it is. Hey, let's remove one of the critical things that lets you diagnose a bad card and catch bad situations that might result in GPU death! Don't need that shit, just buy new ones every 2 years, you poors!
If you buy an Nvidia GPU, you are part of the problem here.
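For anyone wondering why the sensor matters: the hotspot reading is what lets you spot a failing cooler mount or dried-out paste before the card cooks itself. Here's a minimal sketch of a temperature watcher using the pynvml bindings for NVML. The 90 °C threshold and 5 second interval are just illustrative choices of mine, and note that standard NVML only exposes the edge/core sensor; the hotspot value was surfaced through other interfaces, which is exactly why turning it off is hard to work around.

```python
# Minimal GPU temperature watcher via NVML (pynvml bindings).
# Standard NVML reports the edge/core sensor (NVML_TEMPERATURE_GPU);
# the hotspot reading discussed above is not part of this API.
import time
import pynvml

ALERT_C = 90  # illustrative threshold, not an official Nvidia limit

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        flag = "ALERT" if temp >= ALERT_C else "ok"
        print(f"GPU 0: {temp} C [{flag}]")
        time.sleep(5)  # poll every 5 seconds
finally:
    pynvml.nvmlShutdown()
```

In practice you'd watch the gap between this edge reading and the hotspot reading in tools like HWiNFO; it's that gap the 50 series reportedly no longer lets you see.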
To be clear: the only required ray tracing Indiana Jones utilizes is ray-traced global illumination, which does require hardware RT support to work. But as long as you have a 20 series card or later, you should be able to get playable performance if you manage your settings correctly. It only becomes super heavy when you enable RT reflections, RT sun shadows, or full path tracing. The latter is VERY expensive, and it's what I'd assume most people picture when they think of ray tracing.

It does look really, really good though, and personally I'd rather play that game at 60 fps (or lower, let's be real) with full path tracing than with just the RTGI at a much higher frame rate. I'd at least recommend turning on the RT sun shadows if you can, because shadows without them are very shimmery and aliased, especially foliage.

In games like Indiana Jones that have been designed from the ground up with ray tracing in mind, it makes a gigantic difference in how grounded the world feels. The level of detail they baked into every asset is insane, and path tracing elevates the whole experience a huge amount compared to the default RTGI, because every nook and cranny on every object casts accurate shadows and bounce lighting on itself and the environment.
I assume Doom is going to be the same way.
Just chiming in that I played Indiana Jones with zero problems and great performance on my 6800 XT. And that was without any FSR, which I'm not sure is even available yet.