It's probably the best performance per dollar you can get, but a lot of modern games are unplayable on it.
A lot of those games are also hot garbage. Baldur’s Gate 3 may be the only standout title of late where you don’t have to qualify what you like about it.
I think the recent layoffs in the industry also portend things hitting a wall; games aren’t going to push limits as much as they used to. Combine that with Steam Deck-likes becoming popular, and those could easily become the new baseline performance target for games. If so, a 1080ti could be a very good card for a long time to come.
Edit: Here’s another comment I made with links and more information on why this is going to be more common going forward. There’s a very real and technical reason for using these new rendering strategies and it’s why we’ll start seeing more and more games require at least an RTX series card.
You’re misunderstanding the issue. As much as “RTX OFF, RTX ON” is a meme, the RTX series of cards genuinely introduced improvements to rendering techniques that were previously impossible to pull off with acceptable performance, and more and more games are making use of them.
Alan Wake 2 is a great example of this. The game runs like ass on a 1080ti even on low because the 1080ti is physically incapable of performing the kind of rendering instructions the game uses without a massive performance hit. Meanwhile, the RTX 2000 series cards are perfectly capable of doing it. Digital Foundry’s Alan Wake 2 review goes a bit more in depth on it; it’s worth a watch.
If you aren’t going to play anything that came out after 2023, you’re probably going to be fine with a 1080ti, because it was a great card, but we’re definitely hitting the point where technology is moving to different rendering standards that it doesn’t handle as well.
So here are two links about Alan Wake 2.
First, on a 1080ti: https://youtu.be/IShSQQxjoNk?si=E2NRiIxz54VAHStn
And then on a ROG Ally (which I picked because it’s a little more powerful than the current Steam Deck, and runs native Windows): https://youtu.be/hMV4b605c2o?si=1ijy_RDUMKwXKQQH
The ROG Ally seems to be doing a little better, but not by much. They’re both hitting sub-30 fps at 720p.
My point is that if that kind of handheld hardware becomes typical, combined with the economic problems of continuing to make highly detailed games, then Alan Wake 2 is going to be an aberration. The industry could easily pull back on that, and I welcome it. The push for higher and higher detail has not resulted in good games.
I own a 1080ti, and there was recently a massive update to Alan Wake 2 that made it more playable on Pascal GPUs. Digital Foundry did a video on it: http://youtu.be/t-3PkRbeO8A
I don’t know of any current game that can’t run at least 1080p/30 fps on a 1080ti. But of course my knowledge is not exhaustive.
I wouldn’t expect every “next-gen” game to get the same treatment as Alan Wake 2 going forward. But we’re 4 years into the generation and there have probably been fewer than 10 games that were built to take full advantage of modern console hardware. My 1080ti has got a few more good years in it.
Here is an alternative Piped link(s):
http://piped.video/t-3PkRbeO8A
Can you reference those instructions more specifically?
“Instructions” is probably the wrong word here (I was mostly trying to dumb it down for people who aren’t familiar with graphics rendering terminology).
Here’s a link to the Digital Foundry video I was talking about (didn’t realize they made like 5 videos for Alan Wake 2, so it took a bit to find).
The big thing, in Alan Wake 2’s case, is that it uses Mesh Shaders. The video I linked above goes into it at around the 3:38 mark.
AMD has a pretty detailed article on how they work here.
This /r/GameDev post here has some devs explaining why it’s useful in a more accessible manner.
The idea is that mesh shaders let the game offload more work to the GPU in ways that perform much better. It just requires that the hardware actually support them, which is why you basically need an RTX card for Alan Wake 2 (or whichever AMD GPUs support mesh shaders; I’m not as familiar with their cards).
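To make the hardware-support point concrete, here’s a minimal sketch (Vulkan C++, assuming reasonably recent Vulkan headers and loader; Alan Wake 2 itself runs on D3D12, so this is the same idea rather than Remedy’s actual code) that asks each GPU’s driver whether it exposes the cross-vendor mesh-shader extension:

```cpp
// Sketch: report whether each GPU's Vulkan driver exposes mesh shading.
// A 1080ti (Pascal) won't list VK_EXT_mesh_shader; Turing (RTX 2000)
// and newer cards generally do, which is roughly what "needs an
// RTX-class card" boils down to in practice.
// Build with something like: g++ check_mesh.cpp -lvulkan
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

static bool hasExtension(VkPhysicalDevice gpu, const char* name) {
    uint32_t n = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &n, nullptr);
    std::vector<VkExtensionProperties> exts(n);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &n, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, name) == 0) return true;
    return false;
}

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s: mesh shaders %s\n", props.deviceName,
                    hasExtension(gpu, "VK_EXT_mesh_shader") ? "exposed" : "not exposed");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```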
Ah, mesh shaders. Cool stuff. AMD retroactively added them to their old GPUs in drivers. I think the same goes for Intel’s post-Ivy Bridge GPUs (I think the `send` opcode can throw primitives into the 3D pipeline; if you’re interested, you can go read the docs). I guess Nvidia can do something similar. And even if they don’t have such a straightforward way of implementing them, they can probably (my guess, I could be wrong) be emulated with geometry shaders.
What I don’t like is the apparent removal of the vertex fetcher, but maybe there will be an extension that brings it back.
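To show what losing the vertex fetcher means in practice: with mesh shading, the app typically pre-clusters its geometry into small “meshlets”, and the mesh shader pulls vertex data out of plain storage buffers itself instead of relying on fixed-function vertex fetch. Here’s a rough, hypothetical C++ sketch of that CPU-side clustering (real engines usually use something like meshoptimizer; the limits below are just typical numbers, nothing mandated):

```cpp
// Sketch (hypothetical helper, not from any engine): split an indexed
// triangle list into meshlets. Each mesh-shader workgroup later reads
// one of these clusters from storage buffers and emits its triangles
// itself; the small fixed size is also what makes per-cluster culling cheap.
#include <cstdint>
#include <utility>
#include <vector>

struct Meshlet {
    std::vector<uint32_t> vertices;   // indices into the mesh's vertex buffer
    std::vector<uint8_t>  triangles;  // local indices into `vertices`, 3 per triangle
};

// Typical mesh-shader-friendly limits; tune for the target hardware.
constexpr size_t kMaxVertices  = 64;
constexpr size_t kMaxTriangles = 124;

std::vector<Meshlet> buildMeshlets(const std::vector<uint32_t>& indices) {
    std::vector<Meshlet> meshlets;
    Meshlet current;

    // Return the triangle-local index of vertex v, adding it if needed.
    auto localIndex = [](Meshlet& m, uint32_t v) -> uint8_t {
        for (size_t i = 0; i < m.vertices.size(); ++i)
            if (m.vertices[i] == v) return static_cast<uint8_t>(i);
        m.vertices.push_back(v);
        return static_cast<uint8_t>(m.vertices.size() - 1);
    };

    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        // Worst case this triangle adds 3 new vertices; flush the current
        // meshlet first if either limit could be exceeded.
        if (current.vertices.size() + 3 > kMaxVertices ||
            current.triangles.size() / 3 + 1 > kMaxTriangles) {
            meshlets.push_back(std::move(current));
            current = Meshlet{};
        }
        for (int k = 0; k < 3; ++k)
            current.triangles.push_back(localIndex(current, indices[i + k]));
    }
    if (!current.triangles.empty()) meshlets.push_back(std::move(current));
    return meshlets;
}
```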
I could be wrong, but I’m pretty sure Nvidia has patched them into the GTX series; they’re just really slow compared to RTX cards.
https://www.khronos.org/blog/ray-tracing-in-vulkan
https://microsoft.github.io/DirectX-Specs/d3d/Raytracing.html
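If anyone wants to see what their own card reports, here’s a minimal D3D12 sketch against the spec in that second link (Windows only, link against d3d12.lib; my understanding, could be wrong, is that GTX 10-series cards with the later fallback drivers report Tier 1.0 here, while cards without that fallback report not supported):

```cpp
// Sketch: ask the default adapter which DXR tier its driver exposes.
// D3D12_RAYTRACING_TIER_NOT_SUPPORTED means no ray tracing at all;
// Tier 1.0 may still be the slow driver-level fallback on GTX cards.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("no D3D12 device available");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        std::printf("raytracing: %s (tier enum value %d)\n",
                    opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0
                        ? "exposed" : "not supported",
                    static_cast<int>(opts5.RaytracingTier));
    }
    return 0;
}
```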
I suspect that’s the exception, and will remain so for most games.
I bought a GTX 1080 when it was released. I played Starfield in 4K on medium settings just fine at 50-60 FPS. Not sure what you mean by “unplayable”.