Everyone can play what they want, but 30 fps is unbearable in most (not all) games.
Maybe I’m just not very observant but I can barely tell the difference between 30 fps and 60 fps. I only start to notice below 25.
Everyone’s perception is different. I can do 60 fps. I prefer 90 fps minimum and 120 fps target. I see no benefit at 144 or higher. Anything below 60 fps and I just get frustrated. That’s my perception.
30 fps though is something we should move away from, given how far we’ve come with all kinds of hardware and software features.
Wild. 60 looks terrible to me. I can’t really tell the difference above 120fps though.
In my day 30fps in Unreal Tournament was considered reasonable.
Kids these days…
Well, Unreal Tournament is older than me, and I’m a legal adult. Suffice to say the technology wasn’t really there yet for games.
You would be wrong.
Sure. And I used to be okay downloading my porn at 56kbps. Now I want my smut so hi-def that I can see the actors’ emotional scars. Peoples’ standards change as technology advances. If you want to be stuck in 2001, go right ahead, but that doesn’t mean everyone else has to be.
I played RuneScape 3 for years at 18 fps on max settings on my shitter computer and I honestly couldn’t tell at all and had fun the whole time.
I remember playing OSRS and Team Fortress 2 on my shitter PC with like 10-20fps.
It was fine back then, considering my brain hadn’t yet normalized 60+, but nowadays I struggle with anything under 50fps. I guess I played too many fast-paced games since then because Switch games that fluctuate between 25-30fps really turn me off from playing.
I’ve played plenty of minecraft at 15-20 fps and had an awesome time.
Unbearable is wholly subjective.
Can agree. I can play 30fps without complaints because most of my life I was playing on low-end PCs
Let’s put it this way:
Everyone has different standards in terms of motion blur they can bear, and you need a certain framerate to achieve that standard at any given speed of motion on screen.
It’s not just about how smooth the game looks, but also how smooth it feels to control. 30 fps is way too sluggish for me. Granted, most people probably hit diminishing returns somewhere past 60 fps, unless you’re someone with the reflexes and hardware (high polling rate mouse, good frame timing on your monitor, low system lag, etc.) to back it up. I’m quite comfy between 120 and 144 fps, but there are some absolute monsters out there who would probably find that too slow.
If it’s not a very fast-moving game, like a turn-based RPG, then it doesn’t matter that much, but at least 60 fps is still a must for it not to look like a slideshow.
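The framerate-vs-motion-speed point above can be put in rough numbers. A minimal sketch (the pan speed and screen width are made-up example values, not measurements): the faster something moves across the screen, the bigger the jump between consecutive frames, and big jumps are what read as judder or smearing.

```python
# Back-of-envelope sketch (example numbers only): an object moving across
# the screen advances speed / fps pixels between consecutive frames.
# Bigger per-frame jumps look less smooth at the same motion speed.
def pixels_per_frame(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

# A pan that crosses a 1920-px-wide screen in one second:
for fps in (30, 60, 120):
    jump = pixels_per_frame(1920, fps)
    print(f"{fps:>3} fps -> {jump:.0f} px jump per frame")
# 30 fps -> 64 px, 60 fps -> 32 px, 120 fps -> 16 px
```

Same motion, but at 30 fps each frame-to-frame jump is four times as large as at 120 fps, which is why slow-panning city builders tolerate low framerates better than fast camera flicks do.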
Latency plays a big part too, that’s true. I mentioned that in another comment.
Though how bad a higher latency feels is also tied to how fast you move your mouse. Slowly panning across the map of your city builder makes latency less of an issue than wanting to hit flickshots in Counter-Strike.
Latency and framerate go hand in hand, though depending on the game, one might be more important to you than the other.
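One way to see why latency and framerate go hand in hand: a steady render loop adds at least one frame time of delay, so fps puts a floor under input latency. A quick sketch (pure arithmetic, no engine specifics; real pipelines add more delay on top):

```python
# Frame time in milliseconds: the minimum latency the render loop itself
# contributes is roughly one frame at a steady fps (real pipelines add
# input sampling, display scanout, etc. on top of this).
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 144 -> 6.9 ms
```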
Which is where frame interpolation gets funny.
We really should move away from 30 fps as a baseline for PC gaming.
Am I old? I only notice a difference if it drops below 30.
I do see a difference at 60, but playing at 30 isn’t an issue.
To me, 30fps is unbearable in fast paced games, but okay in slow paced games. This is a slow paced game, so I’m fine as long as the fps stays above 24 with a 1% low of at least 20.
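For anyone unfamiliar with the “1% low” figure: it’s commonly taken as the average fps of the slowest 1% of frames, which captures stutter that a plain average hides. A rough sketch of how a benchmark might compute it (hypothetical helper; real tools differ in method, and some work from frame times instead):

```python
# Hypothetical helper, not from any specific benchmarking tool:
# average fps of the slowest 1% of sampled frames.
def one_percent_low(fps_samples: list[float]) -> float:
    slowest = sorted(fps_samples)
    n = max(1, len(slowest) // 100)  # always use at least one sample
    return sum(slowest[:n]) / n

# 99 steady frames at 30 fps plus one dip to 20 fps:
samples = [30.0] * 99 + [20.0]
print(one_percent_low(samples))  # 20.0
```

The dip barely moves the average fps, but it dominates the 1% low, which is exactly what you feel as a stutter.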