Originally Posted by jkkyler
Curious JR, do you think that has to do with the style of games you play - competitive twitch shooters?
Personally I notice a big difference from 30fps to 60fps, but above 60fps I don't seem to notice. Typically I can run just about anything maxed out at 1440p and get 60fps (sometimes I need to turn FXAA/MSAA off or down, but at that resolution I don't really notice jaggies anyhow).
I tend to game nearly exclusively at 1440p these days, with some games such as Sekiro: Shadows Die Twice at 1080p because of a glitch that locks the FPS at 30 if you go above 1080p.
Recently I have been replaying Borderlands 2 because it has been several years and they are getting ready to release Borderlands 3. Amazingly enough, with a 1070 I can run it at 4K and get 90-100 fps pretty consistently. That style of graphics isn't photorealistic, so it's easy to crank the fps up.
I edit photos at 4K, which is closer to the resolution I shoot (6000x4000), and I tend to edit video using proxy files (lower-res stand-ins for the full clips), then render the final product back out at 4K.
What's your monitor's refresh rate? I think 120FPS on a 60Hz monitor is likely less noticeable than 120FPS on a 120Hz monitor, because on the latter you're actually seeing all 120 frames.
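The point about the panel being the bottleneck can be sketched as a toy model. This is a simplification I'm adding for illustration (it ignores tearing, where a display can show parts of more than one frame per refresh, and assumes perfectly even frame pacing):

```python
def frames_displayed_per_second(render_fps, refresh_hz):
    # Simplified model: a panel can show at most one distinct frame
    # per refresh, so it caps what you actually see at its refresh
    # rate, no matter how many frames the GPU renders.
    return min(render_fps, refresh_hz)

# 120FPS on a 60Hz panel: you only ever see 60 of those frames
print(frames_displayed_per_second(120, 60))   # 60
# The same 120FPS on a 144Hz panel: all 120 frames reach your eyes
print(frames_displayed_per_second(120, 144))  # 120
```

Under this model, pushing the GPU past the monitor's refresh rate buys you nothing visible (aside from possibly lower input latency), which is why the difference is easier to feel on a high-refresh panel.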
What I can say is this: I'd had my PC set to 60Hz for years, and then I got this 144Hz monitor and naturally set the refresh rate to 144Hz. But a couple of times when I was making changes, it reverted back to 60Hz and the monitor felt broken. As far as games go, I definitely don't like 60FPS anymore; I'd rather take 90+ and lower the graphical detail. And yeah, I mostly play FPS-type games.
It all depends on what your GPU can push. I have a 7600K (I think) and a 1070 Ti, so I don't think I'd get very far at 1440p or 4K, but I can run most games at 2560x1080 and get close to 100FPS. In Destiny 2 I get over 120FPS with everything jacked up, and it's so crispy.