Any game at or above 30 fps is good for me. And some people argued with me, saying “NOOOO! You cannot play this game below 60 fps!” You cannot, but I can 😝
Valid opinion except for DMC, Gran Turismo and Half-Life
I don’t know the game DMC, but you’re right about Gran Turismo. Half-Life depends: I can play the first one at 30 fps (in fact it runs at 20 fps, more or less, on my computer).
DMC = Devil May Cry
You gotta appreciate what you have sometimes. I remember my time as a kid playing games at potato quality (420p or less!!!), just for more than 30fps.
I’m old enough to remember the quasi-religious arguments over 24 (film) vs 25 (PAL) vs 29.97 (NTSC) vs 30 (monitors)
People complaining about “there” being used instead of “their”
Here’s some math:
30 FPS means a frame is changed every 33.33ms;
60 FPS means a frame is changed every 16.67ms;
144 FPS means a frame is changed every 6.94ms.
The differences between 29-30 FPS, 59-60 FPS and 143-144 FPS are respectively 1.15ms, 0.28ms and 0.049ms (or 49μs, or a spoonful of context switches on Windows 10 (Windows 10 does many context switches)).
If you know someone who complains that their frame rate is 100/s instead of 144/s, they’re complaining about a 3.06ms delay that speedrunners and probably competitive players would struggle to notice.
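If you want to sanity-check these numbers, here’s a minimal sketch of the arithmetic in plain Python (the frame rates are just the ones from this thread):

```python
def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# Frame times at common targets
for fps in (30, 60, 144):
    print(f"{fps} FPS -> {frame_time_ms(fps):.2f} ms per frame")

# The deltas argued about above
for low, high in [(29, 30), (59, 60), (143, 144), (100, 144)]:
    delta = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} vs {high} FPS -> {delta:.3f} ms difference")
```

Running it confirms the figures above: 1.149ms, 0.282ms, 0.049ms, and 3.056ms for 100 vs 144.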
Speedrunners may even cap the fps intentionally lower or higher for some glitches to work at all.
I remember doing that with Half-Life. I always found it weird that pre-run console command bindings were allowed - then again, speedruns would be way less crazy without them.
It would be hard for speedrunners to act on those 3.06ms, but they would notice it. I can feel when an image is choppier than it could be.
It’s not the end of the world. I’m finding lower frame rates aren’t as frustrating as I remember, now that I’m doing a lot of gaming on the Steam Deck and ROG Ally. 45 is as low as I’ll go on games that are pretty, though. A lot of my early time in a game is spent tweaking and finding maybe two ‘best’ configurations, one for higher frames and one for better visuals, and then deciding from there what I can comfortably tolerate.
It’s why I usually say it only matters after every doubling of Hz, 30 > 60/75 > 120/144 > 240 > 480, with diminishing returns based on the content being displayed (the frame-time sketch below bears this out).
165 is effectively 144, and any framerate near each bracket is paired with whatever’s closest, with exceptions: 360Hz panels with BFI (black frame insertion) behave differently than ones without it.
It’s also why I run DRG at a 40fps cap, upscaled from 90% of my monitor’s resolution, with mostly low settings, even though I can handle 4 times as much load :3
(that and the generated heat being unbearable as it is)
((and the fact that I’m stingy and if I can decimate my power consumption with a relatively low graphical fidelity difference, I will))
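For what it’s worth, the doubling rule of thumb checks out if you look at frame times instead of frame rates: each doubling shaves off half as many milliseconds as the previous one. A quick sketch of just that arithmetic:

```python
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Each doubling buys half as many milliseconds as the previous one
steps = [30, 60, 120, 240, 480]
for low, high in zip(steps, steps[1:]):
    gain = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} Hz shaves {gain:.2f} ms off every frame")
# 30->60: 16.67ms, 60->120: 8.33ms, 120->240: 4.17ms, 240->480: 2.08ms
```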
Their
San Andreas players being used to ~25fps since the old days 😉
I used to play Kenshi at 1-2 fps. Now I’ve got a better CPU so it’s not fun anymore…
The fuck, that’s a PowerPoint, my dude.
I mean, there is Kaze Emanuar, who has rewritten the decompiled source code for Mario 64 and gotten it to run at up to 60fps on a real N64, with his custom levels having graphics bordering on GameCube level. Tons of really cool optimizations went in, and if I understand correctly, everything between (but not including) 30fps and 60fps looks like shit on real hardware.
If one person can do this kind of optimization, AAA devs should be able to as well, but I guess shareholders something something… Really though, there is no excuse for anything that looks worse than games from several console generations ago to not hit at least 60, and yet here we are.
I only recently got a screen that goes above 60. I’m not sure I can tell there’s a difference (but 4k is nice).
I felt the same when I got a high refresh rate monitor, until I went back to a 60fps monitor that was the same resolution and size.
DEFINITELY check your OS screen Hz settings!!! I got my 240 monitor last year and spent 2-3 days very confused, thinking it was all placebo; then I randomly found the settings and it turned out Windows had defaulted it to 60Hz. Crank that shit up and the difference is crystal clear, trust me!
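For anyone on Linux/X11 who wants to sanity-check the same thing from a terminal, here’s a small sketch that shells out to xrandr (output names and modes vary per machine; on Windows it’s the Advanced display settings page instead):

```python
# Print the active mode (resolution + refresh rate) for each connected output.
# Assumes Linux/X11 with xrandr installed; on Windows, check
# Settings > System > Display > Advanced display settings instead.
import subprocess

query = subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout
for line in query.splitlines():
    if "*" in line:  # xrandr marks the mode currently in use with an asterisk
        print("active mode:", line.strip())
```

If it reports 60.00* on a 240Hz panel, you’ve hit the same trap.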
I played on a Radeon 7700 and a shit i3 (low end of mid tier when I built the PC in 2013) until 2020. I ended up replacing the entire system, although I originally just wanted to upgrade the CPU, which would need a new motherboard, which meant new RAM. I already had a new PSU, so I really just needed a new GPU and case to have a whole new PC. I had to wait a year in the Step-Up queue for EVGA to get a 3080 (I bought a 2060 after trying unsuccessfully to buy a 3080 directly), but it was definitely worth the wait.
I averaged 20 fps for years on low settings. I had no idea how bad it was until I upgraded. Now I get 90-120fps on high/ultra settings for most games (except Star Citizen, because Star Citizen)
Once you go black
144fps, you can never go back! The same applies to high-end headphones and audio systems as well.
Old 120 Hz veteran here, from when it was all new and shiny: it was fairly easy to go back to 60 Hz. The loss in quality was absolutely not worth it back in the day. Displays today have gotten better, but so have my standards. I think it will take OLED or microLED to get me onto higher refresh rates again.
I don’t notice a difference above 30 FPS
It strongly depends on the game
I’m talking about everything. Not just games but also video playback, scrolling, etc
Guy has his monitor set to 30Hz
I’ve used multiple monitors and never noticed a difference
Probably just using some old HDMI cable that can’t go over 30Hz
You definitely notice a difference with your mouse, because the cursor updates more often at higher Hz, but generally more Hz is not relevant at all unless you play fast-paced games like CSGO or Apex Legends
I don’t usually game so that might be it
Also, I use a laptop (I told you I don’t game a lot), so no HDMI cable for me
I definitely noticed a difference in multiplayer FPS games when I was usually playing at around 25-30 fps and suddenly jumped to 100+.
It was night and day.