Quantitative FPS / Hz flicker improvement

laguerre

[H]ard|Gawd
Joined
Nov 18, 2008
Messages
1,133
Just got done reading this older article about Temporal Rate Conversion.
The human eye's peripheral vision is particularly sensitive to flicker. ...The human eye's sensitivity to flicker is determined by approximately a power-of-4 law. It has been determined in various scientific viewer tests that the amount of flicker you see is proportional to frequency to the power 4. A 60Hz field rate is not just a bit better than 50Hz, it is twice as good, since (60/50)^4 ≈ 2. A temporal rate of 72Hz is twice as good as a 60Hz rate, since (72/60)^4 ≈ 2.

Although this is talking about flicker from CRTs, and the article gives no links for credibility, I still think it may have an indirect relationship to new display technologies with regard to refresh rate and frames per second. Just to clarify, LCDs don't have flicker, so that's a non-issue; I am referring to the statement about how your ability to see flicker scales. I doubt the tests mentioned went so far as to check whether the approximate power-of-4 law still holds at 120Hz, but if we use it as a generalization, 120Hz is 16x better than 60Hz (in terms of your eyes' ability to pick up differences), as the sketch below shows.
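
To make the scaling concrete, here's a minimal sketch in Python, taking the quoted power-of-4 law at face value; whether the exponent really holds at 120Hz is an assumption, as noted above:

Code:
# Relative flicker reduction per the quoted power-of-4 law:
# improvement = (new_rate / old_rate) ** 4
def flicker_improvement(old_hz, new_hz):
    """Factor by which visible flicker is reduced going from old_hz to new_hz."""
    return (new_hz / old_hz) ** 4

print(flicker_improvement(50, 60))   # ~2.07, the article's "twice as good"
print(flicker_improvement(60, 72))   # ~2.07, same 1.2x rate bump
print(flicker_improvement(60, 120))  # 16.0, the extrapolation above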

Note that the scaling is for a reduction in visible flicker; it doesn't mean your response time improves by 16x. I am suggesting you could use this as a way to quantify your ability to appreciate higher fps (justifying a video card upgrade), since flicker can be loosely associated with tearing and motion smearing. Even a small step up to 72Hz gives a 2x improvement over 60Hz, so if you're already pushing 72+fps, upgrading to a 75Hz or 120Hz monitor may make quite an improvement. Going from 30fps to 60fps is likewise a 16x reduction.

This may also matter for anyone using multiple monitors or a very large monitor. Since your peripheral vision is better at picking up movement, reducing inconsistencies in your peripheral vision may let you rely on it more and respond faster to stimuli at the edges of your view. Even an improvement from 30 to 36fps nets you a 2x reduction in your ability to detect flicker, which is perhaps just enough to make your peripheral vision reliable.
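
Plugging the upgrade scenarios from the last two paragraphs into the same formula (again assuming the law extrapolates that far):

Code:
ratio = lambda old_hz, new_hz: (new_hz / old_hz) ** 4
print(ratio(60, 75))   # ~2.44x going from a 60Hz to a 75Hz monitor
print(ratio(72, 120))  # ~7.72x going from 72Hz to 120Hz
print(ratio(30, 60))   # 16x from doubling 30fps to 60fps
print(ratio(30, 36))   # ~2.07x, the peripheral-vision example above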

This isn't about reducing total response time; rather, you can view it as improving continuity and the overall gameplay experience. My question is: is this too much of a reach, or is it good enough for the purpose of justifying an upgrade?
 
Just to clarify, LCDs don't have flicker, so that's a non-issue.

CCFL backlights do flicker, though. If a CCFL LCD monitor is set to 100% brightness, it probably flickers in the 10,000 to 20,000Hz range, which is not an issue. But when using PWM (pulse-width modulation) dimming to reduce brightness, some monitors can drop below 200Hz, at which point those sensitive to flicker may begin to feel the effects. One of the advantages of LED backlights is that this issue is removed entirely.
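
Out of curiosity, if you applied the OP's power-of-4 law to PWM rates (a big if, since it was derived for CRT field rates at much lower frequencies), the gap looks like this:

Code:
# Visible-flicker ratio of a 10,000Hz backlight vs. a 200Hz PWM-dimmed one,
# assuming the power-of-4 law still held at these frequencies.
print((10_000 / 200) ** 4)  # 6,250,000 -- effectively invisible by comparison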
 
I'm a bit skeptical. The contrast in flicker is very high compared to the differences between frames (usually), and I struggle to believe that 60fps is twice as good as 50fps (60Hz flicker vs. 50Hz flicker seems reasonable, though). But I have nothing concrete to bring to the table; interesting find regardless.
 