Just got done reading this older article about Temporal Rate Conversion.
Although the article is about flicker on CRTs, and it cites no sources, I think it may still have an indirect bearing on newer display technologies with regard to refresh rate and frames per second. To be clear, LCDs don't flicker, so flicker itself is a non-issue; I'm referring to the statement about how your ability to see flicker scales. I doubt the tests mentioned went so far as to check whether the approximate power-of-4 law still holds at 120Hz, but if we use it as a generalization, 120Hz is 16x better than 60Hz (in terms of your eyes' ability to pick up differences).
Note that the scaling describes a reduction in visible flicker; it doesn't mean your response time improves by a factor of 16. What I'm suggesting is that you could use it to quantify your ability to appreciate higher fps (say, to justify a video card upgrade), since flicker can be loosely associated with tearing and motion smearing. Even a small step up to 72Hz gives a 2x improvement, so if you're already pushing 72+fps, upgrading to a 75Hz or 120Hz monitor may make quite a difference. Going from 30fps to 60fps is likewise a 16x reduction.
This may also matter for anyone using multiple monitors or one very large monitor. Peripheral vision is better at picking up movement, so if you reduce the inconsistencies it sees, you may be able to rely on it more and respond faster to peripheral stimuli. Even an improvement from 30 to 36fps nets you a 2x reduction in detectable flicker, which might be just enough to make your peripheral vision reliable.
This isn't about reducing total response time; rather, you can view it as improving continuity and the overall gameplay experience. My question: is this too much of a reach, or is it good enough for the purpose of justifying an upgrade?
The human eye's peripheral vision is particularly sensitive to flicker. ...The human eye's sensitivity to flicker is determined by approximately a power of 4 law. It has been determined in various scientific viewer tests that the amount of flicker you see is proportional to frequency to the power 4. A 60Hz field rate is not just a bit better than 50Hz, it is twice as good, since (60/50)^4 = 2 (approximately). A temporal rate of 72Hz is twice as good as a 60Hz rate, since (72/60)^4 = 2 (approximately).
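The quoted ratios are easy to sanity-check. Here's a minimal sketch (the function name is mine, not from the article) that evaluates the power-of-4 law for the rate pairs discussed above:

```python
def flicker_improvement(old_hz: float, new_hz: float) -> float:
    """Relative reduction in visible flicker, per the approximate
    power-of-4 law: improvement ~ (new_rate / old_rate) ** 4."""
    return (new_hz / old_hz) ** 4

# Rate pairs mentioned in the article and the discussion above.
for old, new in [(50, 60), (60, 72), (60, 120), (30, 36), (30, 60)]:
    print(f"{old}Hz -> {new}Hz: x{flicker_improvement(old, new):.2f}")
```

Running this prints roughly x2.07 for 50-to-60, 60-to-72, and 30-to-36, and exactly x16 for 60-to-120 and 30-to-60, matching the "twice as good" and "x16" figures above.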