Human eye and FPS (frames per second)

Grimham

I just saw a post (for about the 10,000th time) where someone was using the old "the human eye can't tell the difference above 30fps anyway... blah, blah, blah" line. I don't know why, but it bugs the hell out of me when I hear that, kind of like when someone spells "no one" as "noone" (there's no such word, people!). Anyway, I figured I'd share this link that I found a long time ago. It will hopefully educate some of the people who keep spouting that nonsense. I really wish this would go into the videocard/gaming FAQ.


http://amo.net/NT/02-21-01FPS.html
 
I don't know anything about how many frames the human eye can detect. All I know is I can tell the difference when a game slows from 100 frames to 60. At 100 frames it just seems to move "faster", especially if I have to turn around immediately.
 
Here's something I posted a good while ago. This topic borders on my area of University Research...

Do yourself a favor, go read some books, maybe learn something, then come back.

If you read books, you'd know that human vision is a continuous phenomenon. We don't see in frames the way a computer displays them, nor do we have a discrete "refresh rate". While the firing frequency of optical ganglia is limited, its limit is quite high, so incoming visual information is closer to an analog signal than a digital one. We are constantly seeing things. Our optical ganglia are also tied to motor cells (which snap our gaze away from very bright objects instantly, like when you accidentally stare down a laser pointer) which are able to perceive and coordinate action within about 0.013ms (far lower than the 2.5ms response that "24fps" would account for).

24fps is NOT adequate under most situations. The figure is quoted for video displayed in movie theaters, which have a very distinct set of conditions. What are these conditions? Well, have you ever noticed that gaming in the dark makes the game seem smoother? I urge you to download the Crysis demo, wait till midnight, turn the settings up high, and go for it. Much smoother than with the lights on. Why is this, you ask? It's simple: afterimages. When our eyes perceive light, a photopigment in each receptor cell is isomerized, which sends an electrochemical signal to the nearest ganglion cell. Six of these pulses must travel to the local ganglion before a signal is actually sent to the brain. Bright light isomerizes (bleaches) many pigments at once, which puts great strain on the nerve fiber; when it reverts back to normal, the nerve fires in a pattern that codes for the negative afterimage of what you just saw. Look at a waterfall for an hour, then look at the ground: it will appear to move upwards, the opposite of the water moving downwards. Additionally, we have our own version of interframe blending, which keeps fast-moving images in our field of view for a few milliseconds, even if they're no longer actually there. The image is blurred (due to a lack of visual information), but it lets us perceive enough to notice the object. In summary: this afterimage is displayed for a few milliseconds between each frame. Coupled with interframe blending, you see a seamless image, with little to no stutter.

The NTSC television signal is sent at 30fps because of the conditions one usually watches TV in: brighter conditions where the isomerization isn't so pronounced. In addition, thanks to interlacing, where the screen is updated in two fields per frame, we can see much more information with less television bandwidth. IMAX films are presented at 48fps for greater depth and photorealism. Independent studies on computer screens (with no interframe blending) have shown frame rates in the 50s as the baseline cutoff for fluidity...
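Just to put raw numbers on those rates, here's a quick illustrative Python sketch (nothing more than the 1/fps conversion):

    # Convert a frame rate into the time each frame occupies, in milliseconds.
    def frame_time_ms(fps):
        return 1000.0 / fps

    for fps in (24, 30, 48, 60):
        print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
    # 24 fps -> 41.7 ms per frame
    # 30 fps -> 33.3 ms per frame
    # 48 fps -> 20.8 ms per frame
    # 60 fps -> 16.7 ms per frame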

So, the antiquated notion of "24fps" was largely propagated by film buffs and early gamers. There is a LOT more information on this, but I honestly don't have time to get into it.
 
Those that can't tell the difference may well be telling the truth.
It's a shame for them, though, as the rest of us can perceive a whole 30fps+ more :D
 
I've never bought into the claim that human eyes aren't sensitive to high-end FPS changes in video games. If there's anything science has discovered, it's that living things are the most advanced known things in existence, and we would be wrong to doubt their capabilities.

Our bodies are so much more than we'll ever know; if there is anything worth having faith in, it's ourselves.
 
Meh, as long as I get a mediocre to decent framerate where I can just play a game smoothly...I'm all good. No one's ever gonna win this debate, as with any message board subject like this, people are gonna cling to each side no matter what.
 
Single Player = 35+ and it's all good baby.

Multiplayer ( Competitive ) = 80+
 
If you were at 30fps and just WATCHING a game it would be fine, but you can 'feel' the fps change even from 100 to 50.
 
When a framerate changes from around 40fps to 60fps, it is noticeable; it generally does look smoother. When a higher framerate changes, for example 80fps jumping to 100fps, it isn't as noticeable, as our eyes aren't that good tbh. I'd say roughly 60fps is smooth enough for me, though 30fps is fine too.
 
I've generally thought it was pointless to go over the refresh rate of your monitor, as the extra frames aren't getting displayed anyway... On the LCDs most of us use, over 60-75fps doesn't add much, since the extra frames don't get fully displayed. Turning off double or triple buffering and enabling v-sync is probably the smoothest experience you can get: no lag, and every frame should be fully displayed at the refresh rate of the monitor. If you have a 120Hz refresh, then for sure 120fps will look the best. However, I'll agree that much over 60fps gives diminishing returns; more fps is slightly smoother, but just barely perceivable in the best conditions...
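One toy way to picture the "extra frames don't get shown" point (an illustrative Python sketch, not how any driver actually schedules frames): a panel can only present one complete frame per refresh cycle, so anything rendered beyond the refresh rate never gets displayed in full.

    # Toy model: full frames actually presented per second, assuming the
    # display can show at most one complete frame per refresh cycle.
    def displayed_fps(render_fps, refresh_hz=60):
        return min(render_fps, refresh_hz)

    for render_fps in (45, 60, 100, 200):
        print(f"{render_fps} fps rendered on a 60 Hz panel -> "
              f"{displayed_fps(render_fps)} full frames shown per second")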
 
Ahaha... I usually jump into this one every few months or so.

With the new 120Hz LCDs showing up this year, we'll finally be rid of the 60Hz madness that LCD users have been forced to 'live with' since the inception of LCD. I had an LCD for a year, and my gaming suffered, so I went back to a CRT.

120Hz will help immensely, but it will never totally eliminate the input lag:

http://www.behardware.com/articles/632-1/lcds-images-delayed-compared-to-crts-yes.html

60Hz LCD users on average play about 2 frames behind what a CRT user does. So while the actual gameplay may look 'smooth' (at whatever FPS you consider smooth), your accuracy is a totally different thing. There is also considerable input lag from older wireless mice, v-sync, video buffering, and other areas, which probably puts the average user 5 frames behind what they 'should' be seeing.
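For scale, those "frames behind" figures convert to milliseconds like this (a quick Python sketch; the 2-frame and 5-frame numbers are the ones quoted above, not measurements):

    # Convert "N frames behind" into milliseconds of lag at a given refresh rate.
    def lag_ms(frames_behind, refresh_hz=60):
        return frames_behind * 1000.0 / refresh_hz

    print(round(lag_ms(2), 1))  # ~33.3 ms for the 2-frame display lag quoted above
    print(round(lag_ms(5), 1))  # ~83.3 ms once mouse/v-sync/buffering are piled on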

I might try a 120Hz LCD to see if it's 'passable', but what I'm really holding out for are laser displays (almost exactly the same idea as a CRT, but with a laser instead of an electron gun).
 
Weenis said:
Do yourself a favor, go read some books, maybe learn something, then come back. ...

Ouch... it's best not to judge people you know little about, isn't it? :p

As it goes for me and a few others here, I do notice when the FPS goes from 60 to 50, but I guess >80 is ideal for me.
 
People have different preferences... For competitive multiplayer, around 45fps is plenty for some of us; IMO 80+ is way overkill.

Yeah, all I need is 40-60 for multiplayer, but I'm fine with 30-45fps for single player. I tend to get more than that now, but back before I had this 8800GT, that's what I got used to. Plus, as long as it stayed at the same fps, like a constant 30 or 40, it felt smooth.
 
Going back to Arcygenical's post: when watching DVDs on my computer, I can notice the slowing/stuttering when the camera is moving quickly, whereas in a theatre I can't. It's gotten to the point where if I watch movies, I have to do it at night :eek:.

As for gaming, I'm happy with anywhere above 30. 40+ is preferable for MP, but I have an old 9600XT, so I've gotten used to low framerates in newer games. Can't wait till I get my 8800GT :D
 
The only reason you'd want a higher frame rate in gaming is to have a performance buffer for high-intensity action scenes with more activity on screen; that overhead keeps you from dropping into low-fps lag or stuttering. Other than that, around 30 frames per second at a constant rate is the max the human eye distinguishes as smooth playback.
 
I think he's talking about people who don't suck at Multiplayer.

I was really good at COD4, usually top 3 on the server, and I never had above 30fps ;x usually low 20s. People can adjust to it. I'm going to have to un-adjust once my new PC gets here on Friday :D
 
..... Other than that, around 30 frames per second at a constant rate is the max the human eye distinguishes as smooth playback.


That is completely wrong. This incorrect belief is exactly what this whole thread is about; read the link in the original post.
 
That's great. The link looks like someone's blog or something, lol. There's no proof or citation of sources. There's a reason standards were developed and why they're still in use today. The differences are so negligible that you can't even see the extra frames of video to distinguish between higher frame rates; it becomes redundant above 30fps for realtime video playback. Try encoding video of a panning scene without frame doubling... do everything from 10fps on up until you can't see a difference. The cutoff point is around 30fps.
 
I can certainly tell the difference between gaming at 30fps and 60fps. Anything above 60 is noticeably more fluid and seems 'faster'. That's not to say I mind playing at 30, 45, or 60fps; I'm just saying.
 
It has to do with monitor refresh rate and hardware supplying the signal.
 
That's great. The link looks like someone's blog or something, lol. There's no proof or citation of sources. ... The cutoff point is around 30fps.



We are in the Videocard forum, not the "Home Theater" or "Movies" forum. The original references to frames per second are about videocards and gaming. If you want to discuss frames per second in terms of film/multimedia, that belongs somewhere else. The bottom line (pertaining to videocards/gaming) is that the human eye can distinguish MUCH more than 30 frames per second, regardless of whether you feel 30fps is smooth for you or not.
 
That's great. The link looks like someone's blog or something, lol. There's no proof or citation of sources. ... The cutoff point is around 30fps.
For filmed video, sure, 30fps is probably where most people perceive smoothness. That's great, but does it have much relevance to video games?

Any motion in filmed video is naturally blurred by the amount of time the film is exposed to light: almost all of the light information for that "frame" of time is captured on the film. With video games, each frame is a precise picture of the instant at the start of the frame, rather than encompassing the frame's whole duration. Without that natural blurring, our eyes can much more easily perceive the differences between frames, and 30fps is no longer enough for fluidity.

Games like Crysis attempt motion-blur post-processing not just because it's neat-o, but because it helps preserve fluidity of motion in a game that we all know tends towards low framerates ;). Of course, this doesn't eliminate the noticeable difference in responsiveness at low framerates, but it's a step in the right direction IMO.
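To make the shutter-vs-snapshot difference concrete, here's a toy Python sketch (purely illustrative, not how Crysis or any real renderer does it): a game frame samples the scene at one instant, while a camera effectively integrates it over the whole exposure; averaging several sub-frame samples is a crude stand-in for that.

    FPS = 30
    SPEED = 600.0  # object speed in pixels per second

    def instant_sample(frame):
        # One precise position at the start of the frame (typical game frame).
        return SPEED * (frame / FPS)

    def shutter_sample(frame, subsamples=8):
        # Average positions across the frame's duration, crudely mimicking what
        # an open film shutter integrates; a renderer would blend these samples
        # into a smeared image rather than a single number.
        t0 = frame / FPS
        dt = (1.0 / FPS) / subsamples
        return sum(SPEED * (t0 + i * dt) for i in range(subsamples)) / subsamples

    for frame in range(3):
        print(frame, instant_sample(frame), round(shutter_sample(frame), 1))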
 
The thread title is "human eye and frames per second"; it isn't exclusively about video gaming. When you're dealing with PC I/O, there are many variables that come into play: monitor refresh rate and response time, the graphics card's ability to output a synchronized signal, etc.
 
The thread title is "human eye and frames per second"; it isn't exclusively about video gaming. ...
Yes, of course. The point I was trying to make is that this magical maximum framerate the eye can perceive is extremely dependent on the source; it is not a fixed value.
 
30 and 60 is obvious

60 and 85 I can tell.
First by Crysis firing animations, and then running my desktop at 60Hz makes my eyes hurt.

I'm spoiled on 85Hz :(
 
...running my desktop at 60Hz makes my eyes hurt. I'm spoiled on 85Hz :(


Yep, 60Hz hurts my eyes too.

I can definitely tell the difference over 30fps; everyone can. It's just a matter of how the monitor's refresh rate displays the frames per second for the human eye to decode. I'm not going to read the article, but I can definitely see over 30fps.

It's kind of like when I watch those new 1080p TVs that use the "120Hz technology" with Blu-ray discs... I know the movies were filmed at 24fps, but with the 120Hz processing added they look like they're in a strange fast-forward... It drives my mind NUTS, because the picture is fantastic, but any motion is so, so weird.
 
30 fps is so obviously choppy, but it's a playable choppy, just like movies are a watchable choppy.

60+ fps is butter smooth. Objects move from one pixel to the next with no skipping of pixels involved. Well, maybe there's some skipping if you're playing a twitch game where the background moves really fast, but for your average game, 60+ fps is as smooth as it gets. :D
 
Although my G92 GTS can destroy most games out there, I still run with v-sync on and play capped at 60. It just feels much better, imho. When games' fps go crazy into the hundreds, the screen seems a bit 'choppy' at times; I think it's called tearing. Am I wrong? I play on an 8ms ViewSonic VX2025...
 
Motion blur will fool the human brain into thinking a transition was smooth even if it wasn't. If a game had perfect motion blur, you could probably play it at something like 24fps just fine. Since most don't, and since trying to fix a poor framerate with it ends up lowering the framerate even further, to the point where the blur doesn't help, it's not much use in games.
 
When games' fps go crazy into the hundreds, the screen seems a bit 'choppy' at times; I think it's called tearing. Am I wrong? ...

That's what v-sync is for. Tearing happens when your monitor hasn't finished displaying one frame and the next one is already being sent to it.
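A toy way to picture it (illustrative Python only): the display scans rows top to bottom, and if the buffer swap lands mid-scanout, the rows above the swap point come from the old frame and the rows below from the new one.

    # Toy scanout model: swapping the frame buffer mid-scan produces a tear line.
    ROWS = 10

    def scanout(old_frame, new_frame, swap_at_row):
        return [old_frame if row < swap_at_row else new_frame for row in range(ROWS)]

    print(scanout("A", "B", swap_at_row=4))
    # ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B', 'B', 'B']  <- tear between rows 3 and 4
    # With v-sync the swap waits for scanout to finish, so every row is from one frame.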
 
I believe the reason most people notice the difference between 60 and 80fps is that their framerate is constantly jumping between the two; of course you'll notice that. However, if you cap your fps and stay at that cap, it's nigh on impossible to tell the difference. For example, with my FPS capped, I can't tell the difference between 40fps and 60fps, but once it starts jumping around, I definitely notice. I think it's because your eyes get used to a certain FPS, and when it jumps around your eyes have to adjust.

Note: I am not saying that you can't see above 30fps, just that it is very hard to tell the difference beyond that.
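For what it's worth, a bare-bones frame cap is just a sleep at the end of each frame; a minimal Python sketch (ignoring timer jitter and v-sync, with the render step left as a placeholder):

    import time

    TARGET_FPS = 60
    FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds available per frame

    def run_capped(num_frames=300):
        for _ in range(num_frames):
            start = time.perf_counter()
            # ... render and update the game here ...
            elapsed = time.perf_counter() - start
            if elapsed < FRAME_BUDGET:
                time.sleep(FRAME_BUDGET - elapsed)  # wait out the rest of the frame budget

    run_capped()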
 
Sony also believes it's over 200fps (around 240fps):
http://www.fe-tech.co.jp/en/pdf/FET_PressRelease_E_20070803.pdf

and we had the same discussion as part of this thread:
http://www.hardforum.com/showthread.php?t=1242258

12-15fps = motion
24fps = old cinema projection that has not been abolished yet (about twice what is perceived as motion rather than a slideshow)
25 and 30fps (50 and 60Hz fields, PAL/NTSC) = based on the frequency of the local electrical system; it has almost nothing to do with research on the perception of movement.
85 to 120Hz/fps = the point where many people's conscious perception stops noticing flicker (although the eyes/brain might still register it).

People in medieval times considered a few candles to be great light for reading.
In the same way, we are used to 24/30/60fps, although it should really be 240.
 
25 and 30fps (50 and 60Hz fields, PAL/NTSC) = based on the frequency of the local electrical system; it has almost nothing to do with research on the perception of movement.

Although this changed with the introduction of IC-based television sets... I guess it costs too much to change what's already in place, huh...

But yeah, you're definitely right... 240fps might be pushing it, but it's quite plausible to feel over 100, or even 200, "frames" a second. Fully processing the information in those frames, however, I don't think is possible.
 