VSync handling and Doom 3

caesardog

Weaksauce
Joined
Aug 2, 2004
Messages
68
I'm noticing slight tearing when I have VSync off. What I see is, when looking at a corner of a wall or door and then looking right/left (pretty slowly), I notice a disjoint in the vertical structure of the wall or door.

This does not happen when VSync is ON. However, in running timedemos, there is definitely a 7 to 10 fps hit from enabling VSync. So I prefer to run with VSync off.

But isn't there a way to avoid tearing without enabling VSync? Do I need to set my monitor refresh rate to 60 Hz? Note that I have it set at 85 Hz (using reforce.exe). I tried 100 Hz and saw the same thing. I have a CRT monitor.

I play at 1024 x 768 (high quality). I have a GeForce FX 5900 XT. I notice the tearing whether at 800 x 600 or 1024 x 768.

So what is the best way to avoid this without enabling VSync -- or is that impossible?

I thought that if your monitor refresh is set ABOVE the max fps (which is 60), you wouldn't see tearing. But I do, and my monitor is set at 85 Hz (well above 60).

Anyone have any ideas?
 
VSync is the only way to do it. Most people think VSync arbitrarily caps the framerate, but in actuality the backbuffer is used to store the frame and delay it until the next refresh cycle comes up, instead of just spitting frames out as soon as they're rendered. Anyhow, the performance hit comes from the fact that if the machine can't sustain a framerate above the refresh rate, the framerate drops to an even fraction of the refresh rate (i.e. 85 Hz can drop to 42.5 fps or 28.3 fps). The only way to alleviate this is triple buffering. This introduces a third buffer so that a sort of "buffer underrun" doesn't occur the way it does with double buffering. There is one catch, however. Due to the nature of buffering, there is added latency that may feel like input lag, and accuracy with the mouse may go down, but most people don't notice it. If you have an ATI card, you can easily enable it by going to the 3D tab, selecting OpenGL, and hitting the Compatibility button, where you will see an option to enable TB. With an nVidia card I am not sure; it might require a third-party utility such as RivaTuner. Happy tweakin' :cool:
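To put rough numbers on that: here's a quick Python sketch of the simple model described above (it assumes a finished frame always waits for the next vertical blank and that triple buffering is off -- the specific numbers are just illustrative):

```python
# Toy model of double-buffered vsync: a finished frame waits for the
# next vertical blank, so the effective rate snaps to refresh/1,
# refresh/2, refresh/3, and so on.
import math

def vsync_fps(render_fps, refresh_hz):
    if render_fps >= refresh_hz:
        return refresh_hz
    # A frame spans this many refresh intervals (rounded up), so a new
    # frame is flipped only every n-th tick -> refresh/n fps.
    n = math.ceil(refresh_hz / render_fps)
    return refresh_hz / n

print(vsync_fps(50, 85))            # 42.5  (85/2)
print(round(vsync_fps(40, 85), 1))  # 28.3  (85/3)
print(vsync_fps(59, 60))            # 30.0  (60/2)
```

So even a card that could manage 59 fps on its own gets pinned to 30 at a 60 Hz refresh.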
 
I can't say for sure whether there's a solution other than vsync, but how important are those FPS that you're losing? I noticed the same problem and vsync fixed it. I haven't run any timedemos (I'm not really all that interested in numbers, I just need the game to feel smooth and look good) so I don't know what kind of a performance hit I'm taking, but whatever it is I don't notice it while playing.

I guess I should also note that I have a 6800GT, so that might explain why there's no noticeable difference between with and without vsync.
 
You are going to need vsync. I think what you are seeing is just the way timedemos work. They are not exactly like gameplay, because a static number of frames are rendered as fast as they can be. No frames are skipped, nor is movement slowed down if the card can keep up. There could be brief moments where you even go beyond the cap of the game. Notice that one part in the hall: after defeating a monster, there is even a second or two where our hero flies up the corridor because there is no lighting etc. I think in gameplay you'd have a hard time breaking the 40 fps barrier at all, and vsync will sync you up. Something like this, maybe -- I'm not sure. But vsync should not hurt framerate.
 
PliotronX said:
VSync is the only way to do it. Most people think VSync arbitrarily caps the framerate, but in actuality the backbuffer is used to store the frame and delay it until the next refresh cycle comes up, instead of just spitting frames out as soon as they're rendered. Anyhow, the performance hit comes from the fact that if the machine can't sustain a framerate above the refresh rate, the framerate drops to an even fraction of the refresh rate (i.e. 85 Hz can drop to 42.5 fps or 28.3 fps). The only way to alleviate this is triple buffering. This introduces a third buffer so that a sort of "buffer underrun" doesn't occur the way it does with double buffering. There is one catch, however. Due to the nature of buffering, there is added latency that may feel like input lag, and accuracy with the mouse may go down, but most people don't notice it. If you have an ATI card, you can easily enable it by going to the 3D tab, selecting OpenGL, and hitting the Compatibility button, where you will see an option to enable TB. With an nVidia card I am not sure; it might require a third-party utility such as RivaTuner. Happy tweakin' :cool:

So how come with 100 hz on my monitor, I see pretty much the exact same performance as with 85 hz? Since the division of 100 hz should take it to 50 fps or 33.3 fps, shouldn't I be better off a higher monitor refresh rate (assuming I leave vsynch off)?

I do have an Nvidia card -- is there a way to enable triple buffering via the drivers (anyone know)?
 
Dijonase said:
I can't say for sure whether there's a solution other than vsync, but how important are those FPS that you're losing? I noticed the same problem and vsync fixed it. I haven't run any timedemos (I'm not really all that interested in numbers, I just need the game to feel smooth and look good) so I don't know what kind of a performance hit I'm taking, but whatever it is I don't notice it while playing.

I guess I should also note that I have a 6800GT, so that might explain why there's no noticeable difference between with and without vsync.

Well, in timedemo demo1 I go from 38.5 to 30 fps (when enabling VSync) at 1024 x 768 (high quality). I like to have a little more of a buffer above 30 fps, to have enough in reserve when a lot of action starts happening.
 
texuspete00 said:
You are going to need vsync. I think what you are seeing is just the way timedemos work. They are not exactly like gameplay, because a static number of frames are rendered as fast as they can be. No frames are skipped, nor is movement slowed down if the card can keep up. There could be brief moments where you even go beyond the cap of the game. Notice that one part in the hall: after defeating a monster, there is even a second or two where our hero flies up the corridor because there is no lighting etc. I think in gameplay you'd have a hard time breaking the 40 fps barrier at all, and vsync will sync you up. Something like this, maybe -- I'm not sure. But vsync should not hurt framerate.

VSync definitely does hurt framerate in gameplay -- not just timedemos. I enabled showfps in gameplay with VSync ON and then VSync OFF (you can do it on the fly via the menu -- no restart necessary). Standing in the same spot and looking at the same place, I saw a significant framerate hit.

From the 50s to the 40s.

In gameplay I'm in the 50s and 60s a lot, but at times it goes down below 30 (depending on what's happening).
 
caesardog said:
Well, in timedemo demo1 I go from 38.5 to 30 fps (when enabling VSync) at 1024 x 768 (high quality). I like to have a little more of a buffer above 30 fps, to have enough in reserve when a lot of action starts happening.

OK, in that case it does matter. I'm not sure what else to tell you. The only solution I know of is vsync. If there's another possible solution I don't know about it.
 
VSync turned on will NOT affect gameplay, only timedemos. The example of going from 38 to 30 is a great one, so let's use it as the baseline.

Assuming vsync is ON and refresh is at 60 Hz, your machine might be hitting 60 fps every once in a while. But for the most part it will be sitting at 30 fps, chugging right along, even if it could be displaying 38 fps (plus or minus). The only thing you notice is a lack of tearing on the screen. The tearing I notice comes from the frame rate jumping around (from 60 fps to 39 to 43, etc.). Keeping it at an even fraction of your refresh rate definitely helps the appearance.

If you turn VSync off, you get the max framerate. So if it could be displaying 38 fps, you'd see it. But since the framerate would be bouncing from a smooth 60 fps down to 38, you start noticing tearing. Not good, and much worse than losing those precious 8 fps.

As for dropping below 30 fps: vsync on or off, it would still drop. The real difference is that while you are playing, you won't notice the fluctuating framerate as much. So if you are bothered by the tearing (and I sure as hell was), turn VSync on. Your timedemo will suffer because vsync effectively caps the framerate at 30 fps or 60 fps, but you'll get a much smoother experience. If you want to whip your timedemo score out and impress your friends and neighbors, turn the sucker off.
 
I just reread that buffer statement about having enough reserve framerate. That statement makes zero sense. If you encounter a scene with enough action that it can only render at 20 fps on your machine, it wouldn't matter if you had been running at 1000 fps. The machine would be running at 20 fps; you can't "save" frames for later use or anything.

So, assuming you enter a crazy scene at 38 fps, the framerate drops to 20 fps. Enter it at 30 fps, and the framerate still drops to 20 fps.
 
enkafan said:
I just reread that buffer statement about having enough reserve framerate. That statement makes zero sense. If you encounter a scene with enough action that it can only render at 20 fps on your machine, it wouldn't matter if you had been running at 1000 fps. The machine would be running at 20 fps; you can't "save" frames for later use or anything.

So, assuming you enter a crazy scene at 38 fps, the framerate drops to 20 fps. Enter it at 30 fps, and the framerate still drops to 20 fps.

Actually, with VSync on it drops to an even fraction of your refresh rate. For example, if you are running a 60 Hz refresh rate and your frames per second (fps) drops from 60 to 59, your frame rate actually gets set to half your refresh rate, i.e. 30 fps. Now if your actual rate drops below 30 fps during intense fighting, it gets halved again to 15 fps, to keep in sync with the monitor. This all assumes that triple buffering is not being used; triple buffering gets around this problem. From what I have seen and read, there is no current way to enable triple buffering in the 6x.xx nVidia drivers (using RivaTuner or anything) -- nVidia has removed the option from the driver. So if you are going to use VSYNC, choose your highest refresh rate so that the fractions are higher, e.g. half of 75 Hz is 37.5 fps.

Hope that helps

G
 
enkafan said:
VSync turned on will NOT affect gameplay, only timedemos.

I hear what you are saying -- but as an experiment I turned VSync both ON and OFF in a map, while standing in the same place.

The framerate (as shown by com_showfps 1) dropped by about 10 fps with VSync on (looking in the same direction from the same spot on the map). So it appears that it does affect gameplay as well.
 
enkafan said:
I just reread that buffer statement about having enough reserve framerate. That statement makes zero sense. If you encounter a scene with enough action that it can only render at 20 fps on your machine, it wouldn't matter if you had been running at 1000 fps. The machine would be running at 20 fps; you can't "save" frames for later use or anything.

So, assuming you enter a crazy scene at 38 fps, the framerate drops to 20 fps. Enter it at 30 fps, and the framerate still drops to 20 fps.

What I meant was that by starting out at a higher framerate, the drop won't be to as low a number. In other words, if a lot of action causes a certain percentage framerate drop: if I'm at 50, maybe I'll go down to 35. But if I'm already at 35, it will go down into the 20s.

In other words, I'll have a higher average framerate with VSync OFF, so I won't drop quite as low in heavy-duty scenes. I'll still drop, but maybe not to where it is really bad (like the low 20s).
 
gthompson20 said:
Actually, with VSync on it drops to an even fraction of your refresh rate. For example, if you are running a 60 Hz refresh rate and your frames per second (fps) drops from 60 to 59, your frame rate actually gets set to half your refresh rate, i.e. 30 fps. Now if your actual rate drops below 30 fps during intense fighting, it gets halved again to 15 fps, to keep in sync with the monitor. This all assumes that triple buffering is not being used; triple buffering gets around this problem. From what I have seen and read, there is no current way to enable triple buffering in the 6x.xx nVidia drivers (using RivaTuner or anything) -- nVidia has removed the option from the driver. So if you are going to use VSYNC, choose your highest refresh rate so that the fractions are higher, e.g. half of 75 Hz is 37.5 fps.

Hope that helps

G

So I should use 100 hz as opposed to 85 hz? My monitor can only go to 100 hz at 1024 x 768.
 
vsync with newer drivers makes it so that it's either 60 fps or 30 fps. If it renders ANYTHING less than 60, it gets forced to 30. If it's going ANYTHING higher than 60, it gets dropped to 60. Anything less than 30 gets displayed when it can. So yes, your average framerate WILL go down with vsync and newer drivers, but it's NOT rendering scenes differently or slowing them down AT ALL.
 
kronchev said:
vsync with newer drivers makes it so that it's either 60 fps or 30 fps. If it renders ANYTHING less than 60, it gets forced to 30. If it's going ANYTHING higher than 60, it gets dropped to 60. Anything less than 30 gets displayed when it can. So yes, your average framerate WILL go down with vsync and newer drivers, but it's NOT rendering scenes differently or slowing them down AT ALL.

So then are you saying that it would make no difference if my monitor refresh rate was changed from 85 Hz to 100 Hz?

Also, why would id have VSync set to OFF by default if it made no difference in gaming performance, only in timedemos?

Bottom line -- what are your recommended settings for playing Doom 3, for the following:

1. VSync -- ON or OFF?
2. Monitor refresh rate -- highest possible, or doesn't really matter?
 
caesardog said:
So then are you saying that it would make no difference if my monitor refresh rate was changed from 85 Hz to 100 Hz?

Also, why would id have VSync set to OFF by default if it made no difference in gaming performance, only in timedemos?

Bottom line -- what are your recommended settings for playing Doom 3, for the following:

1. VSync -- ON or OFF?
2. Monitor refresh rate -- highest possible, or doesn't really matter?

A) No, it wouldn't make a single bit of difference, vsync or not. The game internally runs at 60 frames a second for its calculations and all.

B) Because some people don't like it on. As you've seen, it sometimes makes it 30 or 60 fps with nothing in between.

1) If the tearing bothers you, ON. I have it ON.

2) Doesn't matter for Doom 3 at all; however, in Windows and in games where there isn't a cap, you want it as high as possible.
 
caesardog said:
So then are you saying that it would make no difference if my monitor refresh rate was changed from 85 Hz to 100 Hz?

Also, why would id have VSync set to OFF by default if it made no difference in gaming performance, only in timedemos?

Bottom line -- what are your recommended settings for playing Doom 3, for the following:

1. VSync -- ON or OFF?
2. Monitor refresh rate -- highest possible, or doesn't really matter?
Do me a favor... make sure VSYNC is on and your refresh rate is 60 Hz. Go into Doom 3, then go into the console and type "com_showfps 1". Walk around in the game for a bit -- notice how it's either 60 or 30. Now exit the game, force your preferred gaming resolution's refresh rate to something higher, go into the game, and do the same as above. Notice anything different?

G
 
caesardog said:
What I meant was that by starting out at a higher framerate, the drop won't be to as low a number. In other words, if a lot of action causes a certain percentage framerate drop: if I'm at 50, maybe I'll go down to 35. But if I'm already at 35, it will go down into the 20s.

In other words, I'll have a higher average framerate with VSync OFF, so I won't drop quite as low in heavy-duty scenes. I'll still drop, but maybe not to where it is really bad (like the low 20s).
the problem is that action won't cause a percentage drop. A scene simply runs at a set fps based on the number of triangles and light sources (and probably a lot of other fancy stuff that's beyond me). So if you wander into a room that displays at 20 fps after walking through a hallway at 60 fps, it would be no different if the hallway had been running at 30 fps.

As for the other issue, if the framerate drops below 30 fps, that sucks balls. It would drop to 15 fps (or would it be 20?). The question then becomes: would it be better to run at 85 Hz, where at best you'd get 85 (unlikely in my case) but 42.5 would be very common? Then when it drops to something like 29 internally, it would "only" be fixed at 21.25 by vsync instead of 15. Likewise I could bump it up to 100 Hz; then I'd have 100/50/25 to work with.

That might be worth investigating. Of course, I'm stuck in Cincy on a machine running @ 800 MHz with an ATI Rage XL video card. So I'll leave it up to the readers at home to work that out.
 
gthompson20 said:
Do me a favor... make sure VSYNC is on and your refresh rate is 60 Hz. Go into Doom 3, then go into the console and type "com_showfps 1". Walk around in the game for a bit -- notice how it's either 60 or 30. Now exit the game, force your preferred gaming resolution's refresh rate to something higher, go into the game, and do the same as above. Notice anything different?

G

I already did the 2nd part, and it was either 60 or in the 40s (but I didn't move around much to see if it would go lower than the 40s). So it looks like a higher monitor refresh rate may be better -- compared to the 60s and 50s without VSync on.

So you are saying if I want 60 and 30, set the monitor to 60 Hz. If I want 60s and HIGHER than 30, set a higher monitor refresh rate?
 
gthompson20 said:
Do me a favor... make sure VSYNC is on and your refresh rate is 60 Hz. Go into Doom 3, then go into the console and type "com_showfps 1". Walk around in the game for a bit -- notice how it's either 60 or 30. Now exit the game, force your preferred gaming resolution's refresh rate to something higher, go into the game, and do the same as above. Notice anything different?

G

You and kronchev appear to be giving conflicting advice on the monitor refresh rate.

kronchev is saying it won't make a difference whether your monitor refresh is 60, 85, or 100 -- you will only see 60 or 30 in Doom 3.

You seem to be saying that with a higher monitor refresh rate you will see 60 and HIGHER than 30. So who is right? I think you are!
 
caesardog said:
You and kronchev appear to be giving conflicting advice on the monitor refresh rate.

kronchev is saying it won't make a difference whether your monitor refresh is 60, 85, or 100 -- you will only see 60 or 30 in Doom 3.

You seem to be saying that with a higher monitor refresh rate you will see 60 and HIGHER than 30. So who is right? I think you are!

no, he's not.

the game renders at 60 fps, YOU CANNOT CHANGE THAT. id forces it to 60 FOR A REASON. the reason it's either 30 or 60 is that vsync says "only display completed frames" -- otherwise, if frames are drawn faster than they can be put on the screen, you'll get half of one frame, then -- oh no, the other one is ready -- so it puts the other half on the bottom. that's what tearing is. 30 fps works because it puts up every OTHER rendered frame, giving the rendering time to complete each frame.
 
kronchev said:
no, he's not.

the game renders at 60 fps, YOU CANNOT CHANGE THAT. id forces it to 60 FOR A REASON. the reason it's either 30 or 60 is that vsync says "only display completed frames" -- otherwise, if frames are drawn faster than they can be put on the screen, you'll get half of one frame, then -- oh no, the other one is ready -- so it puts the other half on the bottom. that's what tearing is. 30 fps works because it puts up every OTHER rendered frame, giving the rendering time to complete each frame.

But doesn't your analysis assume the monitor refresh rate is 60 Hz? What if it is higher?

Also, I did see fps in the 40s while looking around on a map (via com_showfps 1) when I enabled VSync (the monitor refresh at the time was 85).

If what you say is true, wouldn't I only have seen either 30 or 60 when looking around the map?
 
caesardog said:
I already did the 2nd part, and it was either 60 or in the 40s (but I didn't move around much to see if it would go lower than the 40s). So it looks like a higher monitor refresh rate may be better -- compared to the 60s and 50s without VSync on.

So you are saying if I want 60 and 30, set the monitor to 60 Hz. If I want 60s and HIGHER than 30, set a higher monitor refresh rate?
Exactly ;o)
 
caesardog said:
But doesn't your analysis assume the monitor refresh rate is 60 Hz? What if it is higher?

Also, I did see fps in the 40s while looking around on a map (via com_showfps 1) when I enabled VSync (the monitor refresh at the time was 85).

If what you say is true, wouldn't I only have seen either 30 or 60 when looking around the map?


again, it really depends on the drivers. with the old Omegas I don't get it, with the 4.9 Cats I do. maybe it's a setting somewhere, I don't know.

doom3 forces it to 60 for a reason; you have no reason to push it any higher.
 
kronchev said:
again, it really depends on the drivers. with the old Omegas I don't get it, with the 4.9 Cats I do. maybe it's a setting somewhere, I don't know.

doom3 forces it to 60 for a reason; you have no reason to push it any higher.

I have an nVidia card -- so it may work differently.

I know I saw higher than 30 fps with VSync enabled (monitor at 85 Hz).

When I get home, I'm gonna try 100 Hz for my monitor AND VSync on.

I'm not trying to get the game faster than 60 fps. I'm trying to stop it from going too LOW from halving the monitor refresh rate -- since I can't get to 60 on many occasions (5900 XT). So if my monitor is at 100, the lowest will hopefully be 50 instead of 30 -- or 25 if it can't reach 50, instead of 15.

One of these days I'll stop tweaking and start JUST playing the game :p I'm still in Alpha Labs (3).
 
kronchev said:
again, it really depends on the drivers. with the old Omegas I don't get it, with the 4.9 Cats I do. maybe it's a setting somewhere, I don't know.

doom3 forces it to 60 for a reason; you have no reason to push it any higher.
That may be the issue right there... you are using ATI; I am speaking on behalf of nVidia cards. ATI has triple buffering, nVidia does not. I suggest you use triple buffering with VSYNC on -- best of both worlds! I really miss it since I went from my 9800 Pro to my 6800GT.

G
 
gthompson20 said:
That may be the issue right there... you are using ATI; I am speaking on behalf of nVidia cards. ATI has triple buffering, nVidia does not. I suggest you use triple buffering with VSYNC on -- best of both worlds! I really miss it since I went from my 9800 Pro to my 6800GT.

G

That was probably the difference between the drivers -- good idea :)
 
A general description of various possibilities (I know I'm mostly right, but I may be off on some details):

V-Sync off:
--Monitor uses data from a buffer that is also being used to render to.
--There is no waiting at all.
--Unless rendering somehow manages to sync with the refresh on its own, there will be tearing as the monitor is refreshed with data from different frames.
--If the rendering rate is high enough, more than one tear can occur during a refresh.
--The lag between a pixel being rendered and being displayed on the screen is determined only by how long it takes the monitor to get there in the refresh process (the worst amount of time is 1 / refresh_rate).

Double buffering:
--Monitor uses data from the current buffer to refresh the screen.
--Next frame is being rendered to back buffer.
--After the refresh is complete, if the back buffer is completed, flip buffers, else reuse the current buffer for the next refresh and continue working on rendering the next frame.
--When a frame is fully rendered in the back buffer, no further rendering can be done until the buffers are flipped after the current refresh... this introduces major waiting.
--Since only full frames are flipped, there is (on average) more (and never less) lag between when a pixel is rendered and when it is displayed.

Triple buffering:
--Everything's pretty much the same as with double buffering, except:
--Buffers are flipped in a chain.
--When a frame is fully rendered to the buffer behind the front buffer, the next frame can start being rendered to the next buffer without waiting.
--Major waiting only happens if the last buffer is also finished before the buffers are flipped, which means you rendered the last frame faster than the refresh rate, so waiting shouldn't do you much damage; but it does introduce even more lag (on average) between when a pixel is rendered and when it shows on the screen. For the price of the additional lag, you get more frames rendered than with double buffering.
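If it helps, here's a toy Python simulation of the two schemes above. The numbers are made up for illustration: a 60 Hz refresh and a GPU that takes 25 ms per frame (i.e. a raw 40 fps), and the model is deliberately simplified.

```python
import math

REFRESH_HZ, FRAME_MS, SIM_MS = 60.0, 25.0, 10_000.0
TICK = 1000.0 / REFRESH_HZ   # ms between vertical blanks

def double_buffer_fps():
    # GPU renders one frame, then stalls until the vblank that flips it.
    t, frames = 0.0, 0
    while t + FRAME_MS <= SIM_MS:
        done = t + FRAME_MS
        t = math.ceil(done / TICK) * TICK   # wait for the next refresh
        frames += 1
    return frames / (SIM_MS / 1000.0)

def triple_buffer_fps():
    # GPU never stalls; each vblank displays the newest finished frame.
    shown, last = 0, 0
    for k in range(1, int(SIM_MS / TICK) + 1):
        finished = int(k * TICK / FRAME_MS)   # frames done by this vblank
        if finished > last:                   # a fresh frame to show
            shown, last = shown + 1, finished
    return shown / (SIM_MS / 1000.0)

print(double_buffer_fps())  # 30.0 -- snaps to 60/2
print(triple_buffer_fps())  # ~40  -- close to the GPU's raw rate
```

Same GPU, same scene: double buffering pins you at 30 fps, while triple buffering gets you almost all of the 40 fps the card can render, at the cost of the extra latency described above.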
 
gthompson20 said:
Exactly ;o)

Okay, I did extensive Dooming with VSync on yesterday, with my monitor set to a refresh rate of 100 Hz. This refresh rate is forced for all applications via reforce.exe (a great tool if you don't have it).

I noticed that if my hardware couldn't handle 60 fps, it would go to 50 fps. If it couldn't deal with 50 fps, it then dropped to 33 fps. If it couldn't handle that, it went to 25 (I never saw it drop below 25; mostly it was 50 or 33).

So the way it works is: it tries the full 60 fps, then it syncs to 1/2 your monitor refresh rate, then 1/3, then 1/4. I assume that if I dropped below 25, the next step would be 20 fps -- 1/5 of my monitor's refresh rate.

Based on this, I think the absolute BEST monitor refresh rate for Doom has to be 100 Hz. If you set it to 120 or higher, the math doesn't work out as well when you aren't pushing 60 fps: at 120 Hz it would skip the 1/2 step (if it can't handle 60, it can't handle half of 120 either) and drop right to 1/3 (40 fps). At 100 Hz, at least you get the 50 fps possibility.

So for those with VSync on, you should use a monitor refresh rate of 100 Hz (no higher and no lower), assuming your monitor can handle it at the resolution you play at.
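For anyone who wants to check the arithmetic, here's a quick Python script that lists the vsync "steps" available at each refresh rate, assuming the pattern observed above: a 60 fps game cap, then fallback to refresh/2, refresh/3, and so on (`vsync_steps` is just a made-up helper name, and the 15 fps floor is an arbitrary cutoff):

```python
# List the framerates vsync can settle on at a given refresh rate,
# assuming a 60 fps cap and fallback through refresh/n (n = 2, 3, ...).
def vsync_steps(refresh_hz, cap=60, floor=15):
    n = 1
    while refresh_hz / n > cap:   # skip fractions above the game's cap
        n += 1
    steps = []
    while refresh_hz / n >= floor:
        steps.append(round(refresh_hz / n, 1))
        n += 1
    return steps

for hz in (60, 85, 100, 120):
    print(hz, vsync_steps(hz))
# 60  [60.0, 30.0, 20.0, 15.0]
# 85  [42.5, 28.3, 21.2, 17.0]
# 100 [50.0, 33.3, 25.0, 20.0, 16.7]
# 120 [60.0, 40.0, 30.0, 24.0, 20.0, 17.1, 15.0]
```

Note that 120 Hz does keep the 60 step (120/2), but if the card can't reach 60 it falls straight to 40, whereas 100 Hz offers the 50 fps rung in between.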
 
didn't read the whole thing -- you guys need to go find out what vsync is, cause some of you... well, never mind. But you can try turning on r_clear and r_finish; I'm sure one of those will fix tearing without vsync's crazy fps-limiting system.
 
that's some great info. when I get home on Friday I'll play with this. Anyone have similar results? I'm pretty sure I could be at 50 fps a lot of the time, and 33 is better than 30, and 25 is better than 20 (which I assume is what you'd get at 60 Hz: 60/30/20).
 