VSync / Tearing

TotalLamer (Gawd) · Joined: Feb 21, 2009 · Messages: 761
So with the setup in my sig, I am pretty much forced to turn V-Sync on in most games... WoW, Crysis to a lesser extent... but especially in Oblivion. Without V-Sync, the screen tearing I experience in Oblivion is AWFUL. What exactly causes screen tearing? Is it something that is based on the monitor you have, or... ?
 
LCDs are stuck at a 60Hz refresh (60 screen updates per second), but when the game's FPS is higher than that, the LCD can't keep up and you get tearing. That's it in a nutshell; someone else can probably give a more detailed explanation of how the tearing happens technically. Sorry, this can't be fixed other than by using VSync, or, if you don't like the side effects of VSync, by getting one of those just-released 120Hz LCD monitors. (I doubt many of your games can exceed 120fps anyway.)
 
If this is just what happens when you exceed the monitor's refresh rate, I assume most others must be having the same tearing issues... how can you stand it? Oblivion is practically unplayable until I turn on V-Sync.

As far as going over 120 FPS... I've seen over 250 from time to time in WoW when in the middle of nowhere looking out towards the edge of the continent, haha.
 
When I used an LCD, Vsync was required much more often for gaming.
Atm I am using a CRT and a Plasma TV and occasionally need Vsync.

Just the way of things with LCDs (in my experience) although some seem to exhibit the problem more than others.
 
Yeah, just seems weird to me... I notice most everyone says they game with V-Sync turned off.
 
Some people notice it less or are not as tuned into it.
Others will be forgiving of not using it if they get the smooth framerate they desire.
A few notice the extra lag from the buffering (triple buffering etc., which helps keep a high framerate while staying locked to the display frequency) and can't live with it, so they need VSync disabled.

I'm not overly fussy and will forsake some tearing for extra framerate if needed, but even with a CRT there are times when it's not pretty without VSync.
 
I will toss my 2 cents into this thread. I, too, must use VSync in all of my games now. Once I did this, I experienced a very slight "mouse lag"; it just didn't feel fluid to me. I discovered that I also had to cap the maximum FPS at 59 for my games. Somewhere in the config files you have to find "max framerate" or fps_max or something to limit the FPS to 59. Once I did this there was ZERO lag.
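(To give a concrete idea of what that looks like -- the exact name varies by game, so treat these as illustrations rather than gospel: in Source engine games like CSS you'd put

fps_max 59

in autoexec.cfg or type it into the console, and in WoW I believe the in-game equivalent is

/console maxfps 59

Other engines bury a similar setting somewhere in their .ini files.)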
 
I dunno about the whole "over 60fps = tearing" thing with LCDs that have a 60Hz refresh, because I have one that's 60Hz, and in games like CSS and the COD series I easily get over 60fps yet never see tearing with V-Sync disabled.
 
Oh, it's there.

If you can, try Mirror's Edge. For some reason that game has very bad tearing with V-Sync off.
 
I'll install it tomorrow.

Perhaps I'm just not a picky person.

Even on a CRT at 100Hz with VSync off, I could see the tearing in CSS.

When you are in the heat of the action, you tend not to notice it.

Isn't there a RivaTuner setting that tweaks the VSync to change the tearing? Has anyone goofed with that to see if it does anything on today's video cards and LCDs?

At work atm so I can't test.
 
It has been mentioned further up, but yes, turning VSync on along with triple buffering solves almost all tearing issues. It's available in the NVIDIA control panel for all recent drivers (like the past 2 years). ATI also has a triple buffer option for D3D.
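To picture why triple buffering helps, here's a toy model (my own sketch, nothing to do with how the actual drivers are written) of the classic 60/30/20fps quantization you get with plain double buffering + VSync, versus triple buffering:

import math

def displayed_fps(render_ms, refresh_hz=60, triple_buffered=False):
    # Toy model: render_ms is how long the card takes to draw one frame.
    refresh_ms = 1000.0 / refresh_hz
    if triple_buffered:
        # The renderer never waits for a buffer to free up; each refresh just
        # shows the newest finished frame, so you get the card's real framerate
        # (capped at the refresh rate).
        return min(1000.0 / render_ms, float(refresh_hz))
    # Double buffering + VSync: after finishing a frame the renderer sits idle
    # until the next vblank before starting the next one, so the framerate
    # snaps down to 60, 30, 20, 15...
    refreshes_per_frame = math.ceil(render_ms / refresh_ms)
    return refresh_hz / refreshes_per_frame

print(displayed_fps(render_ms=20))                        # -> 30.0 (stalls)
print(displayed_fps(render_ms=20, triple_buffered=True))  # -> 50.0

That difference is why people say triple buffering gives you VSync without the framerate penalty, although, as mentioned above, the extra buffered frame is exactly where the bit of added input lag comes from.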

Keep in mind that not all LCDs are alike. Certain panels present tearing worse than others; it depends on the amount of image processing the panel's chips do before you see the image, and how much delay that adds. On a standard 5ms HP w2408h, I don't notice tearing with VSync off most of the time; it's only sudden up-and-down view movements that trigger any sort of tearing at high FPS. But on the other hand, my older Viewsonic VX922 displays noticeable tearing in many cases.

I've also found tearing is highly dependent on the video card. Even the same model from different manufacturers can have different VSync characteristics. Take for example an old FX5600 I used to have: even on a CRT I could notice tearing like crazy at high refresh rates, but when I switched a while later to an FX5700 (practically the same chip, newer rev), the tearing was gone even though I wasn't getting a much higher framerate. I noticed this repeatedly as I switched video cards over the years on the same reliable CRT.
 
I can still see tearing even beyond 60fps on a 60Hz LCD monitor.

The higher the framerate, the less visible the tearlines become, because the image moves less between frames as the next frame splices into the display.

But tearing is still visible beyond 60fps, just diminished. At, say, 120fps on a 60Hz display there are up to two tearlines per frame, each about half as visible as the roughly one-tearline-per-frame you get at ~50-70fps on a 60Hz display. At 180fps on a 60Hz display with VSYNC OFF there are up to three tearlines per frame that are one-third as visible, because you're rendering 3 frames every refresh, so roughly one-third of each frame gets spliced onto the next. Please note that tearlines move around randomly and can move offscreen (above the top or below the bottom of the screen), so the number of tearlines per refresh can vary from frame to frame.
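Rough back-of-the-envelope version of the above (my own illustration, assuming the idealized case where every completed frame gets spliced in mid-scan and no tearline wanders offscreen):

def tearlines_per_refresh(fps, refresh_hz=60.0):
    # With VSYNC OFF, roughly fps/refresh new frames get spliced into each
    # refresh, so you see up to about that many tearlines, and each one is
    # proportionally less visible because the image changes less per slice.
    return fps / refresh_hz

for fps in (60, 120, 180):
    print(f"{fps}fps on a 60Hz panel: up to ~{tearlines_per_refresh(fps):.0f} tearline(s) per refresh")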

As a result, when I don't care about being 2 milliseconds faster than the next person in a deathmatch, I'll always play with VSYNC ON for single-player, provided my graphics card can render consistently at 60fps. (For example, a Radeon HD 4870 running Bioshock with everything turned on will usually hold a rock-solid 60fps from start to end at 1920x1200 -- but this won't work in games like Crysis.)

VSYNC OFF is often preferred by network gamers for a different reason: slightly reduced latency.

Although I don't deathmatch as much as I used to, I can attest that VSYNC OFF definitely gives a few milliseconds' advantage in deathmatches. At 60Hz, the *theoretical* best-case advantage is 16 milliseconds -- turning VSYNC off is tantamount to reducing your ping by 16 milliseconds, since 1/60th of a second is about 16ms. When both players are of similar skill, the reduced latency of VSYNC OFF makes a barely perceptible difference -- you can't really feel it, but it shows up as a few extra frags: when two snipers see each other at exactly the same time, the one whose graphics card delivered the image to the display 16 milliseconds earlier presses the fire button first and gets the kill.

Although the theoretical advantage is 16ms, in actual practice VSYNC OFF on a really good graphics card only gives you an average of about 8ms of player's advantage when your display is running at 60Hz. The 16ms best-case figure applies when the next frame finishes rendering JUST AFTER the previous frame starts being sent to the display. With VSYNC ON (as per the tweakguides article), it has to wait until the next refresh -- that is 1/60th of a second (16ms). But if VSYNC is OFF, the next frame can interrupt the previous frame that's already being displayed in top-to-bottom scanned fashion (the splice is where the tearline shows). The best-case scenario -- seeing your enemy a full 16ms sooner than the enemy sees you -- occurs when the tearline is near the top of the screen, because you see most of the next frame ahead of an enemy who has VSYNC ON (assuming all things are equal: both players have the exact same skill, equipment, lag, identical human reaction time, etc.). If each human has a reaction time of 200 milliseconds and both fire their guns exactly 200ms after they see the image on the monitor, the one that presses the fire button 8ms sooner (because of VSYNC OFF) is often the one that gets the frag, assuming the game isn't programmed to compensate for the advantage that VSYNC OFF gives. The worst-case scenario occurs when the tearline is at the bottom of the screen; then you don't have much of an advantage over the other gamer. As tearlines move up and down randomly, the average for a 60Hz display is about an 8ms gaming advantage, or the equivalent of reducing your ping by about 8ms. A good understanding of the tweakguides article (pretty accurate) is needed to understand the implications of the gaming advantage that turning VSYNC OFF gives to competition deathmatch players...
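For what it's worth, the numbers above boil down to this (my own back-of-the-envelope sketch, not from the tweakguides article or any benchmark):

refresh_hz = 60
frame_time_ms = 1000.0 / refresh_hz        # ~16.7ms per refresh at 60Hz

best_case_ms = frame_time_ms               # tearline right at the top of the screen
worst_case_ms = 0.0                        # tearline at the very bottom
average_ms = (best_case_ms + worst_case_ms) / 2

print(f"Best case advantage: ~{best_case_ms:.1f} ms")   # ~16.7 ms
print(f"Average advantage:   ~{average_ms:.1f} ms")     # ~8.3 ms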

NOW.... Assuming casual gaming (reaction time not critical), a good monitor, and a 3D card that never dips below 60fps in the particular game you're playing, I can attest VSYNC ON is usually preferred for casual single-player gaming -- it's very silky smooth this way. (You need a graphics card and CPU that don't frequently make 60fps dip to 30fps.) An excellent example is a Radeon HD 4870 on a 24" 1920x1200 monitor; that's a card where VSYNC ON tends to look better than VSYNC OFF for casual gaming, at least until you hit major-horsepower games such as Crysis. (And even so, some still prefer VSYNC ON even when there are occasional drops from 60fps to 30fps.) Now, when you're playing a borderline game, one that causes 60fps to randomly jump down to 30fps and introduces major stutter, that's definitely the kind of game that VSYNC OFF would look better in. The preferred choice of VSync is very performance-sensitive.
 