So what causes screen tearing, really?

dukenuke88

I've always been curious why some people say they use VSYNC because they don't like screen tearing... In my 5 years of PC gaming, I have NEVER played with VSYNC on, because I felt like it was "slowing me down" instead of actually "smoothing out the tearing"... All the games I have ever played with VSYNC off have ZERO screen tearing... The only game where I have witnessed screen tearing was RAGE, and the only way to fix it was enabling VSYNC... but I've heard that engine is a POS to begin with, so that isn't really the norm.

Is screen tearing simply a game engine issue? Or is it the quality of the monitor? I'm just having a hard time figuring out why some people play with VSYNC on and say "it's to prevent screen tearing"... I'm just like, okay? I have played all my games with no VSYNC and have absolutely zero screen tearing.

Thanks in advance.
 
Sorry, but your games do have some screen tearing if vsync is off. Just because you do not notice it, or it does not bother you, does not change that fact. Without vsync, screen tearing can occur any time a frame is out of sync, so the refresh rate does not even have to be exceeded to tear, despite what some people claim. In fact, the games I notice the most tearing in are the ones with the lowest framerate.
 
What's a good example of a game where it's very noticeable? Because I seriously don't notice any tearing whatsoever, and this is over 5 years of trying different PC games.

And btw, I hope I didn't come off the wrong way... I'm not saying screen tearing isn't real or anything like that... I'm just saying I can't find it in any game I've played... The only times I have found it are when you're doing some atrocious mouse movements, which 99% of the time you won't do in real-world gaming situations... or if the game engine absolutely sucks, like RAGE... and people say the screen tearing is normal for RAGE because it's a crap engine to begin with.
 
Well, here is a vid showing Metro 2033 tearing even at just 25-30 fps, and in the game it looks worse than in my crappy video too. Fast forward to about 20 seconds and it's more noticeable. http://www.youtube.com/watch?v=RkO8U8r2Tf8

Many games will noticeably tear from muzzle flash or when flickering lights are encountered, but as you can see, just panning around can show tearing.
 
Screen tearing occurs when you have parts of two or more frames on the screen at the same time. This will be most apparent in a game that is generating hundreds of frames while you turn around quickly; you will see the bottom of the screen out of alignment with the top of the screen, as if it were trying to catch up.

If your particular setup and games don't tend to exceed 60FPS, you will likely not be able to notice any screen tearing except when the game engine is really screwed up. If you do not notice any screen tearing, then there is no benefit to enabling VSync, so just leave it off and be happy.

Special case: Skyrim doesn't let you turn VSync off without .ini tweaking, because the game gets buggy with it off: the vertical look axis becomes hypersensitive and the game-world physics run too fast or glitch out, unless you also use an FPS limiter (which was apparently beyond Bethesda's abilities).
 
Screen tearing for me is usually bad when my frame rate goes above 60; however, I would rather deal with any amount of screen tearing than use vsync. Vsync absolutely kills it for me in anything other than RPG/RTS games. The mouse lag is god-awful, and nothing ever seems to help: mouse rates, triple buffering, sensitivity adjustments, etc.
 
I first became aware of screen tearing in Star Trek Elite Force quite a few years ago. It seems like id's engines are particularly bad for some reason, as Rage was absolutely horrible. I've used vsync ever since, though recently I've started leaving it off when I can. Really noticeable tearing just ruins the experience for me, whereas a little mouse lag I can get used to and forget about.
 
Got to say, I'm nearly 30 and have spent my whole life gaming without vsync, but my friend built a new system recently with two 580s, and just out of boredom one day we looked at different games with and without vsync. It's there no matter what game you play really, just hard to notice. Now I'll never play a game again without vsync. <--- Period
 
+1 to this.. can't stand vsync being off.
 
I first became aware of screen tearing in Star Trek Elite Force quite a few years ago. It seems like id's engines are particularly bad for some reason, as Rage was absolutely horrible. I've used vsync ever since, though recently I've started leaving it off when I can. Really noticeable tearing just ruins the experience for me, whereas a little mouse lag I can get used to and forget about.
The engine makes no difference. Without vertical sync the API has no control over the synchronization of the frame rate, so it can't improve it or make it worse. Content, however, can. The more vibrant the image, the more perceptible the tearing; it's less noticeable in dark games with little contrast.
 
Screen tearing for me is usually bad when my frame rate goes above 60; however, I would rather deal with any amount of screen tearing than use vsync. Vsync absolutely kills it for me in anything other than RPG/RTS games. The mouse lag is god-awful, and nothing ever seems to help: mouse rates, triple buffering, sensitivity adjustments, etc.

That's pretty much how I feel.
 
Tearing > mouse lag. Exceptions are the non-reflex games, but those barely exist anymore, since we only ever seem to get FPS games.
 
I think if you get an old game (Quake 3?), run it at 1000fps, and run your monitor at 60Hz (you could try 30Hz if it lets you), you should be able to see for yourself what it looks like.

I personally don't mind tearing and hate vsync, so I lock my maximum fps to 125 for my 120Hz screen and 62 on a 60Hz screen.
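
If anyone's curious what that kind of cap looks like, here's a minimal sleep-based frame limiter sketch in Python (purely illustrative; the 125 target just mirrors the number above, and nothing is actually rendered in the loop):

```python
import time

# Minimal sleep-based frame limiter sketch (illustrative only, not any
# particular game's or driver's limiter). TARGET_FPS = 125 mirrors the
# "125 on a 120Hz screen" cap mentioned above; no real rendering happens.
TARGET_FPS = 125
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def run_frames(count):
    for _ in range(count):
        start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # burn off the leftover frame time

run_frames(10)  # roughly 10 / 125 = 0.08 seconds of "frames"
```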
 
OP, what is your monitor and refresh rate?

I have only owned 60Hz monitors... Once they start producing 120Hz IPS or PLS displays, I'll move on to 120 frames per second and not worry about a thing... It's pretty DAMN hard to produce 120 frames per second in all of my games. I know I can do it in a lot of the older games, but in most newer games it's gonna be hard... even with two GTX 580s.
 
I always turn it off in MP games. That split second move can cost you a kill or a death. Screen tearing is just not important to me.
 
It's strange that you can't see tearing then.

To Druneau: Why 125 and 62?
 
I have no reasoning behind those numbers lol... I think the 125 is from the Quake 3 physics bug?

The 62 is a random number. I'll try locking exactly to the refresh rate to see, but my thinking is to go a bit over to minimize the chances of every frame tearing.
 
Vsync does more than just cap the frame rate to the refresh rate; it actually synchronizes the frame output from the card to the monitor and only sends the new frame when the monitor indicates it is ready to draw it (hence the "sync" part of Vsync). So simply capping your frame rate at 60 won't automatically eliminate tearing.
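
To make that concrete, here's a toy model (my own simplified sketch: 60Hz scanout, instantaneous flips, and frames assumed to finish a fixed 4ms after each refresh starts) showing that a bare FPS cap just parks the tear line in one place instead of removing it:

```python
# Toy model: frames are capped at exactly 60 FPS but NOT synchronized to the
# refresh, so every flip still lands mid-scanout and leaves a tear line.
REFRESH_HZ = 60
SCANLINES = 1080
REFRESH_PERIOD = 1.0 / REFRESH_HZ

FRAME_OFFSET = 0.004  # assume each capped frame finishes 4 ms after a refresh starts

for n in range(5):
    flip_time = n * REFRESH_PERIOD + FRAME_OFFSET        # capped, but unsynced
    scan_fraction = (flip_time % REFRESH_PERIOD) / REFRESH_PERIOD
    print(f"flip {n}: tear line near scanline {int(scan_fraction * SCANLINES)}")
```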
 
You said you never used it, but you cited an experience saying it held you back :)
 
I really do hate the mouse movement with vsync... it has an unexplainable movement sensation... it's so smooth that it goes past my target point (in an FPS, for example). So basically I suck it up and just play with tears... Is there any way to make mouse movement sharp with vsync on?
 
Vsync does more than just cap the frame rate to the refresh rate; it actually synchronizes the frame output from the card to the monitor and only sends the new frame when the monitor indicates it is ready to draw it (hence the "sync" part of Vsync). So simply capping your frame rate at 60 won't automatically eliminate tearing.

You're right. Capping FPS does not eliminate tearing.

In my experience, running at 120Hz and capping the framerate near the refresh rate minimizes the chances of tears. I can honestly say that in the 12 months I've had my 120Hz LCD, I haven't noticed tearing.
 
I really do hate the mouse movement with vsync... it has an unexplainable movement sensation... it's so smooth that it goes past my target point (in an FPS, for example). So basically I suck it up and just play with tears... Is there any way to make mouse movement sharp with vsync on?
Mouse input polling is handled by the engine. Vertical sync has no impact on mouse polling other than to potentially limit its update rate if the input polling is coupled to the renderer's update rate (which doesn't really matter anyway, as the hardware itself is polled at fixed intervals).

What you're describing is probably just the result of the slight latency increase in the time between your mouse moves and the display reflecting that move: input lag.
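
A rough latency budget with made-up but plausible numbers shows where that delay comes from: with vsync, a finished frame can wait up to a full refresh interval before it is scanned out.

```python
# Back-of-the-envelope input-lag sketch with assumed, illustrative numbers.
REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ   # ~16.7 ms between scanouts
RENDER_MS = 8.0                  # assumed time to simulate and render one frame

print(f"vsync off: ~{RENDER_MS:.1f} ms from input sample to start of scanout")
print(f"vsync on (worst case): ~{RENDER_MS + REFRESH_MS:.1f} ms "
      f"(the finished frame waits up to one refresh, {REFRESH_MS:.1f} ms, to flip)")
```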
 
Mouse input polling is handled by the engine. Vertical sync has no impact on mouse polling other than to potentially limit its update rate if the input polling is coupled to the renderer's update rate (which doesn't really matter anyway, as the hardware itself is polled at fixed intervals).

What you're describing is probably just the result of the slight latency increase in the time between your mouse moves and the display reflecting that move: input lag.

No way! This has always happened with me. The control does change; it's not sharp.

However, if it truly is latency, would plugging my mouse into a USB 3.0 port help?
 
In my experience, running at 120Hz and capping the framerate near the refresh rate minimizes the chances of tears. I can honestly say that in the 12 months I've had my 120Hz LCD, I haven't noticed tearing.

That is the added benefit of 120 Hz.
 
Yeah, it's input lag. You get the same thing if you set the GPU to render a high number of frames ahead of time. It feels as if the mouse pointer (or camera, in the case of an FPS) has to fight inertia when you move it around, and it keeps drifting after you stop moving the mouse. It makes competitive reflex play hopeless.
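
A quick back-of-the-envelope sketch of why a deep render-ahead queue feels that way (assumed numbers, and a simplified model where each queued frame adds about one frame time of delay):

```python
# Simplified model: each pre-rendered frame sitting in the queue adds roughly
# one frame time of delay between input sampling and that frame being shown.
REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ

for frames_ahead in (1, 2, 3):
    extra = frames_ahead * FRAME_MS
    print(f"{frames_ahead} frame(s) queued ahead: ~{extra:.1f} ms of added input lag")
```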
 
I thought this was also related to response times on LCDs? I don't recall screen tearing being an issue on quality CRT monitors back in the day, and I remember when LCDs first started appearing there was lots of screen tearing, which I recall reading was related to slow response times.
 
It's possible for response times to play a small role, I think, considering that the response time for an individual pixel will vary depending on what color it is being changed from and to. As the delta between response times increases, there's a possibility that a tear line could appear on the display for less time (assuming I'm thinking about it correctly).

I think it's splitting hairs, though. What's most important is what content is being torn and to what extent, which means that mostly the problem comes down to what qualities comprise the frames themselves. The content doesn't change the degree of tearing, but it can make tear lines appear more obvious.
 
I don't recall screen tearing being an issue on quality CRT monitors back in the day, and I remember when LCDs first started appearing there was lots of screen tearing, which I recall reading was related to slow response times.

CRTs tear just as much as LCDs do. I remember lots of tearing on the FW900 I used to have when I ran with VSync off on games that generated lots of frames.
 
What's a good example of a game where it's very noticeable? Because I seriously don't notice any tearing whatsoever, and this is over 5 years of trying different PC games.

And btw, I hope I didn't come off the wrong way... I'm not saying screen tearing isn't real or anything like that... I'm just saying I can't find it in any game I've played... The only times I have found it are when you're doing some atrocious mouse movements, which 99% of the time you won't do in real-world gaming situations... or if the game engine absolutely sucks, like RAGE... and people say the screen tearing is normal for RAGE because it's a crap engine to begin with.

Ugh. I know it's a bit late. But stop. Stop right the fuck now.

Don't find out what screen tearing is. Once you do, there's no going back. :(
 
Framerate does NOT have to be excessively high to get bad tearing. The game where tearing first really bugged me was Crysis, and I was getting around 25-30 FPS on my 60Hz monitor. There are more games where I do notice tearing than games where I don't, but in Crysis it really annoyed me.

I think it's more that some people notice it (like me) and some people don't.
 
I thought this was also related to response times on LCDs? I don't recall screen tearing being an issue on quality CRT monitors back in the day, and I remember when LCDs first started appearing there was lots of screen tearing, which I recall reading was related to slow response times.

It's not the response time, it's the low refresh rate of LCDs that makes it worse.
 
Vsync on at all times here too. Being uber in multiplayer isn't a priority but I can't stand shredded images.
 
I recently learned that how I thought vsync worked was wrong, and now that I know how it really works, I think it's worth making sure everyone here understands it.

What is VSync? VSync stands for Vertical Synchronization. The basic idea is that it synchronizes your FPS with your monitor's refresh rate. The purpose is to eliminate something called "tearing". I will describe all these things here.

Every CRT monitor has a refresh rate. It's specified in Hz (Hertz, cycles per second): the number of times the monitor updates the display per second. Different monitors support different refresh rates at different resolutions, ranging from 60Hz at the low end up to 100Hz and higher. Note that this isn't your FPS as your games report it. If your monitor is set at a specific refresh rate, it always updates the screen at that rate, even if nothing on it is changing. On an LCD, things work differently: pixels stay lit until they are told to change, so they don't have to be refreshed. However, because of how VGA (and DVI) signaling works, the video card still sends the LCD new frames at a fixed rate. This is why LCDs still have a "refresh rate" even though they don't actually have to refresh.

I think everyone here understands FPS. It's how many frames the video card can draw per second; higher is obviously better. However, during a fast-paced game your FPS rarely stays the same. It moves around as the complexity of the image the video card has to draw changes based on what you are seeing. This is where tearing comes in.

Tearing is a phenomenon that gives a disjointed image. The idea is as if you took a photograph of something, then rotated your view maybe just 1 degree to the left and took a photograph of that, then cut the two pictures in half and taped the top half of one to the bottom half of the other. The images would be similar, but there would be a notable difference between the top half and the bottom half. This is what is called tearing on a visual display. It doesn't always have to be cut right in the middle; it can be near the top or the bottom, and the separation point can actually move up or down the screen, or seem to jump back and forth between two points.

Why does this happen? Let's take a specific example. Say your monitor is set to a refresh rate of 75Hz, you're playing your favorite game, and you're getting 100FPS right now. That means the monitor is updating itself 75 times per second, but the video card is updating the display 100 times per second, which is 33% faster than the monitor. So in the time between screen updates, the video card has drawn one frame and a third of another one. That third of the next frame will overwrite the top third of the previous frame and then get drawn on the screen. The video card then finishes the last two thirds of that frame, renders the next two thirds of the following frame, and then the screen updates again. As you can see, this causes the tearing effect: two out of every three times the screen updates, either the top third or bottom third is disjointed from the rest of the display. This won't really be noticeable if what is on the screen isn't changing much, but if you're looking around quickly the effect will be very apparent.
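
That example can be worked out in a few lines (same idealized assumptions as above: perfectly constant frame time, instantaneous buffer writes):

```python
# At 75 Hz and 100 FPS the card finishes 100/75 = 1.33 frames per refresh, so on
# most refreshes the newest frame has overwritten part of the previous one; that
# leftover fraction is where the tear line sits.
REFRESH_HZ, FPS = 75, 100

for refresh in range(1, 5):
    frames_drawn = refresh * FPS / REFRESH_HZ
    tear = (refresh * FPS) % REFRESH_HZ / REFRESH_HZ   # fractional frame left over
    where = f"tear {tear:.0%} of the way down" if tear else "no tear this refresh"
    print(f"refresh {refresh}: {frames_drawn:.2f} frames drawn, {where}")
```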

Now this is where the common misconception comes in. Some people think that the solution to this problem is to simply create an FPS cap equal to the refresh rate. So long as the video card doesn't go faster than 75 FPS, everything is fine, right? Wrong.

Before I explain why, let me talk about double-buffering. Double-buffering is a technique that mitigates the tearing problem somewhat, but not entirely. Basically you have a frame buffer and a back buffer. Whenever the monitor grabs a frame to refresh with, it pulls it from the frame buffer. The video card draws new frames in the back buffer, then copies it to the frame buffer when it's done. However the copy operation still takes time, so if the monitor refreshes in the middle of the copy operation, it will still have a torn image.
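
As a structural sketch only (toy Python, no real graphics API), the two buffers and the copy step look like this:

```python
# Toy double-buffering sketch: draw into the back buffer, then copy it to the
# front buffer the display scans out from. If a refresh landed in the middle of
# that copy, the display would show parts of two frames: a tear.
WIDTH = 8
front_buffer = [0] * WIDTH   # what the monitor reads
back_buffer = [0] * WIDTH    # what the video card draws into

def draw_frame(frame_number):
    for i in range(WIDTH):
        back_buffer[i] = frame_number     # render the new frame off-screen

def present():
    front_buffer[:] = back_buffer         # the copy step described above

for frame in (1, 2, 3):
    draw_frame(frame)
    present()

print(front_buffer)   # [3, 3, 3, 3, 3, 3, 3, 3], the last completed frame
```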

VSync solves this problem by creating a rule that says the back buffer can't copy to the frame buffer until right after the monitor refreshes. With a framerate higher than the refresh rate, this is fine. The back buffer is filled with a frame, the system waits, and after the refresh, the back buffer is copied to the frame buffer and a new frame is drawn in the back buffer, effectively capping your framerate at the refresh rate.

That's all well and good, but now let's look at a different example. Say you're playing the sequel to your favorite game, which has better graphics. You're still at a 75Hz refresh rate, but now you're only getting 50FPS, 33% slower than the refresh rate. That means every time the monitor updates the screen, the video card draws 2/3 of the next frame. So let's track how this works. The monitor just refreshed, and frame 1 is copied into the frame buffer. 2/3 of frame 2 gets drawn in the back buffer, and the monitor refreshes again, grabbing frame 1 from the frame buffer for the first time. Now the video card finishes the last third of frame 2, but it has to wait, because it can't update until right after a refresh. The monitor refreshes, grabbing frame 1 for the second time, and frame 2 is put in the frame buffer. The video card draws 2/3 of frame 3 in the back buffer, and a refresh happens, grabbing frame 2 for the first time. The last third of frame 3 is drawn, and again we must wait for the refresh; when it happens, frame 2 is grabbed for the second time, and frame 3 is copied in. We went through 4 refresh cycles but only 2 frames were drawn. At a refresh rate of 75Hz, that means we'll see 37.5FPS. That's noticeably less than the 50FPS the video card is capable of. This happens because the video card is forced to waste time after finishing a frame in the back buffer, as it can't copy it out and it has nowhere else to draw frames.

Essentially this means that with double-buffered VSync, the framerate can only take values from a discrete set equal to Refresh / N, where N is some positive integer. That means if you're talking about a 60Hz refresh rate, the only framerates you can get are 60, 30, 20, 15, 12, 10, etc. You can see the big gap between 60 and 30 there. Any framerate between 60 and 30 that your video card would normally put out gets dropped to 30.
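
That Refresh / N rule can be written down directly (a sketch of the idealized model above, assuming a perfectly constant frame time):

```python
import math

# Under the idealized double-buffered vsync model above, the displayed rate
# snaps down to refresh / N, where N is the number of whole refresh intervals
# needed before a finished frame is allowed to flip.
def effective_fps(refresh_hz, render_fps):
    n = math.ceil(refresh_hz / render_fps)
    return refresh_hz / n

print(effective_fps(60, 45))    # 30.0: anything between 30 and 60 drops to 30
print(effective_fps(75, 50))    # 37.5: the 50 FPS example above
print(effective_fps(75, 100))   # 75.0: capped at the refresh rate
```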

Now maybe you can see why people loathe it. Let's go back to the original example: you're playing your favorite game at a 75Hz refresh and 100FPS. You turn VSync on, and the game limits you to 75FPS. No problem, right? Fixed the tearing issue; it looks better. Then you get to an area that's particularly graphically intensive, one that would drop your FPS to about 60 without VSync. Now your card cannot do the 75FPS it was doing before, and since VSync is on, it has to do the next value down on the list, which is 37.5FPS. So your game, which was running at 75FPS, just halved its framerate instantly. Whether or not you find 37.5FPS smooth doesn't change the fact that the framerate was suddenly cut in half, which you would notice. This is what people hate about it.

If you're playing a game whose framerate routinely stays above your refresh rate, then VSync will generally be a good thing. However, if it's a game that moves above and below it, VSync can become annoying. Even worse, if the game plays at an FPS just below the refresh rate (say you get 65FPS most of the time at a 75Hz refresh rate), the video card has to settle for putting out much less than it could (37.5FPS in that instance). This second example is where the perceived drop in performance comes in. It looks like VSync just killed your framerate. It did, technically, but not because it's a graphically intensive operation. It's simply the way it works.

All hope is not lost, however. There is a technique called triple-buffering that solves this VSync problem. Let's go back to our 50FPS, 75Hz example. Frame 1 is in the frame buffer, and 2/3 of frame 2 is drawn in the back buffer. The refresh happens and frame 1 is grabbed for the first time. The last third of frame 2 is drawn in the back buffer, and the first third of frame 3 is drawn in the second back buffer (hence the term triple-buffering). The refresh happens, frame 1 is grabbed for the second time, and frame 2 is copied into the frame buffer while the first part of frame 3 moves to the first back buffer. The last 2/3 of frame 3 are drawn in the back buffer, the refresh happens, frame 2 is grabbed for the first time, and frame 3 is copied to the frame buffer. The process starts over. This time we still got 2 frames, but in only 3 refresh cycles. That's 2/3 of the refresh rate, which is 50FPS, exactly what we would have gotten without VSync. Triple-buffering essentially gives the video card somewhere to keep doing work while it waits to transfer the back buffer to the frame buffer, so it doesn't have to waste time. Unfortunately, triple-buffering isn't available in every game, and in fact it isn't too common. It can also cost a little performance to use, as it requires extra VRAM for the buffers and time spent copying them around. However, triple-buffered VSync really is the key to the best experience, as you eliminate tearing without the downsides of normal VSync (unless you consider the fact that your FPS is capped a downside... which is silly because you can't see an FPS higher than your refresh rate anyway).
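
And here's the 75Hz / 50FPS comparison as a toy timeline simulation (my own simplified model: constant 20ms render time, instantaneous copies, nothing else going on), just to confirm the arithmetic:

```python
REFRESH = 1 / 75     # seconds per refresh cycle
RENDER = 1 / 50      # seconds to render one frame
DURATION = 1.0       # simulate roughly one second

def next_refresh(t):
    """First refresh instant strictly after time t."""
    return (int(t / REFRESH) + 1) * REFRESH

def double_buffered():
    t, shown = 0.0, 0
    while t < DURATION:
        t += RENDER            # render into the only back buffer
        t = next_refresh(t)    # stall: the copy must wait for the next refresh
        shown += 1             # the frame finally reaches the screen
    return shown

def triple_buffered():
    finish, shown = RENDER, 0
    while finish < DURATION:
        shown += 1             # the second back buffer means rendering never stalls,
        finish += RENDER       # so every finished frame gets picked up by a refresh
    return shown

print("double-buffered vsync:", double_buffered(), "frames shown")  # ~37.5 FPS
print("triple-buffered vsync:", triple_buffered(), "frames shown")  # ~50 FPS
```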

I hope this was informative and helps people understand the intricacies of VSync (and hopefully curbs the "VSync, yes or no?" debates!). Generally, if triple buffering isn't available, you have to decide whether the discrete framerate limitations of VSync, and the issues they can cause, are worth the visual improvement of eliminating tearing. It's a personal preference, and it's entirely up to you.

This
 
Makes you wonder if they should make monitors that don't refresh in horizontal fashion, but instead choose some sort of "random" pattern to refresh the pixels that would give de-synced systems more of a dithered motion blur effect rather than a screen tear.
 