DVI and gaming?

creepcolony

I'm thinking about getting an LCD monitor to replace my CRT, but I have a question about DVI. I read that DVI helps make text less blurry, but would it help for gaming?
 
Basically, if you go LCD, you must at all costs use DVI. It greatly reduces the signal loss that comes from converting the signal from digital to analog and back. Everything benefits from DVI. I cannot wait to see my L90D+ in action on Thursday!!
 
This DEPENDS. A lot of us ran DVI vs. VGA tests and it varies by monitor. Some look identical with either, while others look better with DVI.
Ditto for games. Some monitors will ghost more on one connection than the other, while others won't.
There isn't a right or wrong answer because it depends on the monitor and, in some cases, even the game.

For example, I have a Sony SDM-HS94P, a 12ms monitor targeted at gamers.

Well, DVI and VGA look *identical*, and pretty much anyone with the monitor will verify this for you. They seem to have ever-so-slightly different default gamma settings, but it's painfully easy to compare them using the "clone" setting built into Nvidia's Detonator drivers.

In terms of gaming, I tested Need For Speed: UG 1 and 2, HL2, Doom 3, Simpsons: Hit and Run, Star Wars: KOTOR, Unreal Tournament 2K4, and Neverwinter Nights.

Some games (HL2, for instance) will ghost identically. Others, like Need For Speed, perform visibly better in analog. Doom and KOTOR performed better in DVI.

Analog will usually allow a higher refresh rate. In my case, DVI caps at 60, while analog will go to 75. Visibly this means absolutely nothing...BUT it will allow you to have a higher max FPS while keeping vsync on. That doesn't mean much, but some people really notice a difference between 60 and 75 fps. I don't.
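
To put actual numbers on that (a trivial sketch, nothing monitor-specific):

[code]
# With vsync on, FPS is capped at the refresh rate, and each refresh
# interval is 1000 / Hz milliseconds. Plain arithmetic, nothing more.
for hz in (60, 75):
    print(f"{hz}Hz -> max {hz} FPS with vsync, one frame every {1000 / hz:.1f} ms")
# 60Hz -> max 60 FPS with vsync, one frame every 16.7 ms
# 75Hz -> max 75 FPS with vsync, one frame every 13.3 ms
[/code]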

If I had to choose one, DVI is a little easier because you don't have to align all of your resolutions manually (it's easy, but sometimes a pain if you swap drivers often). I've also noticed some high-res older games don't like analog hook-ups. For instance, 1280x1024x32 simply won't run on analog for me...yet DVI does fine.
 
IceWind said:
Basically, if you go LCD, you must at all costs use DVI. It greatly reduces the signal loss that comes from converting the signal from digital to analog and back. Everything benefits from DVI. I cannot wait to see my L90D+ in action on Thursday!!

I've tested both DVI and analog on my L90D+ and I don't notice a difference. Others have said the same thing about this LCD.

If you use DVI your refresh rate will be 60Hz, and on analog it will be 75Hz. It's something to think about when gaming, because you will have to enable vsync unless you want excessive tearing.
 
Uncheck the "hide modes..." option in settings and set the refresh at 85Hz anyway (on DVI). Its my belief that this really has nothing to do with your monitor, if you run an LCD. With the driver correctly installed for my 2001FP, I can set it as high as 85Hz. With the windows standard driver, though, I can set it to 120Hz and I use to run it there constantly. No difference in the look of the monitor. I don't know why manufacturer specs include a max refresh rate since LCDs do not refresh in frames like this, as CRTs do. I believe it amounts only to a software change when using an LCD.

If you want to try something interesting, switch between your max and lowest refresh rates. Notice your mouse tracks much more smoothly at the high refresh rate (mine does anyway - noticeably).
 
ozziegn said:
VGA just looks terrible compared to DVI.

(Re)read post number 3. That's a very oversimplified statement.

Of course DVI is always going to be preferred, but VGA isn't the devil. There are a lot of cases where you, or most users on this forum, wouldn't be able to tell the difference from 1.5 feet.
 
I can tell the difference.

The screens aren't anywhere near as clear with VGA as they are with DVI.

Also, the colors just don't seem as vivid.
 
I compared the same LCD side by side, one on analog and one on digital, and there was a noticeable difference. I've also forced my refresh rate up to 120 and everything's been working great :)
 
You forced your LCD's refresh rate to 120?

I thought that 60Hz was pretty much the (max) standard when it came to LCD refresh rates. :confused:
 
ozziegn said:
You forced your LCD's refresh rate to 120?

I thought that 60Hz was pretty much the (max) standard when it came to LCD refresh rates. :confused:

I'm not really sure, but I can relate one personal experience regarding this. The other day I was fooling around with the refresh rate in a game's registry and I put it up to 120 to see what would happen. I'm not really sure what it did, since it looked the same, but the mouse acted very differently. This seems to suggest that something happened to the refresh rate of the screen even though it is supposedly locked at 60Hz. I dunno...just something odd I guess ;)
 
RawsonDR said:
Uncheck the "hide modes..." option in settings and set the refresh at 85Hz anyway (on DVI). Its my belief that this really has nothing to do with your monitor, if you run an LCD. With the driver correctly installed for my 2001FP, I can set it as high as 85Hz. With the windows standard driver, though, I can set it to 120Hz and I use to run it there constantly. No difference in the look of the monitor. I don't know why manufacturer specs include a max refresh rate since LCDs do not refresh in frames like this, as CRTs do. I believe it amounts only to a software change when using an LCD.

If you want to try something interesting switch between your max and lowest refresh rate. Notice your mouse tracks much more smoothly on the high refresh rate (mine does anyway - noticeably).

I unchecked the "hode modes" in driver properties and the only option i have for DVI is 60Hz. I'll try just using the windows driver later on when i switch to my new machine today.

Wouldn't it hurt the LCD to have it set at 85Hz or 120Hz since it was never meant to refresh like that?
 
There are certain LCD/TFT panels where, when using DVI, you can only adjust brightness, not contrast. With VGA you can adjust both.

I have had several monitors and have had varied results, but I cannot say that one connection has ever totally outperformed the other.

Most graphics cards now have both VGA and DVI outputs, so you can choose whichever suits your needs. As I run a dual setup, I would always make sure my main monitor gets preference and would carry out tests to see which connection suited it.
 
Again...a lot of these factors are going to vary from panel to panel. Not all LCDs are created equal. Some might have a lousy VGA implementation and some might have a much higher max refresh rate.

I can speak for my own, the Sony SDM-HS94P. The DVI and VGA look the same. No "more vivid colors" or any of that. Some of the Samsung panels are VGA only and they're known to have killer visual quality, among the best.

Your panel might vary, but don't act like "VGA suX0rz...digital rulez!!!!!1111!!!" when that's not necessarily the case.
 
burningrave101 said:
I unchecked the "hode modes" in driver properties and the only option i have for DVI is 60Hz. I'll try just using the windows driver later on when i switch to my new machine today.

Wouldn't it hurt the LCD to have it set at 85Hz or 120Hz since it was never meant to refresh like that?

I too would like to know this. If I just set my 2001FP to 85Hz, would I mess up my LCD by doing it?
 
In my experience DVI is very notably better. To sum it up in a word, it's just cleaner. This cleaner signal helps in all aspects of LCD usage, be it desktop or gaming. I suggest that if you have the ability to use DVI, you use it.
 
My new L90D+ looked terrible in VGA mode; the sharpness went down drastically. DVI all the way for me.
 
IceWind said:
My new L90D+ looked terrible in VGA mode; the sharpness went down drastically. DVI all the way for me.

Actually, I take that back. For SOME reason, in DVI mode I lose resolutions in all my EA games?? I can't even go beyond 1024x768 for some reason. Yet all my other games work fine in DVI, except the EA ones. What the fark is this all about?
 
I use DVI for my LCD, and at native res, there's simply nothing sharper.
 
RawsonDR said:
Uncheck the "hide modes..." option in settings and set the refresh at 85Hz anyway (on DVI). Its my belief that this really has nothing to do with your monitor, if you run an LCD. With the driver correctly installed for my 2001FP, I can set it as high as 85Hz. With the windows standard driver, though, I can set it to 120Hz and I use to run it there constantly. No difference in the look of the monitor. I don't know why manufacturer specs include a max refresh rate since LCDs do not refresh in frames like this, as CRTs do. I believe it amounts only to a software change when using an LCD.

If you want to try something interesting switch between your max and lowest refresh rate. Notice your mouse tracks much more smoothly on the high refresh rate (mine does anyway - noticeably).

Well, does 85Hz hurt our 2001FP LCDs? I would like to try it out but don't want to mess up my LCD.
 
Omnikron said:
Well, does 85Hz hurt our 2001FP LCDs? I would like to try it out but don't want to mess up my LCD.

I tried forcing my L90D+ to 75Hz but it didn't do shit. Oh well...
 
Upping the refresh rate on an LCD will really only do one thing - allow you to have a higher maximum number of frames per second with vsync on.

The way an LCD screen is refreshed makes your desktop and application refresh rate more or less a useless adjustment. Again, the only thing it'll affect is your max FPS, and only with vsync on. I've never seen an LCD that goes lower than 60Hz, so unless you're playing an older game where you're used to getting 100+ fps...I doubt you'll notice any difference at all anyway.
 
Domingo said:
This DEPENDS. A lot of us ran DVI vs. VGA tests and it varies by monitor. Some look identical with either, while others look better with DVI.
Ditto for games. Some monitors will ghost more on one connection than the other, while others won't.
There isn't a right or wrong answer because it depends on the monitor and, in some cases, even the game.

For example, I have a Sony SDM-HS94P, a 12ms monitor targeted at gamers.

Well, DVI and VGA look *identical*, and pretty much anyone with the monitor will verify this for you. They seem to have ever-so-slightly different default gamma settings, but it's painfully easy to compare them using the "clone" setting built into Nvidia's Detonator drivers.

In terms of gaming, I tested Need For Speed: UG 1 and 2, HL2, Doom 3, Simpsons: Hit and Run, Star Wars: KOTOR, Unreal Tournament 2K4, and Neverwinter Nights.

Some games (HL2, for instance) will ghost identically. Others, like Need For Speed, perform visibly better in analog. Doom and KOTOR performed better in DVI.

Analog will usually allow a higher refresh rate. In my case, DVI caps at 60, while analog will go to 75. Visibly this means absolutely nothing...BUT it will allow you to have a higher max FPS while keeping vsync on. That doesn't mean much, but some people really notice a difference between 60 and 75 fps. I don't.

If I had to choose one, DVI is a little easier because you don't have to align all of your resolutions manually (it's easy, but sometimes a pain if you swap drivers often). I've also noticed some high-res older games don't like analog hook-ups. For instance, 1280x1024x32 simply won't run on analog for me...yet DVI does fine.


If the signal is analog, the LCD has to convert it back to digital. This may cause extra overhead depending on the video card/LCD combination. If you use DVI you are getting a pure digital signal and no conversion has to occur, which usually means better quality even if it's 60Hz digital vs. 75Hz analog...

I have NEVER seen ghosting at all with my Sony in any game I've ever played.
 
firey-eyez said:
I'm not really sure, but I can relate one personal experience regarding this. The other day I was fooling around with the refresh rate in a game's registry and I put it up to 120 to see what would happen. I'm not really sure what it did, since it looked the same, but the mouse acted very differently. This seems to suggest that something happened to the refresh rate of the screen even though it is supposedly locked at 60Hz. I dunno...just something odd I guess ;)

LCD monitors don't refresh at given intervals. If a pixel is supposed to change, it's told to do so. Thus the refresh rate setting is really just a software setting - Windows will send new info to your video card 60 times a second, 85 times a second, 120, or whatever.

The mouse observation firey made above is accurate, and I've mentioned this in other threads. The mouse really does feel smoother. Once I discovered this, I started keeping my refresh rate high, and I notice the difference when it's lower. Windows tracks and updates your cursor's position 120 times a second, 85 times a second, 60, or whatever your setting is. So the higher the refresh rate, the smoother your mouse will move. I really like the feel of higher refresh rates now.
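
A quick way to see why (the cursor speed here is a made-up number; the arithmetic is the point):

[code]
# At a fixed cursor speed, each screen update moves the pointer
# speed / Hz pixels, so a higher refresh rate means smaller, smoother jumps.
speed = 1200  # cursor speed in pixels per second (made-up figure)
for hz in (60, 85, 120):
    print(f"{hz}Hz: cursor jumps ~{speed / hz:.0f} px per update")
# 60Hz: ~20 px, 85Hz: ~14 px, 120Hz: ~10 px
[/code]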

I own a 2001FP and used to keep its refresh at 120Hz with the Windows default driver. Now that I have the 2001FP driver installed, the "non-supported" modes give me as high as 85Hz, so I use that. It really only seems to be a software issue, not hardware.
 
So are you guys sure running at a non-supported refresh rate won't hurt an LCD? If that's true, why do they limit it to 60Hz on digital and 75Hz on analog?

Using the Windows driver I can bump this SOB up to 180Hz lol.
 
I've come to the conclusion that higher refresh rates that aren't officially supported won't hurt an LCD. It's only a matter of whether or not the DVI bandwidth can actually support the higher refresh rate, even if the driver lets you choose it.

You can find out what the bandwidth limit is for DVI at a certain resolution by reading this article here:

http://graphics.tomshardware.com/graphic/20041129/index.html
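
As a rough sanity check, here's the math (the pixel clocks below are the standard VESA figures for 1280x1024 with blanking included, quoted from memory, so double-check them against that article):

[code]
# Single-link DVI tops out at a 165 MHz TMDS pixel clock. Compare that
# against the standard VESA pixel clocks for 1280x1024 (blanking included).
SINGLE_LINK_LIMIT_MHZ = 165.0
vesa_clock_mhz = {60: 108.0, 75: 135.0, 85: 157.5}  # 1280x1024 modes

for hz, clock in vesa_clock_mhz.items():
    verdict = "fits" if clock <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"1280x1024 @ {hz}Hz needs {clock} MHz -> {verdict} single-link DVI")
[/code]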

I bumped my refresh rate from 60Hz to 75Hz on my L90D+ and it made a big difference in mouse pointer speed. It's more responsive now because the video card is sending more cursor updates per second.
 