DVI port on nVidia card not so good?

blix9

I've been an ATI fan for some time now, but the real reason I'd hesitate to get a recent nVidia card isn't just brand preference.

A few months ago I read an article about nVidia's DVI output not being up to standard, whereas ATI's was more than acceptable in terms of the signal integrity of the data going through the DVI port. I think it was one of Tom's Hardware's articles, and many people here have probably read it; not that I blindly trust their reviews. Sorry, I can't find the link to the page at the moment. It was a pretty in-depth test of the DVI port performance of several cards from both nVidia and ATI, and the results for the nVidia cards were rather worrisome if you plan to drive an LCD through the DVI port. I was wondering whether nVidia has made improvements in that department, or clarified that THG was wrong, or something like that. Has anyone heard any new reports or updates on this issue since then? Like I said, I'm sure many of you have read it before.

Since I have a brand new Dell 2405FPW (oh yeah!!!), a clean signal from the DVI port is more important to me than a few more frames per second in Doom 3.

Any comments are appreciated.
 
http://graphics.tomshardware.com/graphic/20041129/

The result of our DVI compliance test is positive across the board, with all six cards reaching DVI compliance. However, while the three ATI based cards provided by ABIT and ATi turned in exemplary results, MSI's NVIDIA based cards are only able to reach DVI compliance in UXGA at a reduced frequency of 141 MHz and using a reduced blanking interval. This greatly limits the NVIDIA cards' "upward mobility" - since they don't have enough reserves for TFT displays with higher native resolutions than UXGA (1600x1200). The MSI NX6800 card only reached compliance at 162MHz when a separate TMDS transmitter chip was used. Counting these results, it seems that ATi's integrated TMDS transmitter is superior to NVIDIA's implementation. Yet the MSI cards' eye diagrams displayed a turbulent distribution of the data even when the SiL 164 TMDS transmitter was used. This, in turn, limits the maximum usable cable length, especially when cheaper cables are used.
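
For anyone wondering where those frequencies come from: the DVI pixel clock is just the total pixels per frame (active plus blanking) multiplied by the refresh rate. Here's a quick back-of-the-envelope in Python; the timing totals are the VESA DMT and CVT reduced-blanking values as I remember them, so treat them as my assumptions rather than numbers from the article:

# Pixel clock = total pixels per frame (active + blanking) * refresh rate.
# Timing totals below are VESA DMT / CVT reduced-blanking values from
# memory, not figures taken from the THG article itself.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# 1600x1200@60 with standard (DMT) blanking: 2160x1250 total pixels
print(pixel_clock_mhz(2160, 1250, 60))  # 162.0 MHz, the compliance target

# 1600x1200@60 with CVT reduced blanking: roughly 1760x1235 total pixels
print(pixel_clock_mhz(1760, 1235, 60))  # ~130.4 MHz

Single-link DVI tops out at a 165 MHz pixel clock, so a card that only passes at 141 MHz with reduced blanking has essentially no headroom left above UXGA.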
 
It wasn't Tom's that tested the cards for DVI compliance; the testing was done by Silicon Image. If you don't trust Tom's writing, just look at the eye-diagram images and make up your own mind about it.

Here's another one from ExtremeTech, though it's older:
http://www.extremetech.com/article2/0,1558,1367918,00.asp

Maybe [H]ard could start sending cards to SI for testing. It would be rather interesting to know whether cards have good DVI output, since more and more people are using resolutions higher than 1600x1200.
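
To put some numbers on that, here's the same arithmetic for a few modes above 1600x1200, checked against the 165 MHz single-link DVI limit. The timing totals are roughly the VESA CVT values, again from memory, so consider them ballpark assumptions:

# Single-link DVI carries at most a 165 MHz pixel clock; above UXGA you
# pretty much need reduced blanking (or a dual-link connection).
# Timing totals are approximate VESA CVT values (my assumptions).

DVI_SINGLE_LINK_MHZ = 165.0

modes = {
    "1680x1050@60, reduced blanking": (1840, 1080),  # e.g. Dell 2005FPW
    "1920x1200@60, reduced blanking": (2080, 1235),  # e.g. Dell 2405FPW
    "1920x1200@60, standard blanking": (2592, 1245),
}

for name, (h_total, v_total) in modes.items():
    clock_mhz = h_total * v_total * 60 / 1e6
    verdict = "fits" if clock_mhz <= DVI_SINGLE_LINK_MHZ else "exceeds single link"
    print(f"{name}: {clock_mhz:.1f} MHz ({verdict})")

So even a 2405FPW at native resolution only works over single-link DVI with reduced blanking, which is exactly where a marginal TMDS transmitter gets exposed.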
 
That article is from November 29, 2004. Given that the 7800GTX is specifically aimed at resolutions higher than 1600x1200, I would guess they have overhauled it since then.
 
dgb said:
That article is from November 29, 2004. Given that the 7800GTX is specifically aimed at resolutions higher than 1600x1200, I would guess they have overhauled it since then.

I hope so, and that would be a logical guess. I wish they would run the tests again, though. It would be reassuring to see a new report on this matter with current cards, confirming that nVidia has rectified the problem and that ATI is maintaining its DVI performance in newer models. Don't you think?

Also, I didn't know the ExtremeTech article came to the same conclusion.
 
DVI works just fine on my nVidia cards. Both a GeForce3 and a 7800GTX drive my 1680x1050 Dell 2005FPW without issue.
 
As long as the screen isn't messed up, artifacting, or showing weird lines/blobs or whatever, the DVI port is perfectly fine... and if it isn't for you, go make a better DVI port!
 