Maximum PC ATI vs. NVIDIA Image Quality

FlyinBrian

I knew I wasn't just crazy that my old ATI card looked better back in the day. Here is a double-blind image quality test from Maximum PC, and ATI got picked the most. Click
 
A group of 15 people is not what anyone who knows anything about sampling would call "representative".

Your numbers need to get up to around 100 before they show legitimate trends...

Many people seem to be under the impression that ATI's color quality is slightly better. This is because by default, NVIDIA has slightly less vivid colors (or ATI has slightly oversaturated colors). This is quickly remedied with the digital vibrance slider in the NVIDIA control panel. Default colors and contrast != all that the cards have to offer.
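
For anyone curious, a vibrance/saturation boost is just simple per-pixel math. Here's a minimal Python sketch of the idea (the 1.3 factor and the Rec. 601 luma weights are illustrative; this is not NVIDIA's actual slider implementation):

```python
import numpy as np

def boost_saturation(rgb, amount=1.3):
    """Push each pixel's colour channels away from its own grey (luma) value."""
    rgb = rgb.astype(np.float32)
    # Per-pixel grey value using Rec. 601 luma weights
    luma = rgb @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
    # Scale each channel's distance from grey, then clip back into range
    boosted = luma[..., None] + (rgb - luma[..., None]) * amount
    return np.clip(boosted, 0, 255).astype(np.uint8)
```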
 
Good article, but I would like to see at least 100 participants.

I always heard the ATI image was sharper, and I could see the difference when switching to an 8800 GT. I lost the vibrant colors on the desktop.
 
Good article, but I would like to see at least 100 participants.

I always heard the ATI image was sharper, and I could see the difference when switching to an 8800 GT. I lost the vibrant colors on the desktop.

Well, it's not a matter of the card's capabilities; it's just a matter of the default settings. There is no "NVIDIA" color or "ATI" color, just how they choose to ship their drivers' default colors. Both are very tweakable, and you can make ATI look like NVIDIA's defaults and NVIDIA look like ATI's defaults.
 
The LCD you are viewing on (and any change in colour balance) will affect the colours.
No RGB output is perfect; there are many points in the graphics pipeline that ends at the CRT/LCD where the colour balance can shift, so it's not necessarily the fault of the graphics card if differences are perceived.

For example, red isn't just red, it's a whole spectrum of red colours.
The shade for a certain colour will be slightly different between cards, and they may both look excellent on a monitor with perfect RGB output.
However, if one of those shades is not as vibrant on your LCD, you might think the graphics card is the problem.

I changed from ATI to NVIDIA and there is a tiny difference between the two on a good CRT, but it's not worth writing home about.
I am using an analogue connection, by the way.
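
To illustrate, here's a tiny Python sketch of that point; the channel gains are made-up numbers, not measurements of any real panel:

```python
import numpy as np

# The "same" red sent by two cards, viewed through two hypothetical
# displays with slightly different channel gains
red = np.array([255.0, 0.0, 0.0])

display_a = np.array([1.00, 1.00, 1.00])   # neutral panel
display_b = np.array([0.92, 1.00, 1.05])   # slightly cool, red-weak panel

print(np.clip(red * display_a, 0, 255))    # [255.   0.   0.]
print(np.clip(red * display_b, 0, 255))    # [234.6   0.    0.]  -- a duller red
```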
 
Good article, but I would like to see at least 100 participants.

I always heard the ATI image was sharper, and I could see the difference when switching to an 8800 GT. I lost the vibrant colors on the desktop.

Digital vibrance is your friend. Seriously, adjust your color settings if they're not where you like them. Most adjustments should be done at the monitor. What can't be done there can usually be handled with overlay and other settings in the driver control panel. :)

Other than that, I thought that IQ between 8800's and my x1900 has been indistinguishable for all practical gaming and desktop purposes.
 
Digital vibrance is your friend. Seriously, adjust your color settings if they're not where you like them. Most adjustments should be done at the monitor. What can't be done there can usually be handled with overlay and other settings in the driver control panel. :)

Other than that, I thought that IQ between 8800's and my x1900 has been indistinguishable for all practical gaming and desktop purposes.

My point exactly. It's just that it seems everything gets graded by "defaults," and maybe NVIDIA could stand to saturate their defaults a bit more to put this whole debate to rest.

That's really all we're talking about: default color settings, since it's all entirely adjustable anyway.

We should all know how little "defaults" mean.
 
And don't forget that three of the 15 looked at ATI vs. ATI or NV vs. NV and still said that one was better than the other. If you get picked to take a test on image quality, there is going to be an unconscious need to pick a "winner" or find a difference even if one doesn't exist.
 
One thing that hasn't been pointed out is that they color-calibrated the monitors using an electronic color-calibration device.

"We set the brightness controls to the same values, and then calibrated the two monitors using a Pantone HueyPro calibration kit."

So having different default color saturations should have made no difference, as both monitors would have been corrected to the same color. Now, I know there are variables in how they color-corrected, so it could still be different, but if it was done properly they should look the same color-wise.

Though one thing to point out is that image quality is a very subjective thing. So there may just be minor differences that to some people are big deals. The only way to really say for sure would be to have a computer technically evaluate it.
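
If someone did want a computer to do it, here's a rough Python sketch using standard image-quality metrics (the filenames are hypothetical, and it assumes lossless captures of the exact same frame from each card):

```python
from skimage import io
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Hypothetical lossless framebuffer grabs of the same frame from each card
ati = io.imread("ati_frame.png")
nv = io.imread("nvidia_frame.png")

# PSNR measures raw pixel error; SSIM measures structural similarity (1.0 = identical)
print("PSNR:", peak_signal_noise_ratio(ati, nv))
print("SSIM:", structural_similarity(ati, nv, channel_axis=-1))
```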
 

Though one thing to point out is that image quality is a very subjective thing. So there may just be minor differences that to some people are big deals. The only way to really say for sure would be to have a computer technically evaluate it.

True that. What that review really said to me was that the quality is good enough on both cards to make no real difference. If it had gone solidly one way or the other then maybe it would matter, but ...
 
Yeah, with totals of 21 (ATI) vs. 15 (NVIDIA) vs. 9 (no preference). Seems too close to call.
 
Sounds pretty unscientific to me. I think people were probably choosing one just because they were told to.
 
I noticed that the image quality was a little sharper on ATI than NVIDIA when I swapped a couple of years back... on the desktop, anyway; gameplay-wise you wouldn't notice much.
 
I knew I wasn't just crazy that my old ATI card looked better back in the day. Here is a double-blind image quality test from Maximum PC, and ATI got picked the most. Click

But that is all subjective! IT HAS NO VALUE IN THE LAND OF THE VIDEO CARD! Hehe, had to do it.

I think that is a good test. You simply have to take it for what it is: a bunch of opinions from people who SHOULD know something about IQ.

From the overall results I would tell you that there is basically no difference... kinda like we have been telling you for a good year or so now. ;)

Intel did a huge subjective test like that a few years ago using games, but it focused more on gaming performance than IQ. They did not share the results, but I did take part in the testing.
 
These kinds of tests are absolutely bogus when it comes to debating IQ.

Most people prefer the 360's color output to the PS3's and call the PS3's output washed out in comparison. What is actually happening, though, is that the 360's gamma is set much higher than the PS3's, and the 360 is crushing blacks as well. This results in added vibrance in the 360's image, but a definite loss of image quality and detail.
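
Here's a quick Python sketch of the black-crush part (the exponents are illustrative, not the consoles' actual gamma settings): with a steeper gamma curve, near-black steps get squeezed toward zero, so shadow detail is lost even though the image looks punchier.

```python
import numpy as np

def apply_gamma(x, gamma):
    """x is intensity in [0, 1]; gamma > 1 darkens and compresses the shadows."""
    return np.power(x, gamma)

shadows = np.array([0.02, 0.05, 0.10])   # three near-black detail levels
print(apply_gamma(shadows, 1.0))         # [0.02  0.05  0.1 ]       reference
print(apply_gamma(shadows, 1.5))         # ~[0.003 0.011 0.032]     steps collapse toward black
```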

The same thing happens with sound. Many people seem to like bloated bass and a muddy midrange à la Bose, but that sound is actually regarded as terrible in the world of quality audio.
 
No, it's crap marketing.

OK, it's a fancy name for saturation, which is an adjustment that has been around for decades. It's still useful if you have a POS monitor that always looks washed out, no matter what you do. So, no need to throw a blanket statement like that out. It's a case-by-case thing. If you don't need it, great.
 
I wasn't saying ATI was superior in my first post, and I'm sorry if it may have sounded that way. It's just that I could see a slightly better image on my old ATI card, and it was cool that in this test quite a few people did choose the ATI card. The one that stood out in my mind was the art teacher, who immediately picked the CrossFire setup in almost every test.

So, as they said in the article, it's all in the eye of the beholder. In games it's not gonna make or break anything. I just thought the article was interesting.
 