NVIDIA told reviewers to use Quality instead of High Quality? Poor AF performance?

I have three 6800 GTs.

Two are in my PCIe machine and one is in my AGP machine... both are A64s, a 3000+ and a 3700+.

I get NO shimmering whatsoever... I use the NGO 77.72 drivers from guru3d.com.

I can go from HQ to Q and there is almost no IQ difference.

So could it be a driver issue? Possibly. Could it be that I run a nice high res of 1600x1200 with everything on and just don't notice it? Possibly.

But I have looked for it, and I don't see it. So there. *sticks tongue out*
 
[RIP]Zeus said:
I have three 6800 GTs ....
But I have looked for it, and I don't see it. So there. *sticks tongue out*

Be happy and don't look any further. But most likely your "problem" is that you don't have an ATi or GF FX card for comparison.
Notice Cali's reaction after he saw my Joint Operations ATi/NVIDIA comparison vids on page 1 of this thread:

"I'm starting to see things the same way as you. Playing through HL2 again right now I realize how much shimmering really is present."

Now if some of you guys think it's a driver or CP setting issue, please download Joint Operations and Fraps, make the same video as I did, and post it here.
 
Apple740 said:
Be happy and don't look any further. But most likely your "problem" is that you don't have an ATi or GF FX card for comparison.
Notice Cali's reaction after he saw my Joint Operations ATi/NVIDIA comparison vids on page 1 of this thread:

"I'm starting to see things the same way as you. Playing through HL2 again right now I realize how much shimmering really is present."

Now if some of you guys think it's a driver or CP setting issue, please download Joint Operations and Fraps, make the same video as I did, and post it here.


I see shimmering in both the ATi and nV videos you made, just more in nV's.
 
razor1 said:
I see shimmering in both the ATi and nV videos you made, just more in nV's.

Yes, nobody said ATi is totally shimmer-free, it's just worse on NV.
 
Apple740 said:
Yes, nobody said ATi is totally shimmer-free, it's just worse on NV.

What settings did you use? Downloading the demo now; I don't know how well the video will come out since I'm on a 3500+ and Fraps tends to eat up the CPU quite a bit.

Through the CP or through the game, I mean.

The demo seems to be outdated; it's not letting me log on to a server.
 
In-game everything maxed, 1280x1024, 4xAA/8xAF via the CP. Fraps capturing at "half size" (so the video output is 640x512), then XviD compressed.
The scene is taken from "Ceccao Cross Roads" in the retail game.
 
lol, not really fond of try-before-you-buy, hehe. Anyway, I will do some tests with Far Cry later today, and again when I get my 7800.
 
The NVIDIA shimmering > ATI shimmering situation is so common now that I've almost sort of gotten used to it...

Sure, it annoys me some days more than others... especially some of the roads in WoW, which for some reason shimmer something awful on my NVIDIA 7800 GTXs in SLI...

I guess honestly there simply isn't enough incentive to "fix it".
 
TheGameguru said:
The NVIDIA shimmering > ATI shimmering situation is so common now that I've almost sort of gotten used to it...

Sure, it annoys me some days more than others... especially some of the roads in WoW, which for some reason shimmer something awful on my NVIDIA 7800 GTXs in SLI...

I guess honestly there simply isn't enough incentive to "fix it".


Now my question for you is:

Would you have bought a 7800 GTX knowing that there would be shimmering?
 
pandora's box said:
Now my question for you is:

Would you have bought a 7800 GTX knowing that there would be shimmering?


I know I would not have wasted my money on my 7800 GTX had I known how bad the IQ is in certain games. In GTR, to get my 7800 to have the same IQ as my X800 XT PE I had to use nHancer (no setting in the NVIDIA CP would take away the shimmering).
So now my brand new 7800 GTX runs 20 fps BEHIND my X800 XT PE for the same IQ.
I should have known better; I tried a 6800 Ultra when they came out and it looked like crap too.
 
pandora's box said:
Now my question for you is:

Would you have bought a 7800 GTX knowing that there would be shimmering?

Probably... simply because I now game exclusively on 23-24" widescreen LCDs, so I need as much GPU power as possible to fully use 1920x1200.

But you'd better be sure I'm picking up an R520 to check that out ASAP.
 
I just picked up my 7800 GTX a few days ago. I thought the rumors about 7800 AF being even worse than 6800 AF were just rumors until a few days ago. When I had my 6800, I always played on HQ because of the shimmering. I have given up reading most reviews nowadays; the "Quality" mode is so bad that it looks like the floor is crawling, yet pretty much all reviews used it. It was even worse when the 6800 first launched; it was only after a few months that nVidia released a driver with the negative LOD clamp and gradually improved the HQ mode. That was enough to silence most of us who complained about the shimmering almost a year ago, but the issue was never completely resolved. The negative LOD clamp was a workaround, not a fix.

Anyway, I'm glad it is getting some attention again; hopefully nVidia will completely fix the problem this time.
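For anyone who missed the earlier round of this: the "clamp" simply refuses any negative LOD bias a game asks for. A negative bias picks sharper mip levels than the pixel footprint calls for, which is exactly the kind of undersampling that crawls in motion. A purely illustrative C++ sketch of the idea (my own, not actual driver code):

```cpp
#include <algorithm>

// Illustrative only -- not driver code. The "negative LOD clamp" conceptually
// ignores any application-requested bias below zero, so the sampler never
// picks a sharper (lower-numbered) mip than the pixel footprint warrants.
float effective_lod(float footprint_lod, float app_bias, bool clamp_negative)
{
    float bias = clamp_negative ? std::max(app_bias, 0.0f) : app_bias;
    return footprint_lod + bias; // lower value = sharper mip = more aliasing risk
}
```

That's also why it's only a workaround: it undoes a bias the game added, but it does nothing about the under-filtering in the driver's own "optimized" modes.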
 
pandora's box said:
Now my question for you is:

Would you have bought a 7800 GTX knowing that there would be shimmering?

Yes, but only because there is no alternative. Just goes to show how important competition is in this market! If the R520 delivers >15% better performance than the 7800 GTX and awesome ATi IQ, I'll sell my GTX and get one.
 
5150Joker said:
Yes, but only because there is no alternative. Just goes to show how important competition is in this market! If the R520 delivers >15% better performance than the 7800 GTX and awesome ATi IQ, I'll sell my GTX and get one.


That's a comment that comes across as coming from an upgrade freak. It could be a total piece of garbage, but if it's the fastest and newest I bet you'd keep it. Nothing wrong with that, just not totally rational thinking. I mean honestly, there's very little out there that's going to make your gaming that much less enjoyable without having the latest. And yes, I'll say this when ATI releases their new products as well. Personally I'd rather have playable frame rates and the best IQ over the highest bar-graph-scoring card with reduced IQ, but that's just me.
 
Shifra said:
That's a comment that comes across as coming from an upgrade freak. It could be a total piece of garbage, but if it's the fastest and newest I bet you'd keep it. Nothing wrong with that, just not totally rational thinking. I mean honestly, there's very little out there that's going to make your gaming that much less enjoyable without having the latest. And yes, I'll say this when ATI releases their new products as well. Personally I'd rather have playable frame rates and the best IQ over the highest bar-graph-scoring card with reduced IQ, but that's just me.

The problem is you're assuming I didn't need the extra graphics power. BF2 was killing the framerates on my X800 XT PE @ 600/600, so the 7800 GTX was definitely needed. It even struggled with CSS @ 1680x1050 (my monitor's native res) with AA/AF turned up; the min FPS was hitting 30-35 quite frequently, and now it never goes below 50.
 
In its list of innovations, NVIDIA says the 7800 features more efficient anisotropic filtering. A first important point to keep in mind: because of its architecture (more pipelines than ROPs), the GeForce 7800 GTX partially hides the performance cost of enabling a complex filter.

But that isn't all. NVIDIA has once again modified its anisotropic filtering. It's hard to tell exactly what the differences are, but clearly something is new. Unfortunately, it sometimes leads to a noticeable reduction in quality in motion. We can clearly see a shimmering effect on some parts of textures, more or less obvious depending on the texture's level of detail, its orientation, or even its stage (the first stage is less affected with multi-texturing). This shimmering is less noticeable on the GeForce 6800.

Further in the article:

This bright new portrait can't, however, hide the downside of the new anisotropic filtering, which sometimes reduces graphics quality. This shouldn't happen with a GPU of this calibre; the performance war should respect some limits.

Finally, someone with balls.
http://www.behardware.com/articles/574-5/nvidia-geforce-7800-gtx.html
 
The NVIDIA control panel has been fairly flaky from what I can tell when it comes to turning off the trilinear/AF optimizations. For it to stick 100% of the time in my case, I had to switch off the trilinear/AF optimizations under the global profile and reboot; after that, nearly every game is relatively shimmer-free. (From what I can recall, nHancer did manage to get things cleaned up without a reboot.)

I qualify it as "nearly every game" because certain ground textures do still exhibit the shimmer effect, but in most cases it's a lot more tolerable than with the optimizations on.
 
EekTheKat said:
The NVIDIA control panel has been fairly flaky from what I can tell when it comes to turning off the trilinear/AF optimizations. For it to stick 100% of the time in my case, I had to switch off the trilinear/AF optimizations under the global profile and reboot; after that, nearly every game is relatively shimmer-free. (From what I can recall, nHancer did manage to get things cleaned up without a reboot.)

I qualify it as "nearly every game" because certain ground textures do still exhibit the shimmer effect, but in most cases it's a lot more tolerable than with the optimizations on.


And how much of a performance hit did you take?
 
Yes, that's what I'm trying to get at. Disabling all the optimizations in order to get rid of the shimmering results in a massive performance hit. Essentially you spend 500 bucks on a 7800 GTX, disable the optimizations to get rid of the shimmering, and then end up with the performance of an X850 XT or a 6800 Ultra.
 
Well, I'm sure nV will fix it for the most part. The AF algorithm has changed; when the NV40s came out they had this problem too, and it was later fixed with driver updates.
 
razor1 said:
Well, I'm sure nV will fix it for the most part. The AF algorithm has changed; when the NV40s came out they had this problem too, and it was later fixed with driver updates.

Check the videos in the original article about the shimmering issue (links in the Beyond3D thread); shimmering is present on the 6800 series cards.
 
Yeah, but that's what I don't get: I don't get the shimmering on HQ, it only happens on Quality and it's very minimal. I mean, you really have to look for it. It's not as blatantly obvious as in that hallway demo.
 
Probably quite a bit; for the NV40 wasn't it like 15%?

Heh, well I'm on a 6800 GT SLI rig, and performance has been pretty steady even with all the optimizations off (though without HQ set under the sliders).
 
razor1 said:
Yeah, but that's what I don't get: I don't get the shimmering on HQ, it only happens on Quality and it's very minimal. I mean, you really have to look for it. It's not as blatantly obvious as in that hallway demo.

I just noticed it for the first time on a 6600 GT in BF2. I guess it's like the rainbow effect on DLP HDTVs... some people see it, some people don't...
 
Clamping the LOD bias, turning off all opts, and forcing triple buffering and trilinear filtering fixed the shimmering for me...

And they are all in the nVidia CP.
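For reference, this is roughly what "trilinear filtering" and a non-negative LOD bias mean when an application asks for them itself, rather than via CP overrides. A minimal fixed-function OpenGL sketch of my own (the anisotropy token comes from GL_EXT_texture_filter_anisotropic; this is not the CP's or the driver's code):

```cpp
#include <GL/gl.h>
#include <GL/glext.h> // GL_TEXTURE_MAX_ANISOTROPY_EXT, GL_TEXTURE_FILTER_CONTROL, GL_TEXTURE_LOD_BIAS

// Illustrative sketch: trilinear + 8x AF with the LOD bias held at zero,
// requested directly by the application instead of forced in the CP.
void setup_texture_filtering(GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);

    // Trilinear: linear filtering within a mip level, linear blending between levels.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // 8x anisotropic filtering.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0f);

    // Keep the LOD bias at 0 -- a negative bias sharpens textures at the cost of shimmering.
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL, GL_TEXTURE_LOD_BIAS, 0.0f);
}
```

Triple buffering isn't shown because it's a swap-chain property, not a texture one.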
 
You can't force triple buffering in D3D games with NV's drivers.

Clamping, and setting it to HQ, fixed most of it. But I still get shimmering in BF2, which is VERY annoying. I can live with it, but it happens way too much.
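As far as I understand it, that's because in D3D the back-buffer count is part of the swap chain the game creates itself when it sets up its device, so a CP toggle has nothing to hook into. A minimal D3D9 sketch of how a game would request it (illustrative only; device and window creation left out):

```cpp
#include <windows.h>
#include <d3d9.h>

// Illustrative sketch: in D3D9 the application requests triple buffering itself
// when it fills in its present parameters -- BackBufferCount = 2 means two back
// buffers plus the front buffer.
D3DPRESENT_PARAMETERS make_present_params(HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // use the desktop format in windowed mode
    pp.BackBufferCount      = 2;                       // 2 back buffers => triple buffering
    pp.hDeviceWindow        = hwnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // vsync on, where triple buffering actually matters
    return pp;
}
```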
 
razor1 said:
Well that solves it, it's a bug. And it needs more attention from nV.

What solves it? There is no solution yet. The reboot after clearing the global profile doesn't work, nor is it a good solution to the problem of the filtering being worse than on the 6800.
 
5150Joker said:
What solves it? There is no solution yet. The reboot after clearing the global profile doesn't work, nor is it a good solution to the problem of the filtering being worse than on the 6800.


He means the article I linked to on the previous page proves it's a bug.
 
5150Joker said:
What solves it? There is no solution yet. The reboot after clearing the global profile doesn't work, nor is it a good solution to the problem of the filtering being worse than on the 6800.


Did you try nHancer? It seems to work better through that; I just tried it and the shimmering is very minimal. But then again, I wasn't really getting it before either. On HQ I don't get any shimmering at all with a 6800.

BTW, how do I encode with the XviD codec? I have some videos of UT2004.

I will do some more videos when my 7800 comes in too.
 
It is not a bug in the CP. There is no question the "Quality" mode shimmers a lot whether you use nHancer, RivaTuner, or whatever; it is just more noticeable in some games than others. On a 6800, HQ completely eliminates the shimmering in most games (including UT2004), but there is the odd game that shimmers a lot on nVidia cards regardless. On the 7800 there is still a bit of shimmering even on HQ, but it's not nearly as bad as "Quality" on either card.

I just hope that ATi doesn't implement the same type of "optimization" in their next-gen card. It seems as though whenever one company compromises AF, the other follows suit: nVidia got a brilinear mode, ATi got a brilinear mode; ATi had angle-dependent AF, nVidia followed suit.
 
razor1 said:
Probably quite a bit; for the NV40 wasn't it like 15%?

If I remember right, setting HQ in the CP took the framerate from about 50 avg down to 35 (in Joint Ops), which is a whopping ~30% drop. But don't pin me down on that one, I don't have an NV40 anymore.
With the default Q the 6800U was more or less able to stay in the same region as an X800 XT PE, but after switching to HQ the 6800 got totally slaughtered. Exit 6800 for me.

This is an important point for people who are picky about IQ: they think they're buying a fast card, and yes, it's fast in the default Q mode, but if you want the same IQ as its competitor you have to switch to HQ...
 
I own a 6800 GT and have never noticed shimmering in any game... but then again I'm the type of person that doesn't get his knickers in a twist when a couple of textures don't look quite right or out of place. I am happy as long as my card can run at 1600x1200 with details on high and no AA/AF. Then again, I'm not the type of person who smacks down $500 for a new card expecting it to only last 6-12 months.
 
Apple740 said:
If I remember right, setting HQ in the CP took the framerate from about 50 avg down to 35 (in Joint Ops), which is a whopping ~30% drop. But don't pin me down on that one, I don't have an NV40 anymore.
With the default Q the 6800U was more or less able to stay in the same region as an X800 XT PE, but after switching to HQ the 6800 got totally slaughtered. Exit 6800 for me.

This is an important point for people who are picky about IQ: they think they're buying a fast card, and yes, it's fast in the default Q mode, but if you want the same IQ as its competitor you have to switch to HQ...

I think it should be possible for nVidia to fix the shimmering issue in Quality mode without most of the performance hit. Here is my reasoning: if you disable all the optimizations in Quality mode, you get almost as big a performance hit as with High Quality. However, using "Quality" with all the optimizations disabled does not reduce the shimmering. So most of the performance hit going from Quality to High Quality is not related to the texture aliasing/shimmering, although some of it is.

Quality with all the optimizations disabled is not the same as High Quality; there is definitely another optimization that we can't control with anything other than the image quality slider. If nVidia let us enable the optimizations in High Quality mode (which you currently can't) and reduced the shimmering slightly on the 7800, the issue would be resolved. Well, for most games anyway, since there are a small number of games that shimmer on nVidia cards even on High Quality. There would still be a performance hit, just not nearly as big if they allowed us to enable some optimizations in High Quality mode.
 
The only review to warn about the shimmering on the 7800 was http://www.behardware.com/articles/574-5/nvidia-geforce-7800-gtx.html

The regular shimmering on NVIDIA cards looks to me to be caused by aggressive LOD switching. It happens in ALL modes - Quality and High Quality, LOD bias clamp on or off - everything.

Take a simple mipmap viewer (http://3dcenter.org/downloads/d3d-af-tester.php for D3D - they have another one for OGL on that site) and increment the LOD level. At 0.2 in OGL and 0.4 in D3D, NVIDIA all of a sudden starts switching to a lower-detail mipmap across the entire screen (when AF is selected). In moving images this typically causes a pop as lower-quality textures are forced on.
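If anyone wants to script the same sweep instead of dragging the tester's slider, the LOD level it adjusts corresponds (as far as I can tell) to the D3D9 sampler's mipmap LOD bias. A rough C++ sketch of the same experiment, assuming a device that has already been created elsewhere - my own illustration, not the tester's code:

```cpp
#include <d3d9.h>

// Rough sketch of the experiment described above: step the mipmap LOD bias on
// sampler 0 with AF enabled and watch for the value where the whole screen
// pops to the next (blurrier) mip level.
void apply_lod_bias(IDirect3DDevice9* dev, float bias)
{
    dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);

    // D3D9 takes the float bias as its raw bit pattern packed into a DWORD.
    dev->SetSamplerState(0, D3DSAMP_MIPMAPLODBIAS, *reinterpret_cast<DWORD*>(&bias));
}
```

With proper trilinear blending, a +0.4 bias should shift the blend gradually rather than flip the whole screen at once, which is what makes that screen-wide pop look like the culprit.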
 