More shots of X800's brilinear filter in action & IQ degradation

This is really blown out of proportion. This is NOT the end of the world. Honestly, it's so unimportant that it borders on being a non-issue completely.

I don't see why a developer shouldn't include the option for full trilinear filtering, but if they don't, it's definitely not a reason to say their cards suck or whatever.

$100 says nobody here could correctly pick out the trilinear images in a series of 20 brilinear vs. trilinear images.

The latest craze seems to be about nothing but IQ and horrible framerates. I guess that's nice if you want a digital art gallery and not much more but some people actually want to play the games. In a not-so-strange correlation that I have noticed, the people who usually brag the most about their high image quality are usually the worst gamers out.
 
creedAMD said:
My works done here. Have fun with the thread.

OK, I'd just like to request that next time you keep things on a technical level instead of resorting to attacks on my personal credibility.

I was done with this thread a long time ago but people kept bumping it :)
 
Most people, as far as I can see, will see it as just more blatant nitpicking :p You are making the assumption that there is rampant IQ degradation in the X800 lineup :rolleyes: I am sure most review sites would've torn ATI a new one if it was as drastic as you are trying to make it sound :p
All I can see is across-the-board good reviews of both cards and not a word about this supposed IQ degradation?
 
kick@ss said:
This is really blown out of proportion. This is NOT the end of the world. Honestly, it's so unimportant that it borders on being a non-issue completely.

Some people seemed to make this thread out to be a huge attack on ATI for some reason, which it was not.

I don't see why a developer shouldn't include the option for full trilinear filtering, but if they don't, it's definitely not a reason to say their cards suck or whatever.

I agree completely.

$100 says nobody here could correctly pick out the trilinear images in a series of 20 brilinear vs. trilinear images.

I dunno about that, because bri shots do have less blending, and you can usually tell they are sharper near the mip transitions when analyzed - assuming you told people what the differences between bri and tri are. What you could do is ask people which they like better, and then I bet you would get mixed responses, namely because some people even prefer the look of bilinear to trilinear, never mind brilinear, due to the sharper textures.

The latest craze seems to be about nothing but IQ and horrible framerates. I guess that's nice if you want a digital art gallery and not much more but some people actually want to play the games. In a not-so-strange correlation that I have noticed, the people who usually brag the most about their high image quality are usually the worst gamers out.

Well in this case, the card in question (x800) has fps to spare for eye candy. In the picture this thread started with, the user was getting 150fps with full trilinear and no opts. Since the card is so fast you aren't losing anything in terms of fps.
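Just to make "less blending" concrete, here's a rough back-of-the-envelope sketch of the textbook idea. To be clear, this is not ATI's or Nvidia's actual hardware algorithm (neither company has published it); the blend-band width below is a number I made up purely for illustration:

```python
# Rough sketch of the *textbook* idea behind trilinear vs. "brilinear"
# filtering. This is NOT ATI's or Nvidia's actual hardware algorithm;
# the blend-band width below is an invented number, purely illustrative.

def trilinear_weight(lod):
    """Full trilinear: always blend between the two nearest mip levels
    by the fractional part of the LOD."""
    return lod - int(lod)            # 0.0 at one mip, 1.0 at the next

def brilinear_weight(lod, band=0.25):
    """'Brilinear': only blend inside a narrow band around the mip
    switch point; elsewhere snap to one mip (i.e. plain bilinear)."""
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0                   # pure lower mip -> sharper texture
    if frac > hi:
        return 1.0                   # pure upper mip
    return (frac - lo) / (hi - lo)   # short, steep blend -> harsher transition

if __name__ == "__main__":
    for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
        print(f"LOD {lod}: tri={trilinear_weight(lod):.2f}  "
              f"bri={brilinear_weight(lod):.2f}")
```

Away from the transitions brilinear is effectively plain bilinear (hence the sharper textures), and what blending it does do is squeezed into a narrower, steeper band, which is why the transitions can look harsher when you pixel-peep still shots.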
 
@trapine said:
Most people, as far as I can see, will see it as just more blatant nitpicking :p You are making the assumption that there is rampant IQ degradation in the X800 lineup :rolleyes: I am sure most review sites would've torn ATI a new one if it was as drastic as you are trying to make it sound :p
All I can see is across-the-board good reviews of both cards and not a word about this supposed IQ degradation?

Both cards have excellent IQ. The x800 just isn't meeting its full potential in terms of filtering, when it is perfectly capable of full trilinear and has the fps to spare to use it.
 
You know, some of you folks are trying to keep it related to the topic and even avoiding attempts to let it degrade into another attack thread. To these folks I truly say congrats; that's how it should be done, please continue to be the example for how threads should be handled. To some of the others, and you should know who you are, I restate this fact: if you do not agree with what is being posted, then by all means provide data, arguments, or information to counter what gets posted. If all you have to offer is "that's bullshit" or "go away", you are thread crapping. If you don't like the topic or what you see, then close the thread; NOBODY forces you to read or reply. However, you will accept the responsibility for your posting(s) in this forum. Using terms like "Fanboy", "Nvidiot", etc. does nothing but cause problems, and if you're posting to cause problems, that has no place here. Keep it to the tech and not each other. This isn't personal, don't make it that way.

Now on a side note, I purchased an x800XT-PE when they became available at CompUSA in Mesquite. I paid the full retail price of $499.00 for that card. Now, if you happen to be someone who feels my purchasing dollars may have been better spent on another product, and that my vid card isn't worth its money because of a competing product and whatever IQ, filtering, etc., you could be right, you could be wrong, but I do know this: it's definitely faster than my last card. I can game at higher resolution with increased AA/AF turned on and I'm enjoying a better gaming experience. I may or may not have THE best card out there, but guess what, I'm satisfied. :)
 
CIWS said:
Now on a side note, I purchased an x800XT-PE when they became available at CompUSA in Mesquite. I paid the full retail price of $499.00 for that card. Now, if you happen to be someone who feels my purchasing dollars may have been better spent on another product, and that my vid card isn't worth its money because of a competing product and whatever IQ, filtering, etc., you could be right, you could be wrong, but I do know this: it's definitely faster than my last card. I can game at higher resolution with increased AA/AF turned on and I'm enjoying a better gaming experience. I may or may not have THE best card out there, but guess what, I'm satisfied. :)

I think "the best card out there" this generation is much more a matter of personal preference than any other generation we've seen, which is why minutia is being analyzed.
 
The x800 just isn't meeting its full potential in terms of filtering

:eek: Well, if this is the case, and most reviews I have read have put the 6800's and X800's IQ at a dead heat, then when ATI pulls their optimizations, by all rights the IQ should again be better than the 6800's? That's all I can glean from that statement? :confused:
 
@trapine said:
:eek: Well, if this is the case, and most reviews I have read have put the 6800's and X800's IQ at a dead heat, then when ATI pulls their optimizations, by all rights the IQ should again be better than the 6800's? That's all I can glean from that statement? :confused:

Nah, it's just that brilinear is a really good optimization, but there are cases where it could look even better, like in the case that started this thread, where I felt the mip transitions could be less harsh in the bri shot. Would I notice it in game? Probably not. Therefore, in general, I'd say both offer comparable IQ if you aren't picky, agreeing with the review sites.

Obviously a full trilinear filter by ATI isn't going to look any better than a full trilinear filter by Nvidia, since they are virtually the same thing.

In the end, it's like this right now:

x800 offers:
bilinear, brilinear

6800 offers:
bilinear, brilinear, full trilinear

Though full trilinear isn't always useful, it is a nice option to have for maximum blending, and there is no reason the x800 couldn't have it. And I hope future cards by both NV and ATI have it, too.
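And on the "really good optimization" part, here's a crude estimate of why it saves work: full trilinear fetches texels from two mip levels for essentially every pixel, while brilinear only pays that cost inside the blend band. Again, the 25% band and the uniform LOD-fraction assumption are made up for illustration, not anyone's real hardware numbers:

```python
# Crude texel-fetch estimate per filtered pixel. Textbook numbers only:
# bilinear reads a 2x2 quad (4 texels), trilinear reads a 2x2 quad from
# each of two mip levels (8 texels). The 25% blend band and the uniform
# LOD-fraction assumption are made up for illustration; real hardware
# (caches, actual LOD distribution, etc.) is far messier.

BILINEAR_TEXELS = 4    # one mip level
TRILINEAR_TEXELS = 8   # two mip levels

def avg_texels_brilinear(band=0.25):
    """Average texels per pixel if only pixels inside the blend band
    pay the two-mip cost."""
    return band * TRILINEAR_TEXELS + (1 - band) * BILINEAR_TEXELS

if __name__ == "__main__":
    print(f"full trilinear: {TRILINEAR_TEXELS} texels/pixel")
    print(f"brilinear     : ~{avg_texels_brilinear():.1f} texels/pixel on average")
    print(f"bilinear      : {BILINEAR_TEXELS} texels/pixel")
```

Under those made-up assumptions brilinear averages out to roughly 5 texel reads per pixel versus 8 for full trilinear, which is the kind of bandwidth saving that shows up as "free" fps.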
 
tranCendenZ said:
I think "the best card out there" this generation is much more a matter of personal preference than any other generation we've seen, which is why minutia is being analyzed.


Going by past vid card releases and these "competitions", both companies will likely continue to improve on driver sets and fix any issues they can. So as drivers mature we will certainly see some of this change. But I will be brutally honest here. The one game I'm really giving a crap about performance-wise is Doom 3, which unfortunately isn't out yet. When it does finally get released I will get to find out how my current hardware does with it, and if I have jumped the gun on a high-dollar purchase. If I did, that's my fault, but I still can't believe that it will suck. If it does, I will probably purchase a different product based on the review data that will be out shortly upon Doom's release. However, Doom 3, and more specifically its gaming engine, will spawn a whole host of games for years to come, and this is where my biggest concern lies.
 
CIWS said:
Going by past vid card releases and these "competitions", both companies will likely continue to improve on driver sets and fix any issues they can. So as drivers mature we will certainly see some of this change. But I will be brutally honest here. The one game I'm really giving a crap about performance-wise is Doom 3, which unfortunately isn't out yet. When it does finally get released I will get to find out how my current hardware does with it, and if I have jumped the gun on a high-dollar purchase. If I did, that's my fault, but I still can't believe that it will suck. If it does, I will probably purchase a different product based on the review data that will be out shortly upon Doom's release. However, Doom 3, and more specifically its gaming engine, will spawn a whole host of games for years to come, and this is where my biggest concern lies.

I am actually hugely looking forward to Doom3 as well, far more than Half Life 2 or any other game. I long for the days of creepy dark fps games and deathmatch that wasn't just 32 people blowing up whoever comes in front of them first. Doom3 sounds like it will bring the hunting aspect back to fps multiplayer deathmatch.

This is part of the reason I went with the 6800. The x800 series obviously isn't going to suck in Doom3 due to its raw horsepower, but I think the 6800 will have the edge. Mainly because of Nvidia's excellent OpenGL performance in general, combined with the fact that Doom3 is going to use exclusive features on NV hardware such as Ultrashadow II for more efficient shadow rendering, and I assume fp16pp to speed up shaders when fp32 is not necessary. I have a feeling that NV will have the edge with Doom3 because of this. Not sure how big that edge will be tho.
 
I think Doom3 will be better on the 6800 as well; it's one of the factors I took into account in purchasing the 6800GT. (with someone else's cash) :)
 
Sorry to sort of thread-jack, tranCendenZ, but it will be interesting to see how soon games start to trickle out utilizing the Doom3 engine. It's definitely going to be an interesting last couple of months of the year where graphics are concerned :eek: :D
 
@trapine said:
Sorry to sort of thread-jack, tranCendenZ, but it will be interesting to see how soon games start to trickle out utilizing the Doom3 engine. It's definitely going to be an interesting last couple of months of the year where graphics are concerned :eek: :D

This is the way I see it re: engines:

Doom3 engine will have the edge on NV hardware (performance)
HL2 engine will have the edge on ATI hardware (performance & IQ)
FarCry engine will have the edge on NV hardware (IQ once its all done and patched up, performance potential)

But yeh I think there will be some excellent games coming out this fall.
 
Most people are looking forward to Doom III.

Has a date been confirmed yet for HL2?
 
tranCendenZ said:
This is part of the reason I went with the 6800. The x800 series obviously isn't going to suck in Doom3 due to its raw horsepower, but I think the 6800 will have the edge. Mainly because of Nvidia's excellent OpenGL performance in general, combined with the fact that Doom3 is going to use exclusive features on NV hardware such as Ultrashadow II for more efficient shadow rendering, and I assume fp16pp to speed up shaders when fp32 is not necessary. I have a feeling that NV will have the edge with Doom3 because of this. Not sure how big that edge will be tho.

At the time I bought my card, Nvidia's newest products were not yet available, so the only other choice I had was to wait. I don't always do the "wait" thing real good (just ask Kyle).
<personal opinion> Because of the Quakecon 2004 announcement of a Doom 3 tourney sponsored by Nvidia and paying big bucks, I'm betting on the fact that ID Software feels the Nvidia product was the right one to premiere with its new game, so the OpenGL is probably better there and with Doom 3. </personal opinion> But as with all things, we won't know for sure until the game gets out and everyone in the review business gets a chance to provide us some real data.
 
Actually, things ain't looking too good for ATI at the moment. No X800XT parts in mass, and possibly the biggest game launch in god knows how long :eek: And lots of 6800GT parts starting to flood the market :eek: I can feel some dark days a-commin' at tha old Rage3d Barn! :eek: :p LMAO
 
Personally, I play all my games at 1024x768x32 @ 2X FSAA and 8X AF, because at higher resolutions I can't run 85Hz refresh and it hurts my eyes. And my 9800 PRO does this fine.

I don't see why all of a sudden in the last 6 months everyone HAS to be running 1600x1200x32 with 2-4X FSAA and 8-16x AF. Is it really that huge of a difference?

I guess I need to get a new monitor, eh?
 
Rizen said:
Personally, I play all my games at 1024x768x32 @ 2X FSAA and 8X AF, because at higher resolutions I can't run 85Hz refresh and it hurts my eyes. And my 9800 PRO does this fine.

I don't see why all of a sudden in the last 6 months everyone HAS to be running 1600x1200x32 with 2-4X FSAA and 8-16x AF. Is it really that huge of a difference?

I guess I need to get a new monitor, eh?

Or you could just use 8xS if you get a new vidcard... Sweet hybrid SS AA mode that gets you a lot of the texture detail the higher resolutions do.

Check out the difference at 1024x768 in Call of Duty with regular 4x multisampling versus 8xS (8xS is the mouseover):
http://www.reflectonreality.com/images/nv40/1024768/cod.html
(trilinear, 8xAF, 61.36 forceware, 6800gt)

I tend to play CoD at 1280x1024 with 8xS because IMO the textures look better with less aliasing than 1600x1200 at 4x rgms.
 
Lord of Shadows said:
lol nvidia got slammed for IQ tweaks, and whenever ati does it no one wants to hear.

Mipmap blending never really bothered me, but that doesn't mean I want to get lied to.
Cheat the cheaters!

Unless it's ATI...
 
Off topic: Hey, I'd love for someone to show a colour-enhanced difference between 16xAF and 32xAF to see if it's worth using. I'm sure the difference from 8x to 16x AF, shown with the same colour-enhancement technique, would be a lot more prominent than the seemingly tiny visual differences between ATi's optimized and textbook trilinear.

PS: There is one map in UT2004 where I can see visual thresholds, and that is on AS-Robotfactory, on the floor grates just before you go in to destroy the AI generator. Funny thing is that it is visible even on a Radeon 9700 Pro, which is supposed to do textbook trilinear (possible map texture or lighting bug?)
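For what it's worth, the usual way reviewers make this stuff visible is coloured mipmaps: tint every mip level a different solid colour so the transition bands (and how far AF pushes them back) jump right out. Something along these lines would do it; this is just a rough sketch using Pillow with made-up colours, sizes and file names, not any review site's actual tool:

```python
# Rough sketch of the "coloured mipmaps" trick reviewers use to make mip
# transitions (and AF behaviour) visible. Colours, sizes and file names
# are made up for illustration; real tests usually swap these into a
# game's textures or just use a tool/driver option that colours the mips.
from PIL import Image

MIP_COLOURS = [
    (255, 0, 0),     # mip 0 - red
    (0, 255, 0),     # mip 1 - green
    (0, 0, 255),     # mip 2 - blue
    (255, 255, 0),   # mip 3 - yellow
    (255, 0, 255),   # mip 4 - magenta
]

def make_coloured_mip_chain(base_size=256):
    """Return one solid-colour image per mip level, halving each time."""
    chain, size = [], base_size
    for colour in MIP_COLOURS:
        chain.append(Image.new("RGB", (size, size), colour))
        size = max(1, size // 2)
    return chain

if __name__ == "__main__":
    for level, img in enumerate(make_coloured_mip_chain()):
        img.save(f"mip_level_{level}.png")
```

Swap something like these in for a texture's mip chain (or use a driver/tool option that does it for you) and brilinear shows up as narrow, abrupt colour bands where full trilinear shows broad, smooth gradients.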
 
tranCendenZ said:
I actually changed my mind long before NV40 came out, and posted in support of options in numerous forums late last year and early this year. But yeh at first I had the same reaction as many ATI users have now: "Who cares if I can't see it?" Then I realized I was the one who was losing out on graphic options and changed my tune, because I wanted the option of full trilinear.

Is this really what it comes down to? Since you have no facts to debate me with, you have to attempt to attack my personal credibility? I'm sure.. No, I know I could dig up lots of lovely stuff about your "bias," but I'm not going to stoop to that level.

What you are missing "ruined" is that:

A) there is nothing wrong with the opts ATI is using.
B) nothing has been submitted to support your theory...
C) ATI has been using this very optimization that is being called into question for TWO YEARS.

Yes, given enough time anyone can "locate a flaw" in the system. You are trying very hard to paint a picture claiming that there is any relevance between nVidia's opts and ATI's....

There isn't.

Had you simply stated.....

"ATI should also have the option in their CP to force full trilinear."

...and left it at that, you would have been credible......

You clearly are not.....
 
Blackwind said:
What you are missing "ruined" is that:

You like calling me ruined better? lol My handle here is not ruined, it's tranCendenZ, you can see that on the left. It's much [H]arder than ruined :) Or is this just another attempt to stab at my credibility, as creed was doing? :rolleyes:

A) there is nothing wrong with the opts ATI is using.
Agreed.

B) nothing has been submitted to support your theory...
On the contrary, several sites and individuals have pointed out both theoretical and real-life shortcomings of brilinear filtering. You can find those links in the previous pages of this thread.

C) ATI has been using this very optimization that is being called into question for TWO YEARS.

Only on their middle-of-the-line card (9600), which never had its IQ scrutinized on a regular basis like the 9800XT, which was full trilinear. Again, even with Nvidia's nv3x optimization, which was harsher than the nv40/x800 optimization, people were not sensitized to it until a website (3dcenter.de) found it and pointed it out.

Yes, given enough time anyone can "locate a flaw" in the system. You are trying very hard to paint a picture claiming that there is any relevance between nVidia's opts and ATI's....

? I'm trying to paint a picture that all users should have the option of full trilinear filtering.

Had you simply stated.....

"ATI should also have the option in their CP to force full trilinear."

I did state this.

...and left it at that, you would have been credible......

And you think no one would have asked, "Why is that, tranCendenZ?" Since when can you make a statement like that with no reason or evidence whatsoever to back it up? I'm just going to start a thread saying "btw, ati should have trilinear in their cp, no particular reason, kthxbye!" Doesn't sound realistic to me.

You clearly are not.....

Perhaps in your eyes. I presented evidence from numerous sources indicating that full trilinear should be made an option as well. Don't buy into the hype that "everything is the same." More options are always better, and brilinear is doing less blending than full trilinear. Those are the facts, and if you scroll back you can see plenty of evidence to back it up.

Based on your statements to "counter" me, it also looks like you haven't actually read what I posted, and instead only read what the people who have attacked me have posted. Read through what I've posted from the beginning of the thread, because I've already agreed with several of the points you've used to "counter" me earlier in the thread.
 
tranCendenZ said:
Nah, it's just that brilinear is a really good optimization, but there are cases where it could look even better, like in the case that started this thread, where I felt the mip transitions could be less harsh in the bri shot. Would I notice it in game? Probably not. Therefore, in general, I'd say both offer comparable IQ if you aren't picky, agreeing with the review sites.

Obviously a full trilinear filter by ATI isn't going to look any better than a full trilinear filter by Nvidia, since they are virtually the same thing.

In the end, it's like this right now:

x800 offers:
bilinear, brilinear

6800 offers:
bilinear, brilinear, full trilinear

Though full trilinear isn't always useful, it is a nice option to have for maximum blending, and there is no reason the x800 couldn't have it. And I hope future cards by both NV and ATI have it, too.
ATI's and Nvidia's filtering methods are not the same thing; they are different chips. Could they both look good and similar? Yes, but they are accomplished in different ways, since they use different algorithms.
Could you say that Nvidia's brilinear was as good as ATI's on the 59xx series cards? I really doubt it.
The general consensus on Beyond3D in that 60-plus-page thread was that users couldn't tell the difference, so it is a good algorithm. Is it full trilinear? No. Can the option be added to the control panel? Yes, or so said ATI in their chat. And many people were really mad they were not told of it. Will that influence my buying decision? Not really.

I think the bottom line is that both cards are really great; my options so far are either a GT or an XT. No sense in getting an Ultra.
 
Have you guys actually read what I have posted so far or are you just going on the assumption that I am making some big attack on ATI? I think you should read through the thread if you haven't; the "big ati attack" was fabricated by other users.

Because half the stuff you are saying to "counter" me with is stuff I agree with, it seems like you don't know what I've actually posted in the thread so far.

Apparently some users thought it would be better to turn this thread into a flamewar instead of discussing the technical differences between brilinear and trilinear. That's a shame, but if you read through the thread you can cull some good info and find where I actually stand (as opposed to what certain individuals accused me of).
 
tranCendenZ said:
You like calling me ruined better? lol My handle here is not ruined, it's tranCendenZ, you can see that on the left. It's much [H]arder than ruined :) Or is this just another attempt to stab at my credibility, as creed was doing? :rolleyes:

It is not an attempt. It is pointing out very clearly that you are not someone who can be counted as a credible source...


tranCendenZ said:
On the contrary, several sites and individuals have pointed out both theoretical and real-life shortcomings of brilinear filtering. You can find those links in the previous pages of this thread.

Theories are not fact until proven..... Nothing you have pointed out clearly supports these theories. They merely attempt to point out flaws in an older game that could easily be attributed to nothing involving ATI's optimization whatsoever. This could be due to anything from drivers to the game itself on new hardware. We have seen this occur time and time again in the past and will continue to see these types of events in the future.


tranCendenZ said:
Only on their middle-of-the-line card (9600), which never had its IQ scrutinized on a regular basis like the 9800XT, which was full trilinear. Again, even with Nvidia's nv3x optimization, which was harsher than the nv40/x800 optimization, people were not sensitized to it until a website (3dcenter.de) found it and pointed it out.

These cards were reviewed no less than the major offerings. This scrutiny is nothing short of a biased attempt at proving an "issue" and "problem" with ATI's opts. When neither reviewers nor consumers notice a lack in IQ for over two years, I would call those very good opts.


Not a problem...



tranCendenZ said:
And you think no one would have asked, "Why is that, tranCendenZ?" Since when can you make a statement like that with no reason or evidence whatsoever to back it up? I'm just going to start a thread saying "btw, ati should have trilinear in their cp, no particular reason, kthxbye!" Doesn't sound realistic to me.

Oh, I'm sure they would have questioned. You made your bed, sleep in it....



tranCendenZ said:
Perhaps in your eyes. I presented evidence from numerous sources indicating that full trilinear should be made an option as well. Don't buy into the hype that "everything is the same." More options are always better, and brilinear is doing less blending than full trilinear. Those are the facts, and if you scroll back you can see plenty of evidence to back it up.

Actually no, you didn't back up or prove anything..... No one is disputing that trilinear is technically better than brilinear. No one has disputed that having more options is a good thing. What is being disputed is your claim that there is an issue with ATI's opts.

Again, there isn't.

Not one shred of evidence has clearly demonstrated that any flaw or issue is clearly a fault within ATI's opts. These incidents can be attributed to many other things. If there were a clear-cut case of fault with them, you would have every reviewer out there screaming from the rafters, no less loudly than when Nvidia was found to be clearly cheating with theirs.

Case closed.
 
Blackwind said:
It is not an attempt. It is pointing out very clearly that you are not someone who can be counted as a credible source...

I hate to break it to you, but I'm just as credible as anyone else who regularly posts in a videocard forum, including yourself. In fact, if you look at my posting history, you can see I am *more consistent* than most on the brilinear issue. I said back then, and I still say now, that it's a great optimization because it usually can't be noticed in game, while others back then were screaming that it was the devil. Now others are screaming that it's just fine, and I agree. But why not make full trilinear available if the card can handle it?

BTW, did we not agree not to bring personal attacks into this? Flamewars are not useful.

Theories are not fact until proven..... Nothing you have pointed out clearly supports these theories. They merely attempt to point out flaws in an older game that could easily be attributed to nothing involving ATI's optimization whatsoever. This could be due to anything from drivers to the game itself on new hardware. We have seen this occur time and time again in the past and will continue to see these types of events in the future.

Right, but theories + real-life gameplay = evidence, which is what we have here, not just with this ut2k3 shot but with other games as well, such as FarCry. There are synthetic tests showing the difference as well. You can look back in the thread for more of that info.


These cards were reviewed no less than the major offerings. This scrutiny is nothing short of a biased attempt at proving an "issue" and "problem" with ATI's opts. When neither reviewers nor consumers notice a lack in IQ for over two years, I would call those very good opts.

Well, not having full trilinear is an issue in itself, regardless of the quality of ATI's opts. A bilinear/trilinear mix will never provide as much blending as full trilinear, simply because it doesn't blend over as much of the mip range.

Oh, I'm sure they would have questioned. You made your bed, sleep in it....

Right, so I provided the evidence to start with rather than afterwards. That is the smart thing to do. I'm not afraid of controversy.


Actually no, you didn't back up or prove anything..... No one is disputing that trilinear is technically better than brilinear. No one has disputed that having more options is a good thing. What is being disputed is your claim that there is an issue with ATI's opts.

Not one shred of evidence has clearly demonstrated that any flaw or issue is clearly a fault within ATI's opts. These incidents can be attributed to many other things. If there were a clear-cut case of fault with them, you would have every reviewer out there screaming from the rafters, no less loudly than when Nvidia was found to be clearly cheating with theirs.

The evidence was already presented several pages ago. It's not worth screaming about because the difference is not as big as it was with nv3x; it's much more subtle. The brilinear optimizations now are much better than they were for nv3x. That still doesn't make them a substitute for full trilinear, and that is exactly the issue with ATI's optimizations. I'm choosing not to stick my head in the sand, that's all.

In fact, most of the "screaming" in this thread has been done by people vehemently opposed to debating any difference between brilinear and trilinear filtering.
 
Look.... this is a non-issue. You guys should stop arguing about this. The way to sort all of this out is to get ATi and NVIDIA to allow for 100% apples-to-apples comparisons with their drivers. In fact, there should be a check mark for it so that it defaults to the apples-to-apples comparison. This is not hard to do. As far as the opts go, they are not important as long as you cannot see the differences when the game is in motion. I DO NOT care about seeing flaws in screenshots because that is not how I PLAY my games. If I cannot tell that opts are going on when I am actually playing my games, then there are no problems. For benchmarking, though, the settings for the cards should be identical. ATi and NVIDIA both should have an option that is actually called Apples to Apples in the CP for testing both cards head to head. The option should force the clock speeds of the card to default speeds and change all quality settings so that they match the competition.
 
trudude said:
Look.... this is a non-issue. You guys should stop arguing about this. The way to sort all of this out is to get ATi and NVIDIA to allow for 100% apples-to-apples comparisons with their drivers. In fact, there should be a check mark for it so that it defaults to the apples-to-apples comparison. This is not hard to do. As far as the opts go, they are not important as long as you cannot see the differences when the game is in motion. I DO NOT care about seeing flaws in screenshots because that is not how I PLAY my games. If I cannot tell that opts are going on when I am actually playing my games, then there are no problems. For benchmarking, though, the settings for the cards should be identical. ATi and NVIDIA both should have an option that is actually called Apples to Apples in the CP for testing both cards head to head. The option should force the clock speeds of the card to default speeds and change all quality settings so that they match the competition.

But that really is never going to happen, so yes, let's just stop this thread.
 
As far as optimizations go, this one isn't all that bad, at least certainly not bad enough to have a 4-page thread about. ;)

50 more fps, for that minor change. If I'm understanding this correctly, that's a pretty sweet deal. I can live with a small loss of image quality to make it more playable (maybe you don't need the extra performance at 10x7, but what about 16x12?).
 
SnakEyez187 said:
But that really is never going to happen, so yes, let's just stop this thread.

No one is forcing you to read or post in this thread, or any other on this forum. If you do not like reading it anymore, then use your browser, close it, and your problem is solved. Making a post in it only bumps it back up to the top and allows someone to respond to your post as well.
 
Lord of Shadows said:
lol nvidia got slammed for IQ tweaks, and whenever ati does it no one wants to hear.

Mipmap blending never really bothered me, but that doesn't mean I want to get lied to.
ditto
 
Troll :rolleyes:
When Nvidia disables optimizations by DEFAULT, your ATI IQ charades will have merit; until then, you're just starting a flame war.
The default settings are pretty horrible IMO (I run a 6600GT); even compared to my 8500, the texture aliasing was pretty bad in Far Cry with settings at 4xAA and 16xAF.
When I put the driver into HQ mode, it still wasn't that good.
How could you possibly say the X800 Pro is incredibly fast? Trying to make it seem like it is, so ATI is doing unneeded "optimizations"?
If you really want to make a case for yourself, find a video of an X800 in action in a game, and make a video of your 6800GT in the same space, with both drivers set to HQ mode (disable all possible optimizations).
If ATI didn't care about IQ, they would still be using the AF scheme the 8500 uses ;)
edit- wtf.. stupid bumper...
 
That's the problem with ruined's threads! They tend to hang around like a bad fart, knocking everyone over the head :eek: LMFAO!
To me, having both an X800 and a 6800, I will be buggered if I can tell the difference. Apart from the drivers, where the (Poorware) drivers still leave a lot to be desired :p
 
glynn said:
That's the problem with ruined's threads! They tend to hang around like a bad fart, knocking everyone over the head :eek: LMFAO!
To me, having both an X800 and a 6800, I will be buggered if I can tell the difference. Apart from the drivers, where the (Poorware) drivers still leave a lot to be desired :p
I don't see a problem with the Forceware drivers, 'cept the lack of WHQL drivers compared to ATI.
Does CCC have the option to change AA and AF via the taskbar icon? I never had a chance to use them since I only used an 8500, cuz that's cool, but what's stupid with Nvidia's taskbar icon is that you have to set the res THEN the refresh rate, instead of doing both while you're setting the res.
Oh, and the latest betas on Nvidia's site.. the Quadro ones; I downloaded the non-WHQL one (wasn't sure if the other file, the one called Quadro, was only for Quadros), and they have major artifacting in Live for Speed with my 6600GT.
 