Atomic investigates NVIDIA’s long-ignored accusation: ATI's FP16 demotion usage.

evolucion8 (Gawd) | Joined: Mar 15, 2006 | Messages: 917
Source: http://www.atomicmpc.com.au/Feature...s-and-degrading-game-quality-says-nvidia.aspx

Atomic investigates NVIDIA’s long-ignored accusation: Is ATI using FP16 demotion to artificially inflate benchmark scores?

"Demotion doesn't affect all games in the same way
By this point we've tested four games, two that have up to 17 per cent performance increases, two with barely one per cent. Far Cry had a framerate that was so high that the performance was imperceptible between the two settings; however, Dawn of War II sat well below the 60Hz refresh rate of the monitor, and we noticed some clear differences that FP16 Demotion caused.
First of all was the benefit that the faster framerate gave - it made the game 'feel' much faster and less stuttery with both cards, seemingly both more capable of dealing with the workload presented them. It also appeared to reduce the amount of plant and tree pop-in, with foliage on the ground smoothly appearing as the camera panned across the battlefield.

If given the choice between smoother performance at no extra cost and a slight visual change, which would you pick? We had another look at the visual quality of these last two titles for both NVIDIA and ATI hardware, and our results are surprisingly unsurprising."

I don't understand why nVidia is using such cheap PR marketing tactics. They may not have the performance crown or the top overall sales, but their lineup is very competitive, and tactics like this will only taint their image further than it already is.
 
That's OK, the die-hard nvidia fans will eat this up hook, line, and sinker...
 
I couldn't care less, honestly. What is worrying is the contradiction on Nvidia's part and the apparent amount of effort that has gone into this amazing investigation into nothing. Could they really not have these people improving actual driver performance and fixing bugs instead of chasing around crap like this?
 
I don't see how you can disable CAT AI and say that any differences were *ALL* FP16 demotion. I'm sure there are plenty of other optimizations in there. We could just as well say they aren't doing full texture LOD, disable CAT AI, find some differences in screenshots, and chalk it up to that.

There are plenty of other optimizations in CAT AI; otherwise they would just apply it to all games and not have the application profiles they have. So attributing the entire performance difference between AI on and off to FP16 demotion is inaccurate.
 
NVIDIA's official driver optimization's policy is to never introduce a performance optimization via .exe detection that alters the application's image quality, however subtle the difference.

BWAHAHAHAHAHAHA!

Hahaha!
Ha... ha

No shit Nvidia? Do you honestly expect people to pay attention to your accusation after you spew THAT bullshit? Like, for serious? I guess they just hoped that people would forget about the whole 3dmark03 cheating saga (don't get all smug ATI - you and your quack 3, er... quake 3 bullshit)

And it wasn't just in the past, they got busted doing it in Crysis as well: http://www.elitebastards.com/?option=com_content&task=view&id=487&Itemid=29

So... yeah, pot calling kettle black
 
So having finished reading the article this time, here are some interesting tidbits:

amd said:
The requirements are that we verify with QE testing that no visible image quality changes occur when it is enabled.

Sounds a lot like Nvidia's policy...

amd said:
In general we do not apply this optimization to DirectX 10 or DirectX 11 titles since the R11G11B10 format is available to developers in that API and titles can therefore already take advantage of it where appropriate. This is not exposed to developers in DirectX 9 and this is not an option for DirectX 9 developers, so we do the legwork for them.

So this is done for DX9 games that can't use R11G11B10. Why do we still care about DX9 at all? Far Cry 1 and Oblivion are fairly old now; interesting that Nvidia is bringing this up NOW.
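To make that concrete, here's roughly what the choice looks like for a developer on D3D11, where the packed format is exposed directly (just a sketch, not anything from either vendor's driver):

// Sketch only: a D3D10/11 developer can ask for the packed format directly,
// which is the option AMD says DX9 developers don't have.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, nullptr)))
        return 1;

    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = 1920;
    desc.Height           = 1080;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    // Full-fat FP16 HDR target: 64 bits per pixel.
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;
    ID3D11Texture2D* fp16Target = nullptr;
    device->CreateTexture2D(&desc, nullptr, &fp16Target);

    // Packed-float HDR target: 32 bits per pixel, similar range, no alpha.
    desc.Format = DXGI_FORMAT_R11G11B10_FLOAT;
    ID3D11Texture2D* packedTarget = nullptr;
    device->CreateTexture2D(&desc, nullptr, &packedTarget);

    if (packedTarget) packedTarget->Release();
    if (fp16Target)   fp16Target->Release();
    device->Release();
    return 0;
}

On D3D9 there is no equivalent D3DFMT_* format, so the only way a DX9 game ends up on R11G11B10 is if the driver swaps it in behind the scenes - which is exactly what the demotion does.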

amd said:
In this situation you would have to also disable all NVIDIA optimizations in their driver, something that is made even more difficult by the fact that they do not provide an option to do so in their control panel.

Ouch.

amd said:
Given that in their own documents, NVIDIA indicates that the R11G11B10 format "offers the same dynamic range as FP16 but at half the storage", it would appear to us that our competitor shares the conviction that R11G11B10 is an acceptable alternative.

Nice job contradicting yourself there, Nvidia.
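For the record, the "half the storage" part is simple arithmetic; a quick sketch (the 1080p target is just an example size):

// Quick sketch: byte cost of the two HDR target formats at 1920x1080 (example size).
#include <cstdint>
#include <cstdio>

int main() {
    const uint64_t pixels          = 1920ull * 1080ull;
    const uint64_t fp16_bytes      = pixels * 8;  // R16G16B16A16_FLOAT: 4 x 16 bits = 8 bytes/pixel
    const uint64_t r11g11b10_bytes = pixels * 4;  // R11G11B10_FLOAT: packed into 4 bytes/pixel
    std::printf("FP16 target:      %llu KiB\n", (unsigned long long)(fp16_bytes / 1024));
    std::printf("R11G11B10 target: %llu KiB\n", (unsigned long long)(r11g11b10_bytes / 1024));
    return 0;
}

Both formats use 5-bit exponents, which is where the "same dynamic range" claim comes from; what you give up is the sign bit, the alpha channel, and some mantissa precision.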

Atomic) Does AMD support the use of NVIDIA's "AMDDemotionHack" in benchmark testing?
AMD) If this is something that will be made available to their end users then, yes. Otherwise this will be a pure benchmarking effort and no end user will benefit from information gathered from these tests. Also, confirming that there is no visible effect on image quality would be vital to confirming the validity of these results.

Sounds like AMD is being reasonable to me - Nvidia, why the hissy fit?
 
Sounds like AMD is being reasonable to me - Nvidia, why the hissy fit?

Sounds like they're both doing it, so this will probably be a good thing in the end. Even if AMD is forced to change things, then Nvidia will have to change things too. Then something will be put in place to stop them doing this, and there might be some improvements. Or this will be a big waste of time for everyone.
 
Sounds like they're both doing it, so this will probably be a good thing in the end. Even if AMD is forced to change things, then Nvidia will have to change things too. Then something will be put in place to stop them doing this, and there might be some improvements. Or this will be a big waste of time for everyone.

Except they both recommend the change, and it has basically zero IQ impact. The one IQ decrease the test found had nothing to do with HDR, but they mistakenly included it as an IQ decrease from FP16 demotion, which basically means the FP16 demotion resulted in a performance improvement with no impact on IQ.

Sounds like a win-win to me (and again, this is for old as shit DX9 games anyway)

What will happen is ATI won't change what they are doing (and why would they? looks good to me), Nvidia will soon roll it out and make some ridiculous performance improvement claim and continue to not give you the option to disable the optimizations.
 
Nvidia didn't really "accuse" anyone. This isn't meant for the public; it is part of their Reviewer's Guide since GTX 480. Someone 'leaked' this so as to start forum conflicts over basically nothing.

All this guide does is alert reviewers who are still using these old DX9 games, that there are some slight IQ differences between Cat AI enabled or not. Then they give the reviewers a choice to also do the exact same thing in a couple of games by hacking the GeForce 260 drivers.

In the latest GTS 450 Benchmark guide by Nvidia, they officially have issues with these few following DX9 games:

* Dawn of War 2
* Empire Total War
* NFS: Shift
* Oblivion
* Serious Sam II
* Far Cry

The reviewer ultimately makes the choice. i don't use any of these old benches anyway and it is a non issue for 90+% of reviewers.
 
This is an excellent article, and really allows both sides of the debate to come through clearly.

I think I'd rather have the optimizations built into the driver with the option to turn them off. There was an outside case where AMD did what they claimed they would not do: reduced image quality in FarCry using 11_11_10. But in most cases they live up to their word, so the optimizations are a good thing.
 
Dear nVidia, please concentrate on making good hardware instead of making yourself look bad by constantly pointing out things your competitors do while you are doing them at the same time. lol
 
The reviewer ultimately makes the choice. i don't use any of these old benches anyway and it is a non issue for 90+% of reviewers.

A couple sites still use DoW2 and NFS:Shift but the rest are long dead and buried. Don't know why Atomic decided to highlight this now though. It's been out there a long time.
 
BWAHAHAHAHAHAHA!

Hahaha!
Ha... ha

No shit Nvidia? Do you honestly expect people to pay attention to your accusation after you spew THAT bullshit? Like, for serious? I guess they just hoped that people would forget about the whole 3dmark03 cheating saga (don't get all smug ATI - you and your quack 3, er... quake 3 bullshit)

And it wasn't just in the past, they got busted doing it in Crysis as well: http://www.elitebastards.com/?option=com_content&task=view&id=487&Itemid=29

So... yeah, pot calling kettle black
Wait, isn't optimizing based on the .exe exactly what SLI profiles do? :p
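For anyone wondering what ".exe detection" even means in practice, here's a toy sketch (not real driver code; the profile entries are made up) - match the running executable's name against a per-game table and apply its settings:

// Toy illustration only - NOT real driver code. Per-game profiles (SLI, Cat AI,
// or anything else) boil down to "look up the running .exe in a table".
#include <windows.h>
#include <cstdio>
#include <cstring>

struct GameProfile { const char* exeName; const char* note; };

// Hypothetical entries, purely for illustration.
static const GameProfile kProfiles[] = {
    { "FarCry.exe",   "demote FP16 render targets to R11G11B10" },
    { "DOW2.exe",     "demote FP16 render targets to R11G11B10" },
    { "Oblivion.exe", "demote FP16 render targets to R11G11B10" },
};

int main() {
    char path[MAX_PATH] = {};
    GetModuleFileNameA(nullptr, path, MAX_PATH);   // full path of the running .exe
    const char* exe = std::strrchr(path, '\\');
    exe = exe ? exe + 1 : path;                    // keep just the file name

    for (const GameProfile& p : kProfiles) {
        if (_stricmp(p.exeName, exe) == 0) {
            std::printf("Profile hit for %s: %s\n", exe, p.note);
            return 0;
        }
    }
    std::printf("No profile for %s; using default settings.\n", exe);
    return 0;
}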
 
Nvidia didn't really "accuse" anyone. This isn't meant for the public; it is part of their Reviewer's Guide since GTX 480. Someone 'leaked' this so as to start forum conflicts over basically nothing.

All this guide does is alert reviewers who are still using these old DX9 games, that there are some slight IQ differences between Cat AI enabled or not. Then they give the reviewers a choice to also do the exact same thing in a couple of games by hacking the GeForce 260 drivers.

In the latest GTS 450 Benchmark guide by Nvidia, they officially have issues with these few following DX9 games:

* Dawn of War 2
* Empire Total War
* NFS: Shift
* Oblivion
* Serious Sam II
* Far Cry

The reviewer ultimately makes the choice. i don't use any of these old benches anyway and it is a non issue for 90+% of reviewers.



it seems like you can't read between the lines in formal/business correspondence

Nvidia said this "NVIDIA's official driver optimization's policy is to never introduce a performance optimization via .exe detection that alters the application's image quality, however subtle the difference.",

basically implying that AMD cheats while NVIDIA itself is an angel that never cheats.

That is a serious accusation.
 
This should all sound familiar to anyone who was paying attention during the 3DFX vs. Nvidia days, except back then it was 3DFX throwing around the accusations.

If an optimization affects the image quality in a way that only a highly specialized group of graphics card engineers can discern, using special techniques that go beyond human eyes, did it really reduce image quality?

Kind of like if a tree falls in the woods with nobody to hear it...
 
Hey apoppin, long time no see.

Hey man, how is it going? i don't post at the other forum anymore and i am here but rarely.
- you know where to find me, right?

A couple sites still use DoW2 and NFS:Shift but the rest are long dead and buried. Don't know why Atomic decided to highlight this now though. It's been out there a long time.
To stir up controversy and bring traffic to their site? Nvidia's reviewer's guides, which started making this distinction with the Fermi GPUs, perhaps made it stand out a little more to them.
:D

i thought of doing this, but it is such a *nothing* non-issue; i just try to stay aware of new optimizations that may affect IQ. Better, i am investigating the IN-GAME MSAA for Radeons that the devs implemented in a version of Batman: Arkham Asylum that no one seems to be aware of.
:cool:

it seems like you can't read between the lines in formal/business correspondence

Nvidia said this "NVIDIA's official driver optimization's policy is to never introduce a performance optimization via .exe detection that alters the application's image quality, however subtle the difference.",

basically implying that AMD cheats while NVIDIA itself is an angel that never cheats.

That is a serious accusation.
in your own mind, perhaps.
 
Better, i am investigating the IN-GAME MSAA for Radeons that the devs implemented in a version of Batman: Arkham Asylum that no one seems to be aware of.
:cool:
What? Like Rocksteady made it but didn't put it in the final version or something?
 
I couldn't care less, honestly. What is worrying is the contradiction on Nvidia's part and the apparent amount of effort that has gone into this amazing investigation into nothing. Could they really not have these people improving actual driver performance and fixing bugs instead of chasing around crap like this?

:rolleyes: You really think the driver developers are the ones coming up with this crap? It's not like they release an updated driver build and then turn into PR managers.
 
Nvidia didn't really "accuse" anyone. This isn't meant for the public; it is part of their Reviewer's Guide since GTX 480. Someone 'leaked' this so as to start forum conflicts over basically nothing.

All this guide does is alert reviewers who are still using these old DX9 games, that there are some slight IQ differences between Cat AI enabled or not. Then they give the reviewers a choice to also do the exact same thing in a couple of games by hacking the GeForce 260 drivers.

But they are trying to get reviewers to disable *ALL* of ATI's optimizations for reviews, which is a dirty tactic. I'm positive we would see slight IQ differences between Nvidia's optimizations on and off - except you *can't* turn off Nvidia's optimizations.

And they did accuse ATI of cheating. Everything from the way they named the FP16 demotion hack ("AMDDemotionHack_ON.exe") to the paragraph explaining it:
AMD has admitted that performance optimizations in their driver alters image quality in the above applications. ... The correct way to benchmark these applications is to disable Catalyst AI in AMD's control panel.

They may not use the word "cheating" anywhere, but they are absolutely saying ATI is cheating.

What? Like Rocksteady made it but didn't put it in the final version or something?

AA support for ATI cards IS in the game. Just remove the Nvidia vendor check and you have perfectly standard AA code that runs on any card.
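To be clear about what that vendor check is: conceptually it's nothing more exotic than a gate like this (illustrative only, not the game's actual code; 0x10DE is NVIDIA's PCI vendor ID, 0x1002 is ATI/AMD's):

// Illustrative only - not the game's actual code.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    // A gate like this is all "the vendor check" means: the MSAA path itself is
    // vendor-neutral, this test just decides who gets to see the option.
    const bool exposeInGameMSAA = (id.VendorId == 0x10DE);  // NVIDIA only
    std::printf("%s (VendorId 0x%04lX) -> in-game MSAA %s\n",
                id.Description, (unsigned long)id.VendorId,
                exposeInGameMSAA ? "offered" : "hidden");

    d3d->Release();
    return 0;
}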
 
:rolleyes: You really think the driver developers are the ones coming up with this crap? It's not like they release an updated driver build and then turn into PR managers.

Who do you think is feeding PR information, decoding the other company's drivers, looking at their optimizations and how they interact with the environment being displayed, and then implementing an exe to duplicate said optimizations on their own hardware? Unless you honestly believe that NV PR is so amazing that they can do all of the above themselves - I sure as heck don't. I'd prefer any man-hours go into improvements of their products, not into seemingly trying to bust the other camp for something they do themselves.
 
What? Like Rocksteady made it but didn't put it in the final version or something?

No, there is a FINAL version of Batman: Arkham Asylum (that you can buy today) that supports IN-GAME MSAA for Radeons.
:cool:

But they are trying to get reviewers to disable *ALL* of ATI's optimizations for reviews, which is a dirty tactic. I'm positive we would see slight IQ differences between Nvidia's optimizations on and off - except you *can't* turn off Nvidia's optimizations.
Look, i HAVE a Nvidia GTS 450 reviewers guide and they are *NOT* trying to get reviewers to disable *ALL* of ATI's optimizations for reviews. Just the games on the list - old DX9 games that almost no one benches anymore.

AA support for ATI cards IS in the game. Just remove the Nvidia vendor check and you have perfectly standard AA code that runs on any card.
It doesn't run totally right; there are other issues if you spoof the ID.

I mean there is a version of Batman AA where the Dev has implemented MSAA especially for Radeon.
It is in the control panels and you don't have to do anything special but set MSAA in-game for your Radeon.
- you mean no one knows this? There is a big surprise coming soon.
:)
 
No, there is a FINAL version of Batman: Arkham Asylum (that you can buy today) that supports IN-GAME MSAA for Radeons.
:cool:

I mean there is a version of Batman AA where the Dev has implemented MSAA especially for Radeon.
It is in the control panels and you don't have to do anything special but set MSAA in-game for your Radeon.
- you mean no one knows this? There is a big surprise coming soon.
:)
heh...You haven't played the version that was out on release day. Whatever mode you're referring to was not in the initial release. In fact, the settings screen in the version right out of the box calls AA "Nvidia™ Multi Sample Anti Aliasing", as if it were some kind of proprietary technology. Lots of people got upset about this behavior so they removed that wording in the first patch within a few days and just directed users to force it in the ATI control panel.

But Rocksteady/Nvidia didn't care; they got their cheap shot in and bloodied AMD's noses already. The damage had been done in the PR world, and that was all that mattered.
 
heh...You haven't played the version that was out on release day. Whatever mode you're referring to was not in the initial release. In fact, the settings screen in the version right out of the box calls AA "Nvidia™ Multi Sample Anti Aliasing", as if it were some kind of proprietary technology. Lots of people got upset about this behavior so they removed that wording in the first patch within a few days and just directed users to force it in the ATI control panel.

But Rocksteady/Nvidia didn't care; they got their cheap shot in and bloodied AMD's noses already. The damage had been done in the PR world, and that was all that mattered.

No, not in the original release. i have that one also.

There is another official version of Batman: Arkham Asylum that OFFICIALLY *supports MSAA* in game for AMD Radeons
(in the game's control panel; as implemented by the Dev for AMD, unlike the original version).
:rolleyes:
 
Haha, you may have to repeat it a couple more times in different colors before it sinks in.
 
Haha, you may have to repeat it a couple more times in different colors before it sinks in.

It's the font size that i was looking for. Now how do you do the giant flashing fonts?
:D

See what happens when reviewers ignore "guides" :p
- NO ONE appears to know about the version of Batman that supports in-game, dev created, MSAA for Radeons

BtW, the performance improvement at 1920x1200 by using Batman's in-game (dev created) 8xMSAA for Radeons, over using 8xAA forced in CCC is 15%. Just by buying the "right" version of the game.
:cool:
 
Haha, you may have to repeat it a couple more times in different colors before it sinks in.
No need to imply that I'm a dense idiot. He and I were simply talking about different things. He even agreed that the original release of Arkham Asylum didn't support AA on Radeons, and I was not aware until today that there was a version of Arkham Asylum released seven months after the original game release that had an option to enable AA on ATI cards. But I'm not going to re-buy a game just so I can get a graphics feature that I could get by spoofing my vendorID last year.

It's irrelevant though; they got their bad PR in for ATI and let the game sit for seven months with no official support for AA outside of "do it yourself in the ATI CCC. We're not going to enable it in the settings".
 
No need to imply that I'm a dense idiot. He and I were talking about different things. He even agreed that the original release of Arkham Asylum didn't support AA on Radeons, and I was not aware until today that there was a version of Arkham Asylum released seven months after the original game release that had an option to enable AA on ATI cards. But I'm not going to re-buy a game just so I can get a graphics feature that I could get by spoofing my vendorID last year.

It's irrelevant though; they got their bad PR in for ATI and let the game sit for seven months with no official support for AA outside of "do it yourself in the ATI CCC. We're not going to enable it in the settings".

No implication was meant by me whatsoever. It surprised the hell out of me - and, believe it or not - the information which i am attempting to confirm came from a Reviewer's Guide
--- and evidently every single reviewer missed it
:D

If you followed the controversy, Richard Huddy did say a long time ago that the dev might implement a version for AMD cards that sets MSAA in the game's CP
- well, it happened .. and it has been out for quite a while .. and yet all of the benching sites still use the original Batman game, which makes AMD cards look much slower than they are at MSAA.
 
ATI has pretty much always had lower image quality and lower performance.

It should be no surprise that they sacrifice image quality to try and keep up.

Their AF implementation is horrible, just horrible. Cat AI also makes things worse in an effort to keep up.
 
Both companies have been doing this for a very long time. I think we as users have gotten used to it and forgotten about these optimizations, especially since we got uncompressed AF.

One thing to note is that Nvidia has been very vocal about calling ATI out. I'm not sure what the differences are, but ATI was the first to get called out by the [H], and Nvidia got its own bad press after that. So what are the differences between the two companies' optimizations that Nvidia flaunts to make ATI look bad?
 
ATI has pretty much always had lower image quality and lower performance.

It should be no surprise that they sacrifice image quality to try and keep up.

Their AF implementation is horrible, just horrible. Cat AI also makes things worse in an effort to keep up.

That is an insane statement. As of late, yeah, ATI performance hasn't been stellar, but nvidia's image quality was ABYSMAL compared to ATI's until the 8800 series, where they finally caught up. Then there was the whole Crysis image quality thing with nvidia that has been mentioned.
 
ATI has pretty much always had lower image quality and lower performance.

It should be no surprise that they sacrifice image quality to try and keep up.

Their AF implementation is horrible, just horrible. Cat AI also makes things worse in an effort to keep up.

LMAO

I love your sarcasm; you said the entire opposite of reality..

ATI's image quality was far superior until nVidia finally caught up a couple of years ago, and they are still falling behind in some areas.

Back then ATi and nVidia kept taking the crown from one another; there was no LOWER PERFORMANCE at all... or are you talking about the HD 2000/3000 series?

Also, compared to Catalyst AI, I'd rather nVidia had a similar option, so I don't need to keep renaming my applications to work around some broken optimization... :rolleyes:
 
It's the font size that i was looking for. Now how do you do the giant flashing fonts?
:D

See what happens when reviewers ignore "guides" :p
- NO ONE appears to know about the version of Batman that supports in-game, dev created, MSAA for Radeons

BtW, the performance improvement at 1920x1200 by using Batman's in-game (dev created) 8xMSAA for Radeons, over using 8xAA forced in CCC is 15%. Just by buying the "right" version of the game.
:cool:

I spoofed my ATi card as an nVidia card to allow the use of nVidia's MSAA, which is basically selective edge anti-aliasing, and the image quality isn't the same; you can see some jagged edges here and there, which explains why it performs better than the regular MSAA AMD applies through the CCC.

But I'm not going to re-buy a game just so I can get a graphics feature that I could get by spoofing my vendorID last year.

It's irrelevant though; they got their bad PR in for ATI and let the game sit for seven months with no official support for AA outside of "do it yourself in the ATI CCC. We're not going to enable it in the settings".

I just don't see the need to add anti-aliasing support in the game's control panel. Since day one, I was able to use anti-aliasing through CCC with Batman: AA. So why bother adding such an option to the game's control panel if the CCC option works right? Plus, the game is very easy on the graphics department; using an approach like nVidia's in-game MSAA, which is basically selective edge anti-aliasing, will not give you a big enough performance boost to make a difference in playability. Heck, it's an Unreal 3 engine based game.

ATI has pretty much always had lower image quality and lower performance.

It should be no surprise that they sacrifice image quality to try and keep up.

Their AF implementation is horrible, just horrible. Cat AI also makes things worse in an effort to keep up.

You are insane. Come out from under that rock - welcome to 2010!!
 
It's the font size that i was looking for. Now how do you do the giant flashing fonts?
:D

See what happens when reviewers ignore "guides" :p
- NO ONE appears to know about the version of Batman that supports in-game, dev created, MSAA for Radeons

BtW, the performance improvement at 1920x1200 by using Batman's in-game (dev created) 8xMSAA for Radeons, over using 8xAA forced in CCC is 15%. Just by buying the "right" version of the game.
:cool:

Of course the in-game method is faster. Nvidia helpfully put the vendor id check AFTER doing part of the AA work. So ATI cards were stuck doing work that then wouldn't get shown, making them appear slower.

But spit it out already, what is this magical version with in-game AA?
 
Source: http://www.atomicmpc.com.au/Feature...s-and-degrading-game-quality-says-nvidia.aspx

Atomic investigates NVIDIA’s long-ignored accusation: Is ATI using FP16 demotion to artificially inflate benchmark scores?

"Demotion doesn't affect all games in the same way
By this point we've tested four games, two that have up to 17 per cent performance increases, two with barely one per cent. Far Cry had a framerate that was so high that the performance was imperceptible between the two settings; however, Dawn of War II sat well below the 60Hz refresh rate of the monitor, and we noticed some clear differences that FP16 Demotion caused.
First of all was the benefit that the faster framerate gave - it made the game 'feel' much faster and less stuttery with both cards, seemingly both more capable of dealing with the workload presented them. It also appeared to reduce the amount of plant and tree pop-in, with foliage on the ground smoothly appearing as the camera panned across the battlefield.

If given the choice between smoother performance at no extra cost and a slight visual change, which would you pick? We had another look at the visual quality of these last two titles for both NVIDIA and ATI hardware, and our results are surprisingly unsurprising."

I don't understand why nVidia is using such cheap PR marketing tactics. They may not have the performance crown or the top overall sales, but their lineup is very competitive, and tactics like this will only taint their image further than it already is.

Not cheap PR at all.. AMD is starting the IQ hack wars all over again. Do you want each driver you get to be more and more hacked up to compete with the competitor's hacked IQ?

Sure, call each subsequent IQ hack "not that big a deal", then add them up and compare the original image to the newly hacked image.. you'll see the difference much more clearly. It's easy to say "hey, this doesn't look so bad compared to what I started with". But if there were 4 hacks before that, you don't really have the golden reference image to compare against anymore, do you?

ATI *IS* doing this demotion from higher precision to lower precision. That is a fact. They are changing the image without the user's or the developer's permission. The developer asked for one format and AMD gave them something lesser to get higher FPS.
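And to be clear about what I mean by a golden reference: something as crude as this sketch (a made-up helper, assuming two same-sized raw 8-bit RGB captures) is enough to show the drift adding up:

// Crude sketch with a made-up helper: compare an untouched reference capture
// against an "optimized" one, both as same-sized raw 8-bit RGB buffers.
#include <cstdint>
#include <cstdlib>
#include <vector>

struct DiffStats { int maxError; double meanError; };

DiffStats CompareFrames(const std::vector<uint8_t>& reference,
                        const std::vector<uint8_t>& optimized) {
    DiffStats s{0, 0.0};
    const size_t n = reference.size();  // assumes optimized.size() == n
    for (size_t i = 0; i < n; ++i) {
        const int e = std::abs(int(reference[i]) - int(optimized[i]));
        if (e > s.maxError) s.maxError = e;
        s.meanError += e;
    }
    if (n) s.meanError /= double(n);
    return s;
}

One hack may barely move those numbers; stack four or five of them and the drift from the original reference becomes obvious.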
 
But they are trying to get reviewers to disable *ALL* of ATI's optimizations for reviews, which is a dirty tactic. I'm positive we would see slight IQ differences between Nvidia's optimizations on and off - except you *can't* turn off Nvidia's optimizations.

And they did accuse ATI of cheating. Everything from the way they named the FP16 demotion hack ("AMDDemotionHack_ON.exe") to the paragraph explaining it:


They may not use the word "cheating" anywhere, but they are absolutely saying ATI is cheating.



AA support for ATI cards IS in the game. Just remove the Nvidia vendor check and you have perfectly standard AA code that runs on any card.

That's AMD's fault for not implementing the hacks so they can be turned off individually, or better yet, defaulted to off. Why are they lumping a bunch of legitimate optimizations in with a bunch of IQ hacks? That's their fault. They shouldn't be hacking IQ in the first place.
 
ATI *IS* doing this demotion from higher precision to lower precision. That is a fact. They are changing the image without the user's or the developer's permission. The developer asked for one format and AMD gave them something lesser to get higher FPS.

Did you ever bother to read the whole article? Far Cry used FP16 demotion - could you pinpoint a difference in HDR image quality? Developers usually target an audience and certain hardware, and testing the game with every card in existence just isn't practical, so it is the GPU vendor's job to make sure the game runs optimally on their hardware - and nVidia supports this too!!!

I don't understand the fuss. When pure FP16 HDR is used, it tends to involve a lot of calculations that may be unnecessary in some scenarios, especially under DX9. AMD stated clearly that there are scenarios where FP16 demotion is used and will not cause any image quality degradation (FP16 seems to be overkill in those cases), while in other places FP16 demotion would incur an image quality drop, and hence AMD will not use it there.
 