no AA with HDR... but why?


leSLIe

When you enable HDR in a game, you can't use AA. I know that, but can someone explain to me the reason that happens, the "techy" reason? Thanks.
 
Oh!! Shame on nVidia :mad: I'm gonna throw all my nVidia hardware out right now!
 
Nah, just open up your window and start launching those cards and mobos out. :p
 
Bona Fide said:
In short, nVidia hasn't yet released a card whose architecture supports enabling HDR+AA. ATI is currently the only manufacturer to have done so.
You almost have it right. nVidia NV4x and G7x cards don't support FP16 HDR + AA, but other forms of HDR + AA, like those used in HL2, rthdribl, Evolution GT and other games, do work. Games that use FP16 HDR are Far Cry, Serious Sam 2 and a couple more.

http://hardforum.com/showthread.php?p=1029262682#post1029262682
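For anyone curious how a game actually discovers this at startup, here's a rough D3D9-era capability check. This is just an illustrative sketch, not lifted from any shipping engine: on NV4x/G7x the FP16 render target check passes but the multisample check fails, which is exactly why the in-game menus end up making you pick HDR or AA, not both.

Code:
#include <cstdio>
#include <d3d9.h>

int main()
{
    // Create the D3D9 interface so we can query adapter capabilities.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // 1) Can we render to a 64-bit FP16 surface at all?
    HRESULT fp16rt = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);

    // 2) Can that same FP16 surface also be multisampled (4x here)?
    HRESULT fp16msaa = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE, D3DMULTISAMPLE_4_SAMPLES, NULL);

    printf("FP16 render target: %s\n", SUCCEEDED(fp16rt)   ? "yes" : "no");
    printf("FP16 + 4x MSAA:     %s\n", SUCCEEDED(fp16msaa) ? "yes" : "no");

    d3d->Release();
    return 0;
}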
 
Do we know when nVidia will add this? G80, I assume? Anyone have a release date on that?
 
brighton said:
Do we know when nVidia will add this? G80, I assume? Anyone have a release date on that?

Right now, G80 is looking to release this summer. nVidia will also be releasing their DX10 parts in tandem with the release of Windows Vista, so you may want to hold on for that.

Can someone clarify whether or not G80 will be the DX10-compliant part from nVidia?
 
pxc said:
You almost have it right. nVidia NV4x and G7x cards don't support FP16 HDR + AA, but other forms of HDR + AA, like those used in HL2, rthdribl, Evolution GT and other games, do work. Games that use FP16 HDR are Far Cry, Serious Sam 2 and a couple more.

http://hardforum.com/showthread.php?p=1029262682#post1029262682

And I was just about to ask why HL2 works with HDR + AA while other games don't. ;)

Interesting...
 
Bona Fide said:
Right now, G80 is looking to release this summer. nVidia will also be releasing their DX10 parts in tandem with the release of Windows Vista, so you may want to hold on for that.

Can someone clarify whether or not G80 will be the DX10-compliant part from nVidia?

G80 WILL be D3D10 compliant. Also, D3D10 basically forces HDR+AA to be a part of the spec, so if a game supports HDR on D3D10, it should also allow HDR+AA.
 
Right now there are three games that support HDR+AA only on ATI cards:
Far Cry (two years old +)
Serious Sam2 (seriously stupid)
Oblivion (with unsupported patch)

Right now HDR+AA is in the "tech demo" stage- a couple of companies support it, but the vast majority do not.

The G80 supports HDR+AA, and will be out soon if you believe the rumors. ;)

Anyway, if playing those three games with HDR+AA is a big deal to you, X1800-X1900 are the only game in town now.

I played as much Far Cry as I wanted to two years ago, I finished Silly Sam2 already, and I don't play games like Oblivion, so I can wait a bit for second gen HDR+AA.

I sort of doubt the X1900 has the stones to run upcoming games at HDR+AA at my monitor's 19X12 native res anyway. :(
 
Rollo said:
Right now there are three games that support HDR+AA only on ATI cards:
Far Cry (two years old +)
Serious Sam2 (seriously stupid)
Oblivion (with unsupported patch)

Right now HDR+AA is in the "tech demo" stage- a couple of companies support it, but the vast majority do not.

The G80 supports HDR+AA, and will be out soon if you believe the rumors. ;)

Anyway, if playing those three games with HDR+AA is a big deal to you, X1800-X1900 are the only game in town now.

I played as much Far Cry as I wanted to two years ago, I finished Silly Sam2 already, and I don't play games like Oblivion, so I can wait a bit for second gen HDR+AA.

I sort of doubt the X1900 has the stones to run upcoming games at HDR+AA at my monitor's 19X12 native res anyway. :(

Uh, that's nice. What does this have to do with the topic of this forum?
 
Rollo said:
Right now there are three games that support HDR+AA only on ATI cards:
Far Cry (two years old +)
Serious Sam2 (seriously stupid)
Oblivion (with unsupported patch)

Right now HDR+AA is in the "tech demo" stage- a couple of companies support it, but the vast majority do not.

The G80 supports HDR+AA, and will be out soon if you believe the rumors. ;)

Anyway, if playing those three games with HDR+AA is a big deal to you, X1800-X1900 are the only game in town now.

I played as much Far Cry as I wanted to two years ago, I finished Silly Sam2 already, and I don't play games like Oblivion, so I can wait a bit for second gen HDR+AA.

I sort of doubt the X1900 has the stones to run upcoming games at HDR+AA at my monitor's 19X12 native res anyway. :(

Now if I said that the 6800 Ultra is too slow to play SM3.0 games since it was the first gen to use SM3.0, you would be all up in arms about it.
 
Coolmanluke said:
Uh, that's nice. What does this have to do with the topic of this forum?

Basically he's saying that HDR+AA is not important until G80 is out. In other words: don't buy ATI now, but wait for his beloved Nvidia.
 
Rollo, I want to say this nicely: your constantly pro-NV posts are not doing them any favors, as they come off as fanatical and amateurish. Even if you were on this marketing program, you should still re-weigh what you say so it has some value. If you like NV, then fine, support them, but don't start wars in their name!!

Oh, before you cry fanATIc, I've had a 9600SE, 6600, 6800GT, and 7800GTX (current).

For the OP, the bottom line is that NV hardware cannot do FP16 blending with multisample AA; the ATI X1K series can. Realistically, only the X1900 series has the muscle to pull off HDR+AA with good frame rates.
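To make that concrete, here's roughly the choice a renderer ends up making. This is purely an illustrative sketch (not Valve's or anyone else's actual code): either take the FP16 path and lose MSAA on NV4x/G7x, or fall back to an integer render target with tone mapping in the shader, the HL2: Lost Coast style of HDR, which multisamples fine on any DX9 card.

Code:
// Illustrative only: not anyone's real engine code.
enum HdrPath { HDR_FP16, HDR_INTEGER };

struct Caps {
    bool fp16RenderTarget;  // FP16 surface usable as a render target
    bool fp16Multisample;   // FP16 surface can also be multisampled
};

HdrPath ChooseHdrPath(const Caps& caps, bool userWantsAA)
{
    // Full FP16 ("EXR-style") HDR only if the card can multisample it,
    // or if the user gave up on AA anyway -- the NV4x/G7x situation.
    if (caps.fp16RenderTarget && (caps.fp16Multisample || !userWantsAA))
        return HDR_FP16;

    // Otherwise use an integer render target with shader tone mapping,
    // so MSAA keeps working on hardware that can't blend FP16 and
    // multisample at the same time.
    return HDR_INTEGER;
}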
 
Yeah, seriously. That was a huge pile of excuse-making BS out of left field. This isn't nvnews.
 
Apple740 said:
Basically he's saying that HDR+AA is not important until G80 is out. In other words: don't buy ATI now, but wait for his beloved Nvidia.

That's exactly what I'm saying Apple740- that the current implementation of HDR+AA may well be too weak to be a consideration for people buying cards now.

Doesn't matter to me if buyers wait, not like I get a commission, but I haven't seen anything with current HDR+AA that impresses me.

Look at Oblivion speed at 16X12, 0xAA/8xAF, HDR on an FX57/X1900XTX:
http://www.firingsquad.com/hardware/oblivion_high-end_performance/page5.asp

Woot- a whole 27 fps! Can't wait to add some AA to that, maybe get it down to 20fps! Wonder if we could make 15fps at my monitor's native 19X12?

Do you think upcoming HDR games are going to be easier on your card than Oblivion?


Anyway, as the topic is AA+HDR, and we pretty much all know current nVidia cards have a hardware limitation that prevents them from doing EXR HDR and AA, it would seem to me the only real thing to talk about in this thread is the current state of HDR+AA.
 
pandora's box said:
Now if I said that the 6800 Ultra is too slow to play SM3.0 games since it was the first gen to use SM3.0, you would be all up in arms about it.

Of course, the difference there would be that it's a two-year-old card, and it was plenty fast for the SM3 games that came out within a year of its launch.

Are there some SM3 games the 7900GTX can't run at adequate fps?
 
Rollo said:
That's exactly what I'm saying Apple740- that the current implementation of HDR+AA may well be too weak to be a consideration for people buying cards now.

Doesn't matter to me if buyers wait, not like I get a commission, but I haven't seen anything with current HDR+AA that impresses me.

Look at Oblivion speed at 16X12, 0xAA/8xAF, HDR on an FX57/X1900XTX:
http://www.firingsquad.com/hardware/oblivion_high-end_performance/page5.asp

Woot- a whole 27 fps! Can't wait to add some AA to that, maybe get it down to 20fps! Wonder if we could make 15fps at my monitor's native 19X12?

Do you think upcoming HDR games are going to be easier on your card than Oblivion?


Anyway, as the topic is AA+HDR, and we pretty much all know current nVidia cards have a hardware limitation that prevents them from doing EXR HDR and AA, it would seem to me the only real thing to talk about in this thread is the current state of HDR+AA.

LMFAO, wtf is this. My system runs Obby at the highest, HDR + 6xaa + 16af just FINE at 1680 X 1050. It's mostly at 45++ FPS outside, 100+ indoors. That's bullshit, rofl. I should know cuz I play it on my computer and my friend with a similar rig with a 3800+ runs it so smoothly. Where'd they get 27 FPS? GEEEEE, UMM.... K
 
Mr. Stryker said:
LMFAO, wtf is this. My system runs Obby at the highest, HDR + 6xaa + 16af just FINE at 1680 X 1050. It's mostly at 45++ FPS outside, 100+ indoors. That's bullshit, rofl. I should know cuz I play it on my computer and my friend with a similar rig with a 3800+ runs it so smoothly. Where'd they get 27 FPS? GEEEEE, UMM.... K

Hmmm, is that at 8X6 or 10X7?

Seriously, I think the Foliage demo is like a "worst case" scenario- but my point is the same:

http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4
Hmmm, a whole 32 fps at AnandTech at 12X10 HDR without AA on the Oblivion Gate demo. Looks like two websites agree with me that HDR+AA is out of range in "worst case" scenarios in Oblivion.

Looks like HardOCP agrees with me too: even with the settings turned way down, they got 36fps with HDR at my monitor's native resolution.
http://enthusiast.hardocp.com/article.html?art=MTAzMywxNCwsaGVudGh1c2lhc3Q=

Of course, it's not too surprising given that they selected 12X10 as their maximum playable setting:
http://enthusiast.hardocp.com/article.html?art=MTAzMyw3LCxoZW50aHVzaWFzdA==

Kind of strange how all these big review sites get so much lower fps than you, Mr. Stryker- maybe you should collaborate with them and help them get things running right? ;)
 
Mr. Stryker - Could we get a FRAPS screenshot to prove that you can get 100fps indoors and 60fps outdoors with your settings maxed like they are? I don't believe even an X1900 CF setup could do that.
 
Mr. Stryker said:
LMFAO, wtf is this. My system runs Obby at the highest, HDR + 6xaa + 16af just FINE at 1680 X 1050. It's mostly at 45++ FPS outside, 100+ indoors. That's bullshit, rofl. I should know cuz I play it on my computer and my friend with a similar rig with a 3800+ runs it so smoothly. Where'd they get 27 FPS? GEEEEE, UMM.... K
I run at 1680x1050, 4x AA, 8x HQ AF + HDR with most settings maxed or near max (including grass). My game is VERY playable; I think I get 30-40 pretty much everywhere, and it is rare that I get below 30. I think your fps estimates are a little off...

My rig?

X1900XTX 690/800
3700+ 2.4Ghz
2 GIG ram

I thought I read somewhere that the Chuck patch can't do 6x AA in Oblivion...
 
Bona Fide said:
Mr. Stryker - Could we get a FRAPS screenshot to prove that you can get 100fps indoors and 60fps outdoors with your settings maxed like they are? I don't believe even an X1900 CF setup could do that.

It would definitely be shocking when three major review sites all get substantially lower than his "mostly at 45++ FPS outside", all using different demos.

This website's review even shows it dipping below 30fps fifteen times over a ten-minute FRAPS run at 12X10, with no AA, using a much more powerful CPU than he has!

Sort of hard to argue with that, and IMO any time it's dipping below 30fps that much, it's "unplayable". I do not like the motion stutter. :(
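If anyone wants to sanity-check their own run the same way, here's a trivial counter for dips below 30fps. It assumes a plain text log with one FPS sample per line (you'd have to massage the actual FRAPS CSV into that shape yourself, and "fps.txt" is just a placeholder name), so treat it as a sketch rather than a benchmarking tool.

Code:
#include <fstream>
#include <iostream>

int main()
{
    std::ifstream log("fps.txt");   // placeholder: one FPS sample per line
    double fps = 0.0;
    int samples = 0, dips = 0;
    bool inDip = false;

    while (log >> fps) {
        ++samples;
        if (fps < 30.0) {
            if (!inDip) ++dips;     // count each excursion below 30 once
            inDip = true;
        } else {
            inDip = false;
        }
    }

    std::cout << samples << " samples, " << dips
              << " separate dips below 30 fps\n";
    return 0;
}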
 
Suflex said:
Rollo, I want to say this nicely: your constantly pro-NV posts are not doing them any favors, as they come off as fanatical and amateurish. Even if you were on this marketing program, you should still re-weigh what you say so it has some value. If you like NV, then fine, support them, but don't start wars in their name!!

Oh, before you cry fanATIc, I've had a 9600SE, 6600, 6800GT, and 7800GTX (current).

For the OP, the bottom line is that NV hardware cannot do FP16 blending with multisample AA; the ATI X1K series can. Realistically, only the X1900 series has the muscle to pull off HDR+AA with good frame rates.

That's about as nice as anyone could put it, I guess. I really wish [H]ard would just give him the boot already. According to Kyle, he hasn't broken any rules, so we'll have to put up with more pro-Nvidia propaganda, I guess. His posts aren't the slightest bit objective.
 
Rollo said:
Are there some SM3 games the 7900GTX can't run at adequate fps?

According to you, it can't run Oblivion. If you think the X1900 is unplayable at 1600x1200 with HDR and AA at 20fps, the 7900GTX gets those frame rates without AA, looking at your FiringSquad link.
 
Coolmanluke said:
That's about as nice as anyone could put it, I guess. I really wish [H]ard would just give him the boot already. According to Kyle, he hasn't broken any rules, so we'll have to put up with more pro-Nvidia propaganda, I guess. His posts aren't the slightest bit objective.

I wasn't aware "objectivity" was in the Terms of Service? In America, we're allowed to have preferences and opinions.

Back on topic: I'd further state that history shows us that single-vendor features that must be specifically coded for have a slow rate of adoption. Anyone remember the 3Dc texture compression that ATI said would revolutionize gaming? Or TruForm?

When there is an installed user base of more than one brand of card that supports this, more devs will use it. By then, these HDR+AA cards will seem slow.

I bet Brent would back me on this one, if he cares to interject?
 
Woot- a whole 27 fps! Can't wait to add some AA to that, maybe get it down to 20fps!

If you think HDR+AA on an XTX is too slow, well, throw your 7900GTX out of the window then, because it's even slower without AA!

[attached Oblivion fps chart]


http://www.pcinpact.com/articles/a/186/3.htm
 
Spank said:
According to you, it can't run Oblivion. If you think the X1900 is unplayable at 1600x1200 with HDR and AA at 20fps, the 7900GTX gets those frame rates without AA, looking at your FiringSquad link.


I would agree that based on that graph, no single card is really capable of even running Oblivion at 16X10 at what I'd consider acceptable performance.

So my point stands: EXR HDR+AA is a tech demo. It can be run acceptably on two games:
two-year-old Far Cry, which we all played long ago, and Serious Sam 2. If acceptable HDR+AA performance on those two games is important to a buyer, they should go for the R580.

I played those two already, so I'll wait for something better as my 7800GTX SLI and 512 7800GTX handle my games at the 19X12 and 19X14 resolutions I need to run at. :)


Let's put it this way:
If you're a buyer now, and HDR+AA is an important buying consideration, do you REALLY want a card that can only do it on a two year old game and a game with seriously cartoonish graphics?

Or might it be worth your time to wait for next gen HDR+AA, which will almost certainly be more powerful?
 
Apple740 said:
If you think HDR+AA on an XTX is too slow, well, throw your 7900GTX out of the window then, because it's even slower without AA!


http://www.pcinpact.com/articles/a/186/3.htm

Thanks for helping make my point Apple740. :)

BTW, how many people with high-end video cards don't have either:
1. an LCD monitor with at least 16X10 native resolution, or
2. a CRT monitor capable of 16X10 and up?

My monitors are 19X12 native res and 19X14 @ 76Hz. I seriously doubt anyone spending $400-$600 on graphics cards has a 12X10 monitor. Wouldn't be much point in the graphics card, would there?
 
Apple740 said:
If the above fps numbers don't satisfy you, you can always consider Crossfire.

I would agree SLI and Crossfire offer much better performance in Oblivion, but I won't get into current Crossfire problems.
 
Rollo, your posts don't come off as those of an enthusiast, but rather a pawn. Maybe being an accommodated !!!!!! would be OK, but why they let an Nv shill stay who spent an undisclosed amount of time on this forum "in the closet" is beyond me. You've at some point, who knows how many times, given the wrong slant because you get gear from Nv, unknown to the reader. That is lame, sad, and misleading.

I use a light 2x in Oblivion, and yeah, I get some low frames outside. Higher "unplayable" settings than what you are getting, though, I'm sure. You play this game at the fps you are so willing to disregard. That makes his point extremely relevant, my friend, and yours what you'd expect. It's paid for, after all. Your initial excuse-making comments serve no other purpose than to derail the thread into ATi vs. Nv. To which it's tempting to go along, and ask if any of this stuff also changes the fact that the distance AF on Nv cards looks, comparatively, like Shit with a capital S. My demands are uncompromising. Bullshit walks. Sooner or later you run out of excuses, and only !!!!!!ism remains.
 
Rollo said:
I would agree SLI and Crossfire offer much better performance in Oblivion, but I won't get into current Crossfire problems.

Spending $1000 on a GTX SLI rig and still having no HDR+AA is a "problem" that will never be solved. ;)
 
Apple740 said:
Spending $1000 on a GTX SLI rig and still having no HDR+AA is a "problem" that will never be solved. ;)

OK, I guess we've switched gears to Crossfire as an offshoot of HDR+AA (as it's necessary in Oblivion), so here goes:

1. Lack of profiles and render mode flexibility, as referenced here:
http://www.beyond3d.com/forum/showthread.php?t=28310&highlight=Crossfire+profiles
2. Millions of people have SLI motherboards already; I doubt hundreds have a Crossfire motherboard
3. Higher power requirements and heat
4. Necessity of replacing annoying stock HSFs and voiding your warranty unless you're lucky enough to have HIS ICEQ cards
5. Strange Dongle/Master/Slave setup
6. D3D defaults to a non-geometry-scaling tiling mode
etc.

Crossfire does sometimes have a bit nicer AF, much higher-performing Super AA, and yes, EXR HDR+AA in these three games.

Cons CF >>>>>>>>>Pros CF

Which is probably why very, very, very few people are spending money on it.
 
All I know is that if I had actually realized before how much better the graphics would have been in Oblivion with HDR+AA, let's just say I wouldn't have a 7900GT CO SC right now.
 
Well, I'm not trying to derail this thread, but Rollo, please keep your bias to yourself. I personally don't care if someone wants an nVidia or an ATi; in my opinion, you should go for the least expensive, most readily available card you can. This time around it was the X1800XT for me, and it had 512MB, so I bought it. Anyway, since you are on the topic of Crossfire...

http://www.xtremesystems.org/forums/showthread.php?t=66685

#1 score - CF

http://www.xtremesystems.org/forums/showthread.php?t=88168

#1 score - CF

http://www.xtremesystems.org/forums/showthread.php?t=92232

#1 score - CF

Of course, one could count 3DMark 03, where SLI wins out, or 3DMark 01 where CF wins, but it really doesn't matter.

Once ATi gets most of the bugs out of Crossfire, it will be a beastly performer. Until then, yes, most people will go SLI. I personally don't see any point in purchasing a few extremely expensive cards.

This was not a flame, not trying to spark an argument, just saying that Crossfire is NOT as bad as you made it out to be. As I said, I don't care if you are biased, but there are people here that want a fair opinion that takes into account both sides, and you don't give them that.
 
leSLIe said:
Oh!! Shame on nVidia :mad: I'm gonna throw all my nVidia hardware out right now!

Give it to me so I can sell them then =P

Yeah, I noticed that. When I tried to use AA with HDR in Oblivion, it told me to disable HDR first to get 4x AA -.-
 
I can vouch for the default supertiling mode not providing any performance benefit. Age of Empires III and Ghost Recon Advanced Warfighter (which is brand new, so no profile in the drivers yet) seem to be running in tiling mode on an X1900 XTX CrossFire setup. However, I'm not experiencing any performance improvements with CrossFire. I'm running the games at 1920x1200 with all in-game settings maxed out, so it is GPU limited for sure. I've informed ATI. Maybe Cat 6.5 will show improvements. Note I haven't tested SLI yet in these games.
 