Is anyone SM3.0 / DX9.0c compliant?

This just goes to show how ridiculous fanb0yism actually is. Let's all jump on a bandwagon and flame the other side for something that in reality doesn't even matter. :rolleyes:

What it boils down to, every time, is IQ + fps + price. ATM, the X1K cards have the IQ lead and maybe a slight fps lead. The 7800 cards have the price/availability advantage. You decide which is better for your purposes. Now let's just go play some good-looking games and stop bitchin'.

The ironic thing is, some people are asking for MORE restrictions from MS. I'm not saying stricter standards for WGM and all are a bad thing; it's just not every day that we say "I really hope The Man is more strict with their standards from now on."
 
I tried running the DCT tests here a while back; it crashed both times, heh. It literally takes a few days to run JUST the Shader Model tests; it is very intense.

I really don't see it as a big deal though, NV's and ATI's drivers are both WHQL certified in the end.

I think I'll try running the tests again, though it is going to take a while.
 
Brent_Justice said:
I tried running the DCT tests here a while back; it crashed both times, heh. It literally takes a few days to run JUST the Shader Model tests; it is very intense.

I really don't see it as a big deal though, NV's and ATI's drivers are both WHQL certified in the end.

I think I'll try running the tests again, though it is going to take a while.

Took the inq guy 150 hours :D
 
Yeah, that's a long time.

I can't have my review system tied up for that long.

But I can try it on another system; maybe it will be stable enough. For me the software program itself crashed the last two times I tried it, heh.
 
Brent_Justice said:
Yeah, that's a long time.

I can't have my review system tied up for that long.

But I can try it on another system; maybe it will be stable enough. For me the software program itself crashed the last two times I tried it, heh.

I started it on my main rig last night; after an hour or so I stopped to play Quake 4, there are too many good games out to waste hours on that. I put my other 6800 GT in my secondary rig and left it on overnight, but it crashed. I'm not sure if it was because of that app or something else.
 
Psylenced said:
The real-world tests tell all. SM3 done right or not, without the software to make the comparison it doesn't matter.

Furthermore, even if the X1K line of cards does perform on par with the 7800s, where ATI dropped the ball was the CrossFire implementation. I mean, a cable bridge? I like my refresh rates, thanks, and if you do too you'd go SLI so you weren't limited by a bridge cable. Furthermore, CrossFire's ability to run different cards simultaneously is moot, because the faster card automatically scales down to the speed of the slower card. So if you thought you'd get off buying a Master Card X1K while retaining your 800 series card, you've cheated yourself out of money and performance on the X1K.

No I am not a !!!!!!, I go with the hardware that performs best at the time I buy, be it green or red. Nvidia holds the crown, and until the XT launches they continue to do so.

yikes

1. The X1Ks aren't limited to the refresh rate the X800s are, and it's not because of the "cable bridge".

2. You can't run CF with an X800 and an X1800; they already stated this. You can run an X800 Pro with an X800 XT (and the performance would scale to 2x X800 Pro), and no, you didn't cheat yourself out of price/performance. If you were like me and bought an X800 Pro when they were released, and CrossFire had happened to be out at the time (which would have been nice), you could buy an XT right now, or whatever is available or cheaper, and just get more performance if that's the road you wanted to go. Since that isn't the case, we'll see how it turns out for the X1000 series.

3. The 7800s are faster in just about everything compared to the X1000s right now, but as far as we know they have no more headroom in them beyond overclocking. The X1000s have incredibly better IQ in comparison (HQ AF with no angle-dependent optimizations, and full-time FP32) and the possibility of huge speed increases from game-specific optimizations with their new memory controller.


The SM3.0 arguments are really crap. Right now NONE show a performance loss, and none show degradation in IQ. Basing your purchase on who has the better implementation is just asking for a headache, and it really just shows how blown out of the water the argument was over whether the X800 would be worth it, seeing as the 6800s had SM3.0.

Until it shows directly that SM3.0 is going to be nice for us, any arguments about its possible "speed increases" or IQ are really just dust in the wind.
 
I don't know if I'm known for it, but I have been criticizing ATI for not being fully DX9 compliant.

Now I find I should have been criticizing ATI and nVidia for not being fully DX9 compliant.

This wouldn't be a problem if Microshaft had stuck with OpenGL like they intended in the beginning. But nooooo, Billy won't be happy until the word "Microsoft" is synonymous with the phrase "all of worldwide commercial industry."

"ATwIts, nVidiots... I'm the guy with the gun." -Slightly modified quote of Bruce Campbell from Army of Darkness.

To be brutally honest, we should all be on the same side in any issue: consumers on one side and business on the other. If Matrox came out with an SM3-compliant card right now that was 1 fps slower than the slower of the ATI and nVidia flagships, they would make a killing just because of the marketing.
 
John Reynolds said:
Anyone notice the lack of full 3.0 compliance in their games? Developers will surely work around these minor failings for both companies' parts, so like the VTF situation this is a real non-issue IMO.

Not being compliant with a spec just makes it a pain for developers to have to work around the "minor failings". That increases the cost of making a game and the time to market.

Besides, no company should advertise compliance with a spec its products are not compliant with.
 
I have a problem with anyone saying they are compliant with a standard when their hardware isn't. It doesn't matter if everything seems to look fine and the performance is there.

Standards are out there for a reason. Do you want to go back to the times when every game had to have drivers for every video card to get it to work? This is why standards came about for graphics. Allowing the graphics card companies to use something other than the standards, and letting them say they are compliant, is a bad road to take. How long will it be before they just start ignoring what they don't want to mess with in the standards, and the companies go in opposite directions?

This is why I don't care for a company saying they are compliant when they really aren't. Standards let you know what you are getting and let the game devs program for the standards which gets games out the door faster and cheaper.
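To make that concrete, here's a rough sketch (mine, not from anyone in this thread, plain Direct3D 9 only) of the two roads: sniffing the adapter's vendor ID and special-casing each card, versus keying off the capabilities the standard defines. The 0x10DE / 0x1002 values are just the well-known NVIDIA/ATI PCI vendor IDs; treat the rest as illustrative.

[CODE]
// Hypothetical sketch: vendor sniffing (the "per-card support" road) vs. the
// caps-based path a standard is supposed to give you. Plain Direct3D 9 API.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // The bad old way: branch on who made the card (0x10DE = NVIDIA, 0x1002 = ATI).
    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    printf("Adapter: %s (vendor 0x%04lX)\n", id.Description, id.VendorId);

    // The standards way: branch on what the device reports it can do.
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        printf("Render path: ps_3_0\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("Render path: ps_2_0 fallback\n");
    else
        printf("Render path: fixed-function fallback\n");

    d3d->Release();
    return 0;
}
[/CODE]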

 
chrisf6969 said:
Just a funny turn of events:

First ATI tries to ignore SM3.0 and say how unimportant it is... yadda yadda yadda...

Then they tout SM3.0 done right.

Then Nvidia pisses in ATI's cereal with SM3.0 NOT done right, b/c it's missing Vertex Texturing.

Now ATI is pissing right back saying Nvidia didn't implement SM3.0 right either.

Ok, so now it looks like NEITHER of them did it right? SHIT, now what do I buy..... Matrox's Paraphalegic? :)

If you feel you need a warning for no flame war, you're probably inviting one. Keep it clean and we'll keep the discussion. - p[H]

Well, here is my question: who wrote the industry standard on SM3.0? And if someone does it their own way, how can it be done wrong? Basically I look at it like this: nVidia and ATI are rivals, so I look at whose SM3.0 is better. I don't compare ATI's or nVidia's to the textbook definition.

Personally I think nVidia does it better. Yes, both might cut corners, but really, who cares as long as the end result is a nice-looking image with good performance? If you have a Ferrari with a Briggs & Stratton engine in it that keeps up with a Ferrari engine and puts out the same horsepower and speed, what do you really care?

The only game that I've personally played where SM3.0 actually makes a difference in IQ is Splinter Cell: Chaos Theory. Other than that, I have yet to run across a game that has SM3.0 in it where it made a difference in IQ.
 
[BB] Rick James said:
Well, here is my question: who wrote the industry standard on SM3.0? And if someone does it their own way, how can it be done wrong? Basically I look at it like this: nVidia and ATI are rivals, so I look at whose SM3.0 is better. I don't compare ATI's or nVidia's to the textbook definition.

Personally I think nVidia does it better. Yes, both might cut corners, but really, who cares as long as the end result is a nice-looking image with good performance? If you have a Ferrari with a Briggs & Stratton engine in it that keeps up with a Ferrari engine and puts out the same horsepower and speed, what do you really care?

The only game that I've personally played where SM3.0 actually makes a difference in IQ is Splinter Cell: Chaos Theory. Other than that, I have yet to run across a game that has SM3.0 in it where it made a difference in IQ.


Because when I buy a Ferrari I expect to see that sexy-as-hell, world-renowned engine in the back, not some Briggs & Stratton crap that puts out the same hp. I bought a Ferrari; I sure as hell expect one with the full package that comes with that name. The same could be said for this situation: I pay for SM3 and I expect it to be there, in full. I think I'm going to go buy an old Voodoo or something :rolleyes:
 
Personally I think nVidia does it better

How? ATi is missing one thing (VTF), which can be done via specific coding and may be faster that way. Nvidia is supposedly missing a lot more, going by all the talk of the DCT tests. Also, ATi has better image quality with its AF, and better quality HDR.
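For what it's worth, here's roughly how a game finds out whether VTF is actually usable on a given card, rather than trusting the SM3.0 sticker. This is a sketch of mine using stock D3D9 calls; the two formats probed are just the ones commonly tried for vertex textures, and a part that exposes no vertex-texture formats will fail both checks.

[CODE]
// Sketch: probe whether the vertex shader can actually sample textures (VTF),
// and in which formats. Stock Direct3D 9; the candidate formats are examples.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    const struct { D3DFORMAT fmt; const char* name; } candidates[] = {
        { D3DFMT_R32F,          "R32F" },
        { D3DFMT_A32B32G32R32F, "A32B32G32R32F" },
    };

    for (const auto& c : candidates)
    {
        // D3DUSAGE_QUERY_VERTEXTEXTURE asks: can this format be bound as a
        // texture that the *vertex* shader samples?
        HRESULT hr = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8,              // assumed display/adapter format
            D3DUSAGE_QUERY_VERTEXTEXTURE,
            D3DRTYPE_TEXTURE, c.fmt);
        printf("Vertex texture %-14s : %s\n", c.name,
               SUCCEEDED(hr) ? "supported" : "not supported");
    }

    d3d->Release();
    return 0;
}
[/CODE]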
 
Remember when the X800 and the 6800 came out? The 6800 could support Shader Model 3.0 where the X800 could only support Shader Model 2.0. When this point came up, everybody was overlooking it because the only thing that mattered was how games looked at the time.

Jump ahead several months...
New games started coming out that could handle Shader Model 3.0 (Far Cry, Splinter Cell: Chaos Theory, etc.). Suddenly having SM3.0 was a big deal, and ATI was frowned upon for not having this all-important feature on their cards.

Jump ahead to now...
The roles are reversed: nVidia has taken out some of the higher-end SM3.0 functions that are present in ATI's cards. Sure, everybody is overlooking it because F.E.A.R. and Quake 4 look amazing even without those items. But when games start coming out that have features disabled if you're using nVidia cards, or that ATI cards get better image quality out of... you'll change your tune pretty quickly and you'll start damning nVidia for cutting corners.
 
tornadotsunamilife said:
Took the inq guy 150 hours :D

I'd rather /kill

than waste 150 hours of PC time to figure that out! ;)

A more worthwhile use of 150 hours would be to test 150 different games @ 1 hour each for compatibility!! Though those 150 hours would probably require manning the PC, not just letting tests run. :(
 
Normally the MS DCTs are trustworthy, but the DCT from the May update, which is what everyone is pointing to, is different from the SM3 DCT from Dec '04, which was the previous one... My question is: was Nvidia compliant with SM3 when they were the only one with it, and did they become non-compliant when ATi got in bed with MS and MS told ATi they needed the shader lengths? Or did Nvidia simply not cut down the shaders from day one, like the non-functioning PureVideo?

One interesting thing out of this is that ATi figured out how to pass shaders through the vertex units faster, as being able to call back to just-used hardware will allow some very interesting shader commands...
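On the "shader lengths" point: those limits are exposed as ordinary D3D9 caps, so anyone can read them off their own card. A quick sketch of mine; the 512 figure is the commonly cited SM3.0 minimum instruction-slot count, so treat it as an assumption rather than a quote from the spec.

[CODE]
// Sketch: read the SM3.0 instruction-slot caps ("shader lengths") from the HAL device.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        printf("VS 3.0 instruction slots: %lu\n", caps.MaxVertexShader30InstructionSlots);
        printf("PS 3.0 instruction slots: %lu\n", caps.MaxPixelShader30InstructionSlots);
        // 512 is the commonly cited SM3.0 minimum for both limits.
        printf("Meets the 512-slot minimum: %s\n",
               (caps.MaxVertexShader30InstructionSlots >= 512 &&
                caps.MaxPixelShader30InstructionSlots  >= 512) ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}
[/CODE]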
 
Does it really matter if either company is fully DCT compliant? What does that program have to do with software-to-hardware compliance? It seems like all games are running on both cards just fine.

It's just MS's way of saying no one is up to their test standards, which is BS; in some of the failed DCT tests the nV rasterizer is superior to the DCT reference.
 
Can anyone confirm that the Nvidia cards cannot run these features due to hardware rather than due to drivers?

The article itself shows some really old sets of drivers in the slide, which is totally stupid. Until I read from a reasonably good source that this is a hardware limitation, I think the whole argument is moot.

It's what the hardware is capable of that is important. I have no doubt half these SM3.0 tests are for features we don't yet use, and so don't yet need to be supported in the driver sets.
 
I heard basically that the reference rasterizer is inferior to nvidia's own, and this is why some tests are being failed. I'm just trying to give you the basics of it; search Google for better knowledge ;)
 
The current debate over WHQL waivers and SM3 compliance forces us to take a public stance regarding these issues. Microsoft is the ultimate judge of pass/fail for a particular hardware/software combination. NVIDIA hardware passes WHQL as judged by Microsoft, and fully supports all the features of SM3. Other hardware also passes WHQL for SM3, while not necessarily fully supporting all of the SM3 features.

When a vendor fails to implement a required feature (such as Vertex Texture), typically Microsoft would either fail to certify the hardware/driver, or issue a waiver for the lack of the feature. We would not expect tests to be changed or the interpretation of the DirectX specification to be changed.

A failure in a WHQL DCT test is by itself not an indication of hardware or driver flaws. There can be several causes for this, including test errors, and hardware mismatches with refrast (the Microsoft software reference rasterizer). A mismatch with refrast can be due to the hardware producing superior, inferior, or simply different images than refrast.

WHQL failures and errata by themselves are a poor and inaccurate indication of Windows and DirectX compliance. Microsoft is the sole arbiter of WHQL passage, and further questions on this topic should be directed to Dean Lester.
NVIDIA's response. Although I read this on the Inq of all places.
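For anyone unclear on what "refrast" means in that statement: it's Microsoft's software reference rasterizer, selectable as a D3D9 device type, and DCT-style comparisons basically render the same test on the hardware (HAL) device and on the REF device and diff the output. Below is a rough sketch of mine of creating both; note the REF device ships with the DirectX SDK rather than the end-user runtime, so this is illustrative only.

[CODE]
// Sketch: create a HAL (hardware) device and a REF ("refrast") device so their
// output could be compared. The REF device requires the DirectX SDK to be installed.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

static IDirect3DDevice9* CreateDeviceOfType(IDirect3D9* d3d, D3DDEVTYPE type, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;   // windowed: use the current display format
    pp.hDeviceWindow    = hwnd;

    IDirect3DDevice9* dev = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, type, hwnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);
    return dev;   // nullptr if this device type isn't available
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    HWND hwnd = GetConsoleWindow();   // any valid window works for a windowed device

    IDirect3DDevice9* hal = CreateDeviceOfType(d3d, D3DDEVTYPE_HAL, hwnd);
    IDirect3DDevice9* ref = CreateDeviceOfType(d3d, D3DDEVTYPE_REF, hwnd);
    printf("HAL device: %s, REF (refrast) device: %s\n",
           hal ? "created" : "unavailable", ref ? "created" : "unavailable");

    // A DCT-style harness would now draw identical scenes on both devices,
    // read back the render targets, and compare the images pixel by pixel.

    if (hal) hal->Release();
    if (ref) ref->Release();
    d3d->Release();
    return 0;
}
[/CODE]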
 
Rather curious watching certain people not making a big stink out of this. For consistency's sake you'd think they'd be starting numerous "NVIDIA not 3.0 compliant" threads.

Again, mountain out of a molehill for those who do think this is a big deal. And the timing of this hitting TheInq when it did? Very curious to me.
 
Meh their response...
It would void any insult to either company, but in the end it is still odd that Nvidia fails so many tests....
 
John Reynolds said:
Rather curious watching certain people not making a big stink out of this. For consistency's sake you'd think they'd be starting numerous "NVIDIA not 3.0 compliant" threads.

Again, mountain out of a molehill for those who do think this is a big deal. And the timing of this hitting TheInq when it did? Very curious to me.

As I posted earlier in this thread NV's drivers are WHQL certified from MS - http://www.nvidia.com/object/winxp_2k_81.85.html so yeah, I don't see it as a big deal.

Version: 81.85
Release Date: October 20, 2005
WHQL Certified

For the record I am running my own DCT tests, but as you know it can take quite a while, if it doesn't crash on ya.
 
I was under the impression that ATI drivers were also fully certified by MS.

Anyways looking forward to your results Brent. As always.
 
{NG}Fidel said:
Meh their response...
It would void any insult to either company, but in the end it is still odd that Nvidia fails so many tests....

When "failing" can mean any of these things:
"A failure in a WHQL DCT test is by itself not an indication of hardware or driver flaws. There can be several causes for this, including test errors, and hardware mismatches with refrast (the Microsoft software reference rasterizer). A mismatch with refrast can be due to the hardware producing superior, inferior, or simply different images than refrast."

When it's considered a "failure" to produce superior images, you need to know the exact nature of each "failure" before making such a comment...

Terra...
 
Terra, you sang a different tune during the Vertex Texture Fetch time.
I agree that if they can make the image look the same it's fine, but I just find it amazing that ATi's Vertex Texture Fetch caused so much ruckus and this doesn't get anyone slightly peeved.
 
Brent_Justice said:
As I posted earlier in this thread NV's drivers are WHQL certified from MS - http://www.nvidia.com/object/winxp_2k_81.85.html so yeah, I don't see it as a big deal.



For the record I am running my own DCT tests, but as you know it can take quite a while, if it doesn't crash on ya.
What does WHQL certification have to do with anything related to this thread? Just because it's WHQL certified doesn't mean it's up to spec. ATi is WHQL certified for their DX9 cards and they don't have SM3.0 (talking about the somewhat older ATi cards). GeForce FX cards were WHQL certified, but look how horribly they turned out.

Terra, you make me laugh so hard. You were knocking on ATi for one thing and now you're sticking up for nvidia when there's a fairly big problem with their cards. If everyone is failing the same tests then it's probably not the software you test with.
 
pArTy said:
What does WHQL certification have to do with anything related to this thread? Just because it's WHQL certified doesn't mean it's up to spec. ATi is WHQL certified for their DX9 cards and they don't have SM3.0 (talking about the somewhat older ATi cards). GeForce FX cards were WHQL certified, but look how horribly they turned out.

Terra, you make me laugh so hard. You were knocking on ATi for one thing and now you're sticking up for nvidia when there's a fairly big problem with their cards. If everyone is failing the same tests then it's probably not the software you test with.

Neither you nor I know if the failures are due to inferior or better image quality.
I would wait for Brent to finish his tests before drawing any conclusions ;)

Terra...
 
yes.

I saw this on AT ("Nvidia responds to SM3 claims" in the video forum):
The current debate over WHQL waivers and SM3 compliance forces us to take a public stance regarding these issues. Microsoft is the ultimate judge of pass/fail for a particular hardware/software combination. NVIDIA hardware passes WHQL as judged by Microsoft, and fully supports all the features of SM3. Other hardware also passes WHQL for SM3, while not necessarily fully supporting all of the SM3 features.

When a vendor fails to implement a required feature (such as Vertex Texture), typically Microsoft would either fail to certify the hardware/driver, or issue a waiver for the lack of the feature. We would not expect tests to be changed or the interpretation of the DirectX specification to be changed.

A failure in a WHQL DCT test is by itself not an indication of hardware or driver flaws. There can be several causes for this, including test errors, and hardware mismatches with refrast (the Microsoft software reference rasterizer). A mismatch with refrast can be due to the hardware producing superior, inferior, or simply different images than refrast.

WHQL failures and errata by themselves are a poor and inaccurate indication of Windows and DirectX compliance. Microsoft is the sole arbiter of WHQL passage, and further questions on this topic should be directed to Dean Lester.
http://www.the-inquirer.com/?article=27141

I have an opinion, but I've flamed ATI enough this week. ;) I'm looking forward to getting the $121 AR RX800-TD128E (12-pipeline) PCI-E card I ordered this morning. http://www.hardforum.com/showthread.php?t=968597
 
My opinion exactly. As long as Microsoft says they are compliant, then they are. If they pass it with Microsoft, that's all that matters. They both have different implementations, but both are great in the end.
 
I love DCT. With AA on, I fail a lot of the Alpha Blend tests. With AA off, the image is shitty, but AT LEAST I PASS THE WONDERFUL AND INFALLIBLE DCT!
 
robberbaron said:
I love DCT. With AA on, I fail a lot of the Alpha Blend tests. With AA off, the image is shitty, but AT LEAST I PASS THE WONDERFUL AND INFALLIBLE DCT!

"And you shall know the truth. And the truth shall set you free."

So better IQ can make you fail the DCT tests... so was ATI insulting nVidia or complimenting nVidia with their accusations? At this point, I can't tell.

Don't get me wrong, I hate it when there is a lack of compliance with intra-industry standards (which is why Microshaft is so detestable to me), but making false accusations against your competitors just because your product has been shown to be flawed and your marketing has been shown to be lies, that's Intel-like business practice. You just can't get much lower than that in commercial industry.
 
WHQL certifies DX 9.0 compliance, not the various SMs... so saying Microsoft WHQL certified it is a bit off topic, isn't it? Microsoft never said they certified Nvidia's SM3.0, just Nvidia's DX 9.0... SM3.0 is obviously optional, since the R300 is DX9 and doesn't support SM3.0...
 
LyCoS said:
WHQL certifies DX 9.0 compliance, not the various SMs... so saying Microsoft WHQL certified it is a bit off topic, isn't it? Microsoft never said they certified Nvidia's SM3.0, just Nvidia's DX 9.0... SM3.0 is obviously optional, since the R300 is DX9 and doesn't support SM3.0...

It certifies DX9.0c, the latest DX SDK, which includes all shader models up to 3.0.
 
One of the flaws for me in the whole non-compliance argument is that the developers are using the cards that are non-compliant to do the development, not some emulated, perfectly compliant card.

So the games are built to be compliant with the non-compliance.

So we lose nothing.

Problem solved.
 
MartinX said:
One of the flaws for me in the whole non-compliance argument is that the developers are using the cards that are non-compliant to do the development, not some emulated, perfectly compliant card.

So the games are built to be compliant with the non-compliance.

So we lose nothing.

Problem solved.

While this is true for compatibility and requirements testing (basic quality assurance), it is not necessarily true for game development. When most games begin development, it is long before their target hardware has taped out, so they use either emulation hardware or high-end workstation (Quadro or FireGL) video hardware.
 
razor1 said:
It certifies DX9.0c, the latest DX SDK, which includes all shader models up to 3.0.

Key words being "up to 3.0", which could mean that it fully supports 2.0, but not quite 100% support of 3.0. As said before, ATI's X300 is DX9.0c compliant, but doesn't support SM 3.0.

If either company supports, say, 97% of the SM3.0 requirements, does that mean they are SM 2.97 compliant? And they just round up to the nearest whole number? :D
 
razor1 said:
It certifies DX9.0c, the latest DX SDK, which includes all shader models up to 3.0.

Let's close the loop with my previous statement -> "which includes all shader models up to 3.0". So SM3.0 support is not necessary for WHQL DX9.0c compliance, which means that NVIDIA being WHQL DX9.0c certified is completely off topic, since we are talking about one particular SM...
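Which is also the practical answer: applications don't rely on the certification at all, they ask the device. A small sketch of mine reading the shader model straight from the caps (stock D3D9); a DX9.0c-certified but SM2.0-only part like the X300 simply reports 2.0 here.

[CODE]
// Sketch: "DX9.0c certified" doesn't say which shader model a card exposes;
// the device caps do. Stock Direct3D 9.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // The version DWORDs pack major/minor in the low bytes.
        printf("Vertex shader version: %lu.%lu\n",
               (caps.VertexShaderVersion >> 8) & 0xFF, caps.VertexShaderVersion & 0xFF);
        printf("Pixel shader version:  %lu.%lu\n",
               (caps.PixelShaderVersion >> 8) & 0xFF, caps.PixelShaderVersion & 0xFF);

        bool sm3 = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) &&
                   caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
        printf("Exposes SM3.0: %s\n", sm3 ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}
[/CODE]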
 