PS 3.0 Important now?

Hello. I really want to buy the X800 XT PE, but then I started looking at different articles at theinquirer.net and got to thinking: is Pixel Shader 3.0 that important now? Because if it is, then maybe I should buy the 6800 Ultra instead.

I am spending 500 dollars on a card, and I don't want to buy this thing and find out 6 months from now that the majority of games from then on are going to have PS 3.0 support.

I have seen some articles here that say Far Cry's PS 3.0 settings don't make a difference except in optimization, but what about Doom 3 or other games in development that I haven't seen?

I also read in PC Gamer that DirectX 9.0c could possibly make the 6800 Ultra better than the X800 XT PE due to better support for the new DirectX.

Now, I saw on another forum that future games which fully support PS 3.0 probably won't run too great on current graphics cards. So I'm thinking this is an indicator to just go for an awesome, proven card of today (X800 XT PE), and then on my next upgrade look for the best PS 3.0 card. But I do want this card to last 3-5 years.

With all this in mind, should I still go for the x800 xt pe or buy the 6800 ultra?
 
The majority of games that will have PS 3.0 support will have it solely for optimization purposes.

If you look at Far Cry with the Shader Model 3 patch, the X800 XT PE still outperforms the 6800 Ultra, so it's not that important. Even though there are real performance gains, they still don't make the 6800 Ultra outperform the X800 XT PE. Not even with brilinear enabled.

So I wouldn't be worried about getting an X800 XT PE, really. Nor a 6800 Ultra, for that matter.

I think the X800 XT PE will outperform the 6800 Ultra for the foreseeable future, in Shader Model 3 enabled games or not.

Especially if you bring 3Dc into the calculation. That will help image quality or performance greatly. To counter that, the 6800 Ultra has a higher level of HDR and perhaps a more efficient way of doing displacement mapping (that is yet to be seen, though ;))

Anyway, my thought is that the X800 XT PE will continue to be the better performing card at high res with lots of AA and aniso. However, at some point it's very possible that the Shader Model 3 support nVidia has will allow it to start performing better. I would think we are quite far from that, but I can't really guess when it is eventually going to happen; it's not even certain it will, considering ATI has support for longer shaders too.
 
well said... I wish more people would do the same instead of ignorantly flaming the other card...
 
oqvist said:
The majority of games that will have PS 3.0 support will have it solely for optimization purposes.

If you look at Far Cry with the Shader Model 3 patch, the X800 XT PE still outperforms the 6800 Ultra, so it's not that important. Even though there are real performance gains, they still don't make the 6800 Ultra outperform the X800 XT PE. Not even with brilinear enabled.

Uh, the 6800 Ultra EE beats the XT PE at almost all resolutions.


So I wouldn't be worried about getting an X800 XT PE, really. Nor a 6800 Ultra, for that matter.

I think the X800 XT PE will outperform the 6800 Ultra for the foreseeable future, in Shader Model 3 enabled games or not.


I think you're smoking something, since the 6800 Ultra EE literally beats the PE in almost every benchmark possible.


oqvist said:
Especially if you bring 3Dc into the calculation. That will help image quality or performance greatly. To counter that, the 6800 Ultra has a higher level of HDR and perhaps a more efficient way of doing displacement mapping (that is yet to be seen, though ;))

Have you seen the difference 3Dc makes with normal maps? It's practically non-existent. Do you even know what HDR and displacement mapping do? Why are you lumping those two in with 3Dc?


Anyway, my thought is that the X800 XT PE will continue to be the better performing card at high res with lots of AA and aniso. However, at some point it's very possible that the Shader Model 3 support nVidia has will allow it to start performing better. I would think we are quite far from that, but I can't really guess when it is eventually going to happen; it's not even certain it will, considering ATI has support for longer shaders too.


Provided the 6800 series beats the X800 series on pre-beta drivers, doesn't cheat at filtering, and supports SM3, I'd give Nvidia this one.

Also, you know SM3 has an unlimited shader instruction count, whereas ATi's ps_2_b is stuck at 1,536, I think.

Please research what you say before you throw it on here.
 
Eeeeh, I am talking about the 6800 Ultra, not the $150 more expensive 6800 Ultra Extreme that is only being produced in limited numbers.

The 6800 Ultra and the X800 XT PE are both priced at $500; those are the cards I am comparing.

Perhaps you should do a bit more research??? ATI doesn't cheat at filtering; who gave you that idea? ;)

Oh, and you don't need more than 1,000 shader instructions to produce Finding Nemo, for example. Far Cry currently uses 96 shader instructions tops. That should say it all about how important the 65k shader instruction limit is ;)
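For perspective, the gap between those figures is easy to quantify. A quick sketch using only the numbers quoted in this thread (96 instructions for Far Cry's longest shader, 1,536 for ps_2_b, roughly 65,536 for SM3); treat the limits themselves as the thread's claims, not verified spec values:

```python
# Figures as quoted in the thread (not verified against the DX9 spec)
far_cry_longest = 96     # longest shader Far Cry reportedly uses
ps_2_b_limit = 1536      # ATI ps_2_b instruction limit per the thread
sm3_limit = 65536        # the "65k" SM3.0 limit per the thread

# How many times over the current longest shader each limit is
headroom_ati = ps_2_b_limit // far_cry_longest
headroom_sm3 = sm3_limit // far_cry_longest

print(headroom_ati)  # 16
print(headroom_sm3)  # 682
```

In other words, even the smaller limit leaves current games an order of magnitude of headroom, which is the point being made here.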
 
dude... 3Dc isn't the greatest thing, no, but if what I'm reading is right, SM3.0 hardly does a damn thing either... and IMHO HDR is about freakin' worthless. I still think the cards are about equal; they each have their trade-offs, but all these new "features" are hardly doing a thing as of yet. SM3.0 still hasn't matured; it will probably make the 6800 a better card in the future, but RIGHT NOW it's crap... it just adds a slight performance increase...
 
oqvist said:
Eeeeh, I am talking about the 6800 Ultra, not the $150 more expensive 6800 Ultra Extreme
not trying to get involved in the lil fight or w/e lol, but the EE is $550, only $50 more than the MSRP of the Ultra ;)
 
Try to keep in mind that in a little over 12 months, DX10 parts will be out, and this focus on 9.0b vs. 9.0c will be left in the dust.

Whatever features you get today will be performance-obsolete if you are a serious gamer.

If you're buying a video card to keep for a few years, I'd find a real nice and quiet one.
 
joobjoob said:
Try to keep in mind that in a little over 12 months, DX10 parts will be out, and this focus on 9.0b vs. 9.0c will be left in the dust.

Whatever features you get today will be performance-obsolete if you are a serious gamer.

If you're buying a video card to keep for a few years, I'd find a real nice and quiet one.

Isn't DX10 being held back until the next version of Windows (Longhorn) is ready to ship?
 
Perhaps this link will help you in your decision. Mind you, *I have no clue if all these scores are valid*, but I think most people who think the X800 XTs are crap are overhyping SM3.0 performance for today's apps. They are both great cards; keep that in mind. Look at what resolutions you want to play at and whether you play with AA and AF cranked up, then just decide; either card will make you happy. In fact, they are so close I'd rather go by IQ, noise level, heat, power consumption, free slots in the case of a double-slot cooling solution, and the like...

linkage: http://www.rage3d.com/board/showthread.php?t=33768218
 
Nah, I think the cheapest 6800 Ultra Extreme is at least $100 more expensive than the cheapest 6800 Ultra... At least here in Sweden there is a $150 difference, though I wonder if the Extremes will even arrive here before late August or September, since North America has priority.

The suggested retail price is at least $100 more for the Ultra Extreme.

And 3Dc normal maps are here to stay; that's why 3Dc will be of such importance.

Shader Model 3 has already been shown to benefit the 6800 Ultra cards. They are a lot closer to the X800 XT PE in Far Cry now than they were before. Without AA and aniso, the 6800 Ultra even beats the X800 XT PE frequently.

So Shader Model 3 isn't useless, even though I don't think it was worth spending 60 million transistors on for the 6800 cards. I am sure they could pull higher clocks without it, and if they went 24-bit like ATI they could very well outperform it anyway. But it's a cool feature and it does have its benefits. Whether it outweighs its cons is yet to be seen, though :).
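For anyone wondering why 3Dc is aimed specifically at normal maps: a unit normal only needs two stored components, and the shader rebuilds the third, which is why a two-channel compressed format loses so little quality. A minimal Python sketch of just that reconstruction step (the block compression itself is omitted, and this assumes tangent-space maps where Z is always positive):

```python
import math

def reconstruct_normal(x, y):
    """Rebuild the Z component of a unit normal from the two
    channels a 3Dc texture actually stores. Assumes a tangent-space
    normal map, where Z points out of the surface and is positive."""
    # Clamp in case compression error pushed x*x + y*y past 1.0
    z_squared = max(0.0, 1.0 - x * x - y * y)
    return (x, y, math.sqrt(z_squared))

# A mostly-flat normal: stored (0.1, 0.2) decodes to about (0.1, 0.2, 0.97)
n = reconstruct_normal(0.1, 0.2)
```

The trade-off is a square root (or a normalize) in the pixel shader in exchange for storing two channels instead of three.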
 
The 1.3 patch for Far Cry is gonna require 32-bit blending for the new effects put into that patch ;)
Where does that leave ATi? :)

Terra...
 
PS 3.0 will be good eventually, as will 3Dc! But at the moment, if I were you, I would either try to find a friend who knows a bit about computers, or just research it yourself and make up your own mind, because all you will get here at the moment, if you hadn't already noticed, is rampant fangirls trumpeting the 6800 as the second coming of Christ and SM3 as god :rolleyes:
 
Yeah, I think the new cards are pretty evenly balanced, and by that I mean anyone, except maybe extreme benchmarkers, will be happy with either choice of card.

SM3.0 is going to hit the mainstream soon, but not before SM2.0 has had its day. Nvidia's SM2.0 performance was crapola; I think it was a cynical move to simply jump ahead to SM3.0. That said, nothing is stopping ATI from slapping SM3.0 into the X800... oh yeah, Nvidia beware.

So it comes down to which company you think deserves your money more, really. :eek:
 
oqvist said:
Nah, I think the cheapest 6800 Ultra Extreme is at least $100 more expensive than the cheapest 6800 Ultra... At least here in Sweden there is a $150 difference, though I wonder if the Extremes will even arrive here before late August or September, since North America has priority.

The suggested retail price is at least $100 more for the Ultra Extreme.

http://www.evga.com/6800EE/default.asp
Like I said, $550.

Cost of the e-GeForce 6800 Ultra Extreme Edition is $549.99.

That's $50 more than the MSRP of the Ultra, and I think somebody over at nvnews has one.

@trapine said:
PS 3.0 will be good eventually, as will 3Dc! But at the moment, if I were you, I would either try to find a friend who knows a bit about computers, or just research it yourself and make up your own mind, because all you will get here at the moment, if you hadn't already noticed, is rampant fangirls trumpeting the 6800 as the second coming of Christ and SM3 as god :rolleyes:

wtf are you talking about, buddy? If you actually read the thread, nobody is being fanboy/girl-ish here. You always sarcastically exaggerate; this is like the 4th time you've said that "6800 fangirls" talk about the coming of Christ, godlike drivers, or the legend of the 6800. Nobody said the 6800 is the coming of any Christ, or anything remotely similar. Where do you get your information from? You never have any quotes or proof. I think this is the 2nd or 3rd time I've told you this; there's no inferiority complex here, I'm just sick of your smack, but you're obviously not getting it. I could pull a dozen quotes from you with crazy remarks. If anybody sounds like a fanboy here, well... it's you.

this is just real talk.
 
My last 2 vid cards have been ATI, and I have really been lovin' them.

However, the price/performance of a 6800GT looks VERY enticing. If I were in the market for a vid card right now, that's probably what I'd be looking at. There's a lot of room for driver improvement in the 6800 series.

It's interesting times in VidCard land indeed ;)
 
WalteRr said:
Uh, the 6800 Ultra EE beats the XT PE at almost all resolutions.

I think you're smoking something, since the 6800 Ultra EE literally beats the PE in almost every benchmark possible.

http://www.anandtech.com/video/showdoc.aspx?i=2102&p=5
http://www.anandtech.com/video/showdoc.aspx?i=2102&p=6

Yeah, the 6800U EE is really smoking the XT PE :rolleyes:

Those were two custom timedemos from Anand, not some demo made by NVidia. You'll find the 6800 Ultra EE at times faster, slower, and equal to the XT PE; not consistently faster.

Here is some more if you would like: http://www.firingsquad.com/hardware/far_cry_sm30/page5.asp
 
oqvist said:
If you look at Far Cry with the Shader Model 3 patch, the X800 XT PE still outperforms the 6800 Ultra, so it's not that important. Even though there are real performance gains, they still don't make the 6800 Ultra outperform the X800 XT PE. Not even with brilinear enabled.

OK, I just want to make a correction. The above statement is not correct; all the latest benchmarks show otherwise. Please ignore this misinformation.
 
Why do all these threads turn into a clusterfuck of OMG LOOK, THIS CARD RUNS FASTER; NO NO NO, LOOK AT THESE, THE OTHER CARD IS FASTER?

Guess what, people: BOTH cards perform VERY well. They are close enough that you could buy either one and be quite happy with your purchase.

Kudos to Nvidia for releasing a really good product, and kudos to ATI for maintaining a solid product line. Personally, I'm waiting for the generation after this one before I kick my 9700 Pro to the curb :)
 
And where is that? Show me a Far Cry benchmark where the Ultra consistently outperforms the XT PE at high res with high AA/AF.

Oh, you mean these?

Anandtech

1600x1200 -- 6800U Extreme sm3.0 vs. X800 XT PE

mp_airstrip 4x/8x AA/AF -- 51.8fps vs 59.40fps ATI Winner
mp_mangoriver 4x/8x AA/AF -- 49.9fps vs 55.9fps ATI Winner
Research 4x/8x AA/AF -- 50.7fps vs 49.3fps *Nvidia Winner
Training 4x/8x AA/AF -- 47.8fps vs. 48.9fps *ATI Winner
Regulator 4x/8x AA/AF -- 43.3fps vs. 38.9fps Nvidia Winner
Volcano 4x/8x AA/AF -- 50.7fps vs. 45.4fps Nvidia Winner

TechReport

1600x1200 -- 6800U sm3.0 vs. X800 XT PE

Research 4x/8x AA/AF -- 50.71fps vs 52.40fps *ATI Winner
Training 4x/8x AA/AF -- 39.49fps vs. 48.24fps ATI Winner
Regulator 4x/8x AA/AF -- 37.75fps vs. 38.33fps *ATI Winner
Volcano 4x/8x AA/AF -- 52.74fps vs. 51.78fps *Nvidia Winner

1024x768 -- 6800U sm3.0 vs. X800 XT PE

Research 4x/8x AA/AF -- 97.88fps vs 107.17fps ATI Winner
Training 4x/8x AA/AF -- 64.24fps vs. 62.25fps *Nvidia Winner
Regulator 4x/8x AA/AF -- 59.59fps vs. 53.30fps Nvidia Winner
Volcano 4x/8x AA/AF -- 102.93fps vs. 105.33fps ATI Winner

FiringSquad

1600x1200 -- 6800U Extreme sm3.0 vs. X800 XT PE

monkeybay 4x/8x AA/AF -- 49.5fps vs. 50.6fps *ATI Winner
Research 4x/8x AA/AF -- 44.8fps vs. 51.3fps ATI Winner
Training 4x/8x AA/AF -- 42.2fps vs. 50.1fps ATI Winner
Regulator 4x/8x AA/AF -- 38.5fps vs. 40.4fps *ATI Winner
Volcano 4x/8x AA/AF -- 45.2fps vs. 47.8fps *ATI Winner

1024x768 -- 6800U Extreme sm3.0 vs. X800 XT PE

monkeybay 4x/8x AA/AF -- 84.7fps vs 85.1fps *ATI Winner
Research 4x/8x AA/AF -- 83.7fps vs. 108.4.3fps ATI Winner
Training 4x/8x AA/AF -- 80.5fps vs. 95.0fps ATI Winner
Regulator 4x/8x AA/AF -- 38.5fps vs. 40.4fps *ATI Winner
Volcano 4x/8x AA/AF -- 86.2fps vs 97.2fps ATI Winner

Xbit Labs

1600x1200 -- 6800U sm3.0 vs. X800 XT PE

Pier 4x/16x AA/AF -- 41.4fps vs. 49.6fps ATI Winner
Catacombs 4x/16x AA/AF -- 67.6.4fps vs. 75.9fps ATI Winner
Research 4x/16x AA/AF -- 50.9fps vs. 59.5fps ATI Winner
Training 4x/16x AA/AF -- 50.4fps vs. 60.9fps ATI Winner
Regulator 4x/16x AA/AF -- 40.1.5fps vs. 43.6fps *ATI Winner
Volcano 4x/16x AA/AF -- 50.8.2fps vs. 53.2fps ATI Winner

1024x768 -- 6800U sm3.0 vs. X800 XT PE

Pier 4x/16x AA/AF -- 77.7fps vs. 84.6fps ATI Winner
Catacombs 4x/16x AA/AF -- 126.2fps vs. 124.1fps *Nvidia Winner
Research 4x/16x AA/AF -- 95.7fps vs. 104.6.3fps ATI Winner
Training 4x/16x AA/AF -- 105.7fps vs. 126.1fps ATI Winner
Regulator 4x/16x AA/AF -- 75.4fps vs. 71.2fps Nvidia Winner
Volcano 4x/16x AA/AF -- 95.3fps vs 106.3fps ATI Winner

Results are almost on par with FiringSquad's, with ATI dominating.

Result: ATI Winner
 
Due to limited supplies on Extreme Edition cards and overwhelming demand, eVGA.com will offer a weekly drawing to purchase this very popular card. Drawings will begin on a weekly basis on June 4th, 2004. All fields are required.

Cost of the e-GeForce 6800 Ultra Extreme Edition is $549.99

What the hell are you smokin', bad_boy? :rolleyes: You have to enter a DRAWING just to be the lucky one allowed to buy one. That's NOT RETAIL; you aren't likely to ever get your hands on the damned thing.
The rest speaks for itself.
The only 2 cards in any real circulation are the 6800GT and the X800 Pro; what you choose is up to you.
Both are good cards and both have their strong points, i.e. X800 = 3Dc, 6800GT = SM3.
(May the force be with you :p)
 
The question at hand is: is PS 3.0 important NOW?

The answer is no, of course not; we don't even have it in any games yet.

The only one PS 3.0 is important to right now is Nvidia. It will hopefully eventually bring the 6800s on par with the X800s. But it isn't far behind; most gamers wouldn't even know the difference between 85fps and 95fps. So until there are some huge speed or IQ gains (which should come in the future, with a lot of game developers jumping into bed with Nvidia), it doesn't matter much.

So right now, no, it's not important; in the future, possibly.
 
Stanley Pain said:
Why do all these threads turn into a clusterfuck of OMG LOOK, THIS CARD RUNS FASTER; NO NO NO, LOOK AT THESE, THE OTHER CARD IS FASTER?

Guess what, people: BOTH cards perform VERY well. They are close enough that you could buy either one and be quite happy with your purchase.

Kudos to Nvidia for releasing a really good product, and kudos to ATI for maintaining a solid product line. Personally, I'm waiting for the generation after this one before I kick my 9700 Pro to the curb :)

No, no no....We CAN'T all be friends!!!
 
The title of this thread is PS 3.0 Important now?

It's not a thread about whether someone is or is not a fanboy. Keep all of the posts in this forum focused on the hardware and its function, not on each other. If you are not here to discuss the posted topic, do not post here.
 
joobjoob said:
Try to keep in mind that in a little over 12 months, DX10 parts will be out, and this focus on 9.0b vs. 9.0c will be left in the dust.

Whatever features you get today will be performance-obsolete if you are a serious gamer.

If you're buying a video card to keep for a few years, I'd find a real nice and quiet one.

I think DX10 is still a long way off. Keeping that in mind should stop you from upgrading :p right?
I cannot say how good SM3.0 is, but when the 9700 came out with the first DX9 support, it blew the competition away. So there's that.
SLI, only available for PCI-E cards? Maybe we can get two AGP slots.
 
While the importance of SM3.0 might be questionable in the IQ department, it'll definitely start making a difference in the speed department.

As early benchmarks are showing, SM3.0 is giving the 6800s a decent boost in performance while maintaining the quality we expect from SM2.0.

Will we see more games coming out with SM3.0 used as an optimization technique? Who knows. It'll be interesting to see how this all plays out.
 
Hitokiri Batohsai said:
I think DX10 is still a long way off. Keeping that in mind should stop you from upgrading :p right?
I cannot say how good SM3.0 is, but when the 9700 came out with the first DX9 support, it blew the competition away. So there's that.
SLI, only available for PCI-E cards? Maybe we can get two AGP slots.

2 AGP slots is not possible.
 
If it were my $500, I would get a 6800GT once they hit $299, then wait it out to see what else, if anything, comes out in the next 6-9 months. I am sure there is going to be something brand new from ATi that will probably be supercharged, maybe 32 pipelines? ;)
After those monsters come out, I would sell my 6800GT and use the leftover $150 toward buying one of them. I don't think it's wise to buy cutting-edge stuff anymore. I picked up a FireGL X1 for $89 on eBay; it's just a Radeon 9700 Pro with dual DVI, and it's very fast for me. I admit I am not a hardcore gamer, but from all the reviews the 9700 Pro seems to do well enough. You get my idea, though.
 
From Tom's hardware
"NVIDIA and Crytek have shown that SM 3.0 can indeed lead to performance improvements over an SM 2.0 implementation. This conclusion flies in the face of what ATi has been preaching for the last few months. In many of our FarCry tests, the NVIDIA cards can now pull ahead of their rivals from ATi. The GeForce 6800 GT makes an especially convincing appearance here, especially considering that, at $399, it costs just as much as its direct competitor the X800 Pro."
 
You guessed it :p Now I wonder how 3Dc will fare when Tom's reviews it. Yes, SM3 will do something, just as 3Dc will, but no one knows until the Far Cry 1.3 patch is out and both cards are running their respective technologies! What I am interested in seeing is which card takes the larger hit running its new tech.
 
scott122 said:
From Tom's hardware


I can already predict the ATI folks here: "Tom's Hardware is not valid" :rolleyes:

Believe me, it doesn't take an ATI fan to call Tom's Hardware not valid; it's a pretty well-known assessment across the forums.

Look at Tom's results compared to everyone else in the world, lol.
Anandtech

1600x1200 -- 6800U Extreme sm3.0 vs. X800 XT PE

mp_airstrip 4x/8x AA/AF -- 51.8fps vs 59.40fps ATI Winner
mp_mangoriver 4x/8x AA/AF -- 49.9fps vs 55.9fps ATI Winner
Research 4x/8x AA/AF -- 50.7fps vs 49.3fps *Nvidia Winner
Training 4x/8x AA/AF -- 47.8fps vs. 48.9fps *ATI Winner
Regulator 4x/8x AA/AF -- 43.3fps vs. 38.9fps Nvidia Winner
Volcano 4x/8x AA/AF -- 50.7fps vs. 45.4fps Nvidia Winner

No 1024x768 numbers available

To summarize, for the 4x/8x tests, both cards are tied with 3 wins apiece. Also keep in mind that it's a 6800UE, not a 6800U, being compared here.

Result: Tie
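That tie can be checked mechanically. A quick tally of the Anandtech rows exactly as quoted above (first number is the 6800U Extreme, second is the X800 XT PE, both in fps):

```python
# Anandtech 1600x1200 4x/8x AA/AF numbers as listed in this post
anandtech = {
    "mp_airstrip": (51.8, 59.4),
    "mp_mangoriver": (49.9, 55.9),
    "Research": (50.7, 49.3),
    "Training": (47.8, 48.9),
    "Regulator": (43.3, 38.9),
    "Volcano": (50.7, 45.4),
}

# Count how many demos each card wins outright
nv_wins = sum(1 for nv, ati in anandtech.values() if nv > ati)
ati_wins = sum(1 for nv, ati in anandtech.values() if ati > nv)

print(nv_wins, ati_wins)  # 3 3
```

Three wins apiece, as stated; the same tallying applies to the other sites' numbers below.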


TechReport

1600x1200 -- 6800U sm3.0 vs. X800 XT PE

Research 4x/8x AA/AF -- 50.71fps vs 52.40fps *ATI Winner
Training 4x/8x AA/AF -- 39.49fps vs. 48.24fps ATI Winner
Regulator 4x/8x AA/AF -- 37.75fps vs. 38.33fps *ATI Winner
Volcano 4x/8x AA/AF -- 52.74fps vs. 51.78fps *Nvidia Winner

1024x768 -- 6800U sm3.0 vs. X800 XT PE

Research 4x/8x AA/AF -- 97.88fps vs 107.17fps ATI Winner
Training 4x/8x AA/AF -- 64.24fps vs. 62.25fps *Nvidia Winner
Regulator 4x/8x AA/AF -- 59.59fps vs. 53.30fps Nvidia Winner
Volcano 4x/8x AA/AF -- 102.93fps vs. 105.33fps ATI Winner

As you can see, on the surface both cards split evenly at 1024. However, at 1600, ATI wins 3 out of 4 times.

Result: ATI Winner

FiringSquad

1600x1200 -- 6800U Extreme sm3.0 vs. X800 XT PE

monkeybay 4x/8x AA/AF -- 49.5fps vs. 50.6fps *ATI Winner
Research 4x/8x AA/AF -- 44.8fps vs. 51.3fps ATI Winner
Training 4x/8x AA/AF -- 42.2fps vs. 50.1fps ATI Winner
Regulator 4x/8x AA/AF -- 38.5fps vs. 40.4fps *ATI Winner
Volcano 4x/8x AA/AF -- 45.2fps vs. 47.8fps *ATI Winner

1024x768 -- 6800U Extreme sm3.0 vs. X800 XT PE

monkeybay 4x/8x AA/AF -- 84.7fps vs 85.1fps *ATI Winner
Research 4x/8x AA/AF -- 83.7fps vs. 108.4.3fps ATI Winner
Training 4x/8x AA/AF -- 80.5fps vs. 95.0fps ATI Winner
Regulator 4x/8x AA/AF -- 38.5fps vs. 40.4fps *ATI Winner
Volcano 4x/8x AA/AF -- 86.2fps vs 97.2fps ATI Winner

Are these numbers right?? Methinks something's wrong here.

Result: ATI Winner

***ADDED 07/07/2004 11:04AM EST

Xbit Labs

1600x1200 -- 6800U sm3.0 vs. X800 XT PE

Pier 4x/16x AA/AF -- 41.4fps vs. 49.6fps ATI Winner
Catacombs 4x/16x AA/AF -- 67.6.4fps vs. 75.9fps ATI Winner
Research 4x/16x AA/AF -- 50.9fps vs. 59.5fps ATI Winner
Training 4x/16x AA/AF -- 50.4fps vs. 60.9fps ATI Winner
Regulator 4x/16x AA/AF -- 40.1.5fps vs. 43.6fps *ATI Winner
Volcano 4x/16x AA/AF -- 50.8.2fps vs. 53.2fps ATI Winner

1024x768 -- 6800U sm3.0 vs. X800 XT PE

Pier 4x/16x AA/AF -- 77.7fps vs. 84.6fps ATI Winner
Catacombs 4x/16x AA/AF -- 126.2fps vs. 124.1fps *Nvidia Winner
Research 4x/16x AA/AF -- 95.7fps vs. 104.6.3fps ATI Winner
Training 4x/16x AA/AF -- 105.7fps vs. 126.1fps ATI Winner
Regulator 4x/16x AA/AF -- 75.4fps vs. 71.2fps Nvidia Winner
Volcano 4x/16x AA/AF -- 95.3fps vs 106.3fps ATI Winner

Results are almost on par with Firingsquad, with ATI dominating.

Result: ATI Winner


***ADDED 07/07/2004 1:36AM EST


Hot Hardware

1600x1200 -- 6800GT sm3.0 vs. X800 Pro

Research 4x/16x AA/AF -- 32.26fps vs. 35.24fps ATI Winner
Training 4x/16x AA/AF -- 29.33fps vs. 35.41fps ATI Winner
Regulator 4x/16x AA/AF -- 26.45fps vs. 28.52fps *ATI Winner
Volcano 4x/16x AA/AF -- 33.95fps vs. 34.79fps *ATI Winner

1024x768 -- 6800GT sm3.0 vs. X800 Pro

Research 4x/16x AA/AF -- 57.45fps vs. 75.25fps ATI Winner
Training 4x/16x AA/AF -- 51.49fps vs. 60.87fps ATI Winner
Regulator 4x/16x AA/AF -- 46.92fps vs. 48.45fps *ATI Winner
Volcano 4x/16x AA/AF -- 60.63fps vs 70.54fps ATI Winner

Here are the results for the first GT/Pro comparison. At the end of the day, at the highest resolution, both cards are about equally fast. Unfortunately, I cannot say the same at the lower res.

Result: ATI Winner

***ADDED 08/07/2004 7:55AM EST


TomsHardware

1280x1024 -- 6800U sm3.0 vs. X800 XT PE

Cooler 4x/4x AA/AF -- 54.8fps vs. 51.8fps *Nvidia Winner
Research 4x/4x AA/AF -- 64.7fps vs. 63.7fps *Nvidia Winner
Training 4x/4x AA/AF -- 60.7fps vs. 63.9fps ATI Winner
Regulator 4x/4x AA/AF -- 52.8fps vs. 51.9fps *Nvidia Winner
Volcano 4x/4x AA/AF -- 63.8fps vs. 63.1fps *Nvidia Winner


1024x768 -- 6800U sm3.0 vs. X800 XT PE

Cooler 4x/4x AA/AF -- 78.4fps vs. 75fps *Nvidia Winner
Research 4x/4x AA/AF -- 92.7fps vs. 94.4fps *ATI Winner
Training 4x/4x AA/AF -- 78.3fps vs. 78.4fps *ATI Winner
Regulator 4x/4x AA/AF -- 75.9fps vs. 70.2fps Nvidia Winner
Volcano 4x/4x AA/AF -- 91.6fps vs 86.9fps Nvidia Winner
 
Terra said:
The 1.3 patch for FarCry is gonna require 32bit blending for the new effects put into that patch ;)
Where does that leave ATi? :)

Terra...

That still leaves nVidia in the pants of CryTek. I seem to remember a little slip of paper in my Far Cry box; just wondering why nobody mentions this. My Far Cry box had a fucking nVidia ad in it. Are you seriously trying to tell me that Far Cry is completely non-nVidia-biased? Somebody got some money somewhere for putting that little "The way it's meant to be played" ad in there, and if you ask me, it's showing.
:rolleyes:
 
nweibley said:
That still leaves nVidia in the pants of CryTek. I seem to remember a little slip of paper in my Far Cry box; just wondering why nobody mentions this. My Far Cry box had a fucking nVidia ad in it. Are you seriously trying to tell me that Far Cry is completely non-nVidia-biased? Somebody got some money somewhere for putting that little "The way it's meant to be played" ad in there, and if you ask me, it's showing.
:rolleyes:

That's the bad part about all of this: now ATi will start dumping money to get ATi-exclusive effects into games, and before it's over, games will look one way on one card and a different way on another. It's no longer a battle of speed or IQ; it's a battle of who gives more money to the game developers.
 
@trapine said:
What the hell are you smokin', bad_boy? :rolleyes: You have to enter a DRAWING just to be the lucky one allowed to buy one. That's NOT RETAIL; you aren't likely to ever get your hands on the damned thing.
The rest speaks for itself.
The only 2 cards in any real circulation are the 6800GT and the X800 Pro; what you choose is up to you.
Both are good cards and both have their strong points, i.e. X800 = 3Dc, 6800GT = SM3.
(May the force be with you :p)
Unless I'm forgetting something, I never said the 6800 Extreme Edition was easy to obtain; if you have a quote proving that I did, you are welcome to post it.

Anyways, to get myself back on topic... Do I think it's important now? No, because it can't be important at a time when we don't have it. There are really no games that have it at this specific moment, and until then, I'll wait for it to show its importance. You can't really say something is important when it's not there. For example, flying cars are not important, because we don't have them; but when we do get them, they should show their importance when needed (i.e. faster travel). If that made any sense, lol.
 
creedAMD said:
That's the bad part about all of this: now ATi will start dumping money to get ATi-exclusive effects into games, and before it's over, games will look one way on one card and a different way on another. It's no longer a battle of speed or IQ; it's a battle of who gives more money to the game developers.



That's true, but surely ATI and especially nVidia (with chipsets to also worry about), can't afford to dump money into every game developer.
 
PS 3.0 *might* speed up very complex DX9.0c games, but possibly not enough to topple ATi.

So far, with the PS 3.0 code in Far Cry, the 6800 Ultra is still playing catch-up to the higher-clocked X800 XT. In PS 2.0 under DX9.0, the X800 XT is still a fair chunk faster than the 6800 Ultra. I would have thought that if Nvidia wanted Far Cry as their showcase piece, they would have waited and released a patch that definitively put them in first place.

"Many hands make light work" is a perfectly appropriate saying here. ATi has a card that runs shorter shaders but runs them faster (a hand shovel, say). Nvidia has a card that runs longer shaders but runs them slower (a foot shovel). At the end of the day, on long PS 3.0 shaders (deep holes) Nvidia and ATi are about tied for speed, but on short shaders Nvidia is slower because everything still goes through the long-instruction pipe (why dig a deep hole when you only need a small one)...

So, *puts on flamesuit*, ATi's decision to stay with 2.0 may be better. (Nvidia's CEO must be wondering where the performance from the 60 million extra transistors has disappeared to.)
 
jarman said:
That's true, but surely ATI, and especially nVidia (with chipsets to also worry about), can't afford to dump money into every game developer.

Much like Nvidia sponsoring, and being sponsored by, games?

"Nvidia. The way it's meant to be played."
 