New 2900XT benchmark (CF included) _official

andy_tok

n00b
Joined
Apr 17, 2007
Messages
14
[IMG]http://bbs.ocfantasy.com/attachments/1_ALpYcLeooTo7.jpg[/IMG]
[IMG]http://bbs.ocfantasy.com/attachments/2_znTgMOzZDkPn.jpg[/IMG]
[IMG]http://bbs.ocfantasy.com/attachments/3_D58D3FjroheL.jpg[/IMG]
[IMG]http://bbs.ocfantasy.com/attachments/4_RimY857rCVoq.jpg[/IMG]

get your own bandwidth - [COLOR="Red"]Dr.Evil[/COLOR]

from bbs.ocfantasy.com
 
Going to the website is worthless because it's all in Chinese, but this is interesting, especially the DX10 stuff. The note at the end about physics-based particles is interesting as well, but it's still too early to see the truth through all the smoke and mirrors.
 
Yeah, I tried that, but it's still hard to read stuff like "AMD ah, X2 can easily get rid of meat? "

what does that mean? lol
lol, "meat" is a common alias for Conroe/Core 2 in China.
He/she was saying that the X2 is better than Core 2 in some cases ;)
 
I'm a little skeptical of that, especially the last part there. I believe the marks will be similar, at least for the first few benchmarks. We'll see though! Thanks for posting, I'm getting pumped about going Red before too long.
 
If these numbers are correct, I think it's obvious this GPU was made for DX10. DX9 numbers look average, not much faster than R580.
 
lol @ 8800gts...which one?

Anyway, the CF scores are impressive, but notice no GTX scores to compare it with...
 
I don't know if these are legit or not, but something in the comments translated from Chinese just plain frightens me..

In addition, the ancient remains 4X series diesel costs ...

Apparently, the R600 series uses 4 times as much diesel fuel as the 8800 series. That is going to be a real problem with gas prices going up and possible disruptions in refining due to political instability in the Middle East. Also, it is just plain bad for the environment.

Not to mention that my PC's power supply isn't set up to take diesel, only regular unleaded! I'm going to have to lower the overclock on my meat or something.

Also, don't those Call of Juarez numbers seem sort of low for that resolution?
 
LMAO. Classic.
The American version will have an adapter to allow the use of regular unleaded. And just so you are aware, overclocking your meat is illegal in 31 states.

And these benchmarks are obviously erroneous. How can two cards that score so closely in 3DMark have such different results in real games? /sarcasm
 
lol @ 8800gts...which one?

Anyway, the CF scores are impressive, but notice no GTX scores to compare it with...

I think that is because "supposedly" the XT will be competing with the GTS, while the XTX will be competing with the GTX, "supposedly"... We will see what the actual retail price is when they hit the shelves...
The numbers seem good, but I'm waiting for some of the sites I trust to put their reviews up prior to making any further comments...
 
hmm, by overclocking meat do they mean Viagra :)

it's a gas-guzzling, Viagra-pimping machine!
 
I hate to say this, but "guzzling" usually has a different connotation when used in sentences with Viagra... :( :eek: This concludes today's lesson in the inner workings of a college male's mind.
 
CrossFire almost doubling the SLI numbers in most games, and especially in OpenGL games... You have to be kidding me :rolleyes: Even if it is an 8800 320MB (assuming) that they measured against, it's still seriously erroneous looking.
 
Man CF looks so good... too bad I have yet to see a good MB that supports it.
Maybe I am wrong...
 
Anyway, the CF scores are impressive, but notice no GTX scores to compare it with...
The 2900XT is not meant to challenge the GTX and I believe that is why it was excluded from the benchmarking. The XTX model is intended to compete with the GTX but it has been delayed to at least Q3. Hopefully it will benefit from a die shrink and the accompanying clockspeed increase it will engender, because it will be dealing with more than the GTX by then. I could wish for additional TMUs but that is fanciful dreaming for this year.

If these numbers are correct, I think it's obvious this GPU was made for DX10. DX9 numbers look average, not much faster than R580.
TMK, DX10 titles won't be available before fall. Thus, any architectural optimizations in favor of DX10 won't be realized until then. By that time however, competitors with likely improved optimizations of their own will have entered the market. I don't think ATI is going to benefit much from any advantage in DX10 when the market actually sees the titles available on the shelves, whatever advantage it may theoretically possess at this juncture. ATI needs to time the release of their products according to market realities.
 
I don't know if these are legit or not, but something in the comments translated from Chinese just plain frightens me..

In addition, the ancient remains 4X series diesel costs ...

Apparently, the R600 series uses 4 times as much diesel fuel as the 8800 series. That is going to be a real problem with gas prices going up and possible disruptions in refining due to political instability in the Middle East. Also, it is just plain bad for the environment.
lol, funny translation.
"ancient remains 4X series"= "Elder Scrolls 4:Oblivion"
"diesel costs"="graphics-hungry"
The comment was referring to the low FPS in Elder Scrolls 4:Oblivion...
:p :D
 
The 2900XT is not meant to challenge the GTX and I believe that is why it was excluded from the benchmarking. The XTX model is intended to compete with the GTX but it has been delayed to at least Q3. Hopefully it will benefit from a die shrink and the accompanying clockspeed increase it will engender, because it will be dealing with more than the GTX by then. I could wish for additional TMUs but that is fanciful dreaming for this year.

TMK, DX10 titles won't be available before fall. Thus, any architectural optimizations in favor of DX10 won't be realized until then. By that time however, competitors with likely improved optimizations of their own will have entered the market. I don't think ATI is going to benefit much from any advantage in DX10 when the market actually sees the titles available on the shelves, whatever advantage it may theoretically possess at this juncture. ATI needs to time the release of their products according to market realities.

Good points here. In my opinion, these benchmarks are only really relevant to people who will be in the market for an upgrade for DX9 games in May. Most people are gearing up for DX10 games, although there are people who are looking for a boost in Oblivion, Supreme Commander, Vegas, etc. Everyone wants to know if they can have their cake (DX9 performance today) and eat it too (DX10 performance tomorrow), but you don't really know until the games come out or we get some kind of demo.

What we all really want to know is what resolutions and settings these cards can handle in Crysis, UT3, BioShock, Episode 2, Team Fortress 2, etc. I think a lot of the anxiety is driven by the investments people have made in LCD monitors.

I've come to the conclusion that at 1680 x 1050, which is the native resolution of my monitor, I really need 4xAA to get an optimal experience. Also, I want 50+ fps. I can't drop the resolution or I lose the benefit of running at a native resolution, and I don't want to go under 4xAA because of the jaggies. In the old days of CRT monitors you had a real range of resolutions to play with, but this is the HD generation and those of us with HD-capable displays want the native resolution.

A few generations ago if you said you wanted 1600 x 1200 4xAA 8xAF you were basically asking for the moon and had to pay accordingly. Now we really need that performance for HD displays. I really hope I don't need CrossFire to power a freakin' 20.1-inch widescreen LCD. I'll skip a whole generation of cards and just wait to play those games if I have to.
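Just to put rough numbers on that (my own back-of-the-envelope arithmetic, nothing from these charts): 1680 x 1050 is actually slightly fewer pixels per frame than the old 1600 x 1200 target, so the raw resolution demand hasn't really grown; it's the AA samples and heavier shaders on top that make HD widescreen feel so expensive.

[CODE]
# Back-of-the-envelope pixel math (illustrative only, not taken from the benchmark charts)
resolutions = {
    "1600x1200 (old CRT target)": (1600, 1200),
    "1680x1050 (20.1in widescreen)": (1680, 1050),
    "1920x1200 (24in widescreen)": (1920, 1200),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels per frame")

# 4x MSAA stores roughly 4 coverage/colour samples per pixel, so the sample
# count the card has to resolve at 1680x1050 looks more like this:
w, h = resolutions["1680x1050 (20.1in widescreen)"]
print(f"1680x1050 with 4xAA: ~{w * h * 4:,} samples per frame")
[/CODE]

So a 20.1in widescreen isn't actually asking for more pixels than 1600 x 1200 was; the "asking for the moon" part is the 4xAA/8xAF you want stacked on top of it.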
 
The fact that there are DX10 games coming out within the year is enough to steer people toward purchasing a better-performing DX10 GPU, that is, if the R600 is significantly faster in such environments. If ATI/AMD can prove that the R600 is significantly more 'future proof' than the G80 series, then it is definitely an "advantage".
 
I've come to the conclusion that at 1680 x 1050, which is the native resolution of my monitor, I really need 4xAA to get an optimal experience. Also, I want 50+ fps. I can't drop the resolution or I lose the benefit of running at a native resolution, and I don't want to go under 4xAA because of the jaggies. In the old days of CRT monitors you had a real range of resolutions to play with, but this is the HD generation and those of us with HD-capable displays want the native resolution.

I'll skip a whole generation of cards and just wait to play those games if I have to.

I agree 100%.
I don't have the best computer right now, but I have used 1680 x 1050 with 4xAA, 16xAF and all the other bells and whistles turned up to full, and gaming at anything less than that is just not good enough anymore.
Sadly, since my computer can't handle today's super-busy games, I find myself playing with max settings less and less just to increase FPS.
 
LOL at 8xAA being "max image quality" - I play 1600x1200 with 8xAF and 16xAA, and R600 had better go higher than that. 8xAA is OK, 4x is acceptable, anything lower means "wait for better hardware to come out"
 
LOL at 8xAA being "max image quality" - I play 1600x1200 with 8xAF and 16xAA, and R600 had better go higher than that. 8xAA is OK, 4x is acceptable, anything lower means "wait for better hardware to come out"
The R600 is rumored to be capable of 24xAA, but no idea at what level of performance.
 
The R600 is rumored to be capable of 24xAA, but no idea at what level of performance.

Why is 24x AA even needed?

Why are they not making video cards that run at 8x AA natively by default? Further to that point, why are games not coded better to reduce the need for AA?
 
Why is 24x AA even needed?

Why are they not making video cards that run at 8x AA natively by default? Further to that point, why are games not coded better to reduce the need for AA?

I don't see the need for 24x AA either. I find 4x gets rid of the jaggies perfectly. 24x would only look better in screenshots.

But how do you code a game better to reduce the need for AA? I don't see how you could. You only really see jaggies on contrasting colours. The only way I can see you getting around needing AA would be to have an insanely high resolution and a very small dot pitch on the monitor. :confused:
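For what it's worth, all the AA modes are approximations of the same brute-force idea: supersampling. Render more samples per pixel than you display and average them down, so an edge between contrasting colours blends through intermediate shades instead of stair-stepping. Here's a tiny sketch of that idea in Python (my own illustration of plain ordered-grid supersampling, not how R600 or G80 actually implement their AA modes):

[CODE]
def downsample_2x2(hi_res):
    # Average each 2x2 block of a frame rendered at 2x the target resolution
    # into one output pixel: 4 samples per final pixel, i.e. basic 4x
    # ordered-grid supersampling.
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[y]), 2):
            total = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(total / 4.0)
        out.append(row)
    return out

# A hard black/white diagonal edge rendered at 2x: after downsampling,
# the edge pixels land on in-between greys instead of jumping 0 -> 255.
hi = [
    [255, 255, 255, 255],
    [  0, 255, 255, 255],
    [  0,   0, 255, 255],
    [  0,   0,   0, 255],
]
print(downsample_2x2(hi))  # [[191.25, 255.0], [0.0, 191.25]]
[/CODE]

Which is also why the "insanely high resolution and small dot pitch" idea works: at some point the display is so dense that your eyes do the averaging, and you stop needing the card to do it.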
 
How do you suppose they do that?

I'm not sure... better shading?
All I know is that in some games you really notice the jaggies and need to turn the AA up, and in other games the jaggies are not so bad.
I figured that it came down in part to the game coding.
 
Well, everyone has their own subjective standards with respect to what good image quality is. For me it is a question of reducing jaggies to the point where they really aren't a factor and you can appreciate the art style of the particular game. I find 4x is the point where I am no longer distracted by the jaggies and start really admiring the more subtle details in the game. I found this was especially true with Half-Life 2: Episode One and Battlefield 2.

I don't remember the last time a "next gen card" really made a "next gen game" look all that great. I remember the 9700 Pro was just what the doctor ordered for BF1942 but couldn't quite cut it at high resolutions with AA. Same goes for the 9800 Pro with Doom 3 and Half-Life 2.

It seems like the cards that can run games at the settings that really make them, subjectively, look really good come out after the games are released, and almost never before. (at least games based on fresh "engines") Considering all the DX10 games that are coming out at the end of the year and through 2008, I think upgrading in May is jumping the gun, especially if you are a student like me and have to scrape funds together for hardware.

My biggest complaint about PC gaming is that I almost never have the optimal hardware to play a next gen game the first time I play it. By the time there is affordable hardware that makes a game look good, I've already played the game!
 
The numbers simply don't look right; the XT is ahead of the GTS, but not THAT much. 3x in the HL2 prison test just doesn't look possible. And the GTS shouldn't get such abysmal scores in the DX10 tests either. There's no way NV would let their first DX10 part do this badly, otherwise come Crysis the 8800's sales are going to dive face-first into the ground.

About the gas thing:
上古4X系列依然是'费'柴 (roughly: "the Elder Scrolls 4 series is still 'fei' chai")
The guy had a typo (easy to happen for people who only know SC typing on a forum). He was trying to say that it was crap, since "fei chai" means useless firewood, slang for crap.

As for the meat:
Conroe is pronounced exactly like "soy-sauce meat stew" in Mandarin, hence it's often referred to that way by people there.
 
Well, the GTS is on the market right now, so it should be a simple matter to confirm whether the GTS numbers are reasonable or not. I'm assuming that this is the 640MB GTS in these benchmarks. I think we can make that assumption because in these charts the GTS does prevail in a few benchmarks, and that could be attributed to the extra memory. I would expect the extra memory to be a factor at some point.

On the other hand if this were the smaller GTS 320MB I would expect the 2900XT to beat it in every benchmark.

Edit: Would anyone like to speculate as to why the GTS would beat the 2900XT in Serious Sam 2 by a large margin, but the 2900XT beats the GTS in Far Cry? Strange.
 
Because all these "benchmarks" were released before May 2nd, I'm reserving my judgment until then.
 

Because all these "benchmarks" were released before May 2nd, I'm reserving my judgment until then.

Agreed.
 