New 2900XT benchmark (CF included) _official

what are the core/mem clocks?
From what I've seen so far it takes higher clocks on the R600.
Never mind, found the source here




 
again, single against single is NOT impressive. i might just completely skip this gen out of frustration and just get an x1950 pro to hold me over.
 
based on those specs, especially the 47 gigapixels/sec, the 2900 XT should be completely owning the 8800 GTX. also, if it's supposed to be unified, why list the number of texture units as 16? am i missing something?


***edit:
X2900 XT/XTX = 16 ROPS, 16 Texture Units
8800 GTX = 24 ROPS, 32 Texture Units

no wonder the R600 stinks.
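A quick back-of-the-envelope check shows why the 47 gigapixels/sec figure doesn't add up against those unit counts. This is just a sketch: the ~742 MHz R600 core clock and ~575 MHz G80 core clock plugged in below are taken from the rumored specs and are assumptions, not confirmed numbers.

```python
# Theoretical fill rates from unit counts and core clock.
# Clock figures are assumptions from rumored specs, not confirmed.

def pixel_fill_gpix(rops, core_mhz):
    """Peak pixel fill rate in gigapixels/sec: one pixel per ROP per clock."""
    return rops * core_mhz / 1000.0

def texel_fill_gtex(tmus, core_mhz):
    """Peak texel fill rate in gigatexels/sec: one texel per TMU per clock."""
    return tmus * core_mhz / 1000.0

# 2900 XT: 16 ROPs, 16 TMUs, ~742 MHz core (assumed)
print(pixel_fill_gpix(16, 742))   # ~11.9 Gpix/s, nowhere near 47
print(texel_fill_gtex(16, 742))   # ~11.9 Gtex/s

# 8800 GTX: 24 ROPs, 32 TMUs, ~575 MHz core (assumed)
print(pixel_fill_gpix(24, 575))   # 13.8 Gpix/s
print(texel_fill_gtex(32, 575))   # 18.4 Gtex/s
```

So the 47 figure is likely a typo or a different metric entirely; ROPs times clock for either card lands in the low teens.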
 
Hardly official. And already posted several times... please look before posting. ;)
 
Let's not forget that the 2900XT is not the top of the line compared to the 8800GTS or GTX or G80 or whatever. (I'm a little tired.) First off, we have absolutely no idea if this is official; second, drivers are still being optimized; and third, no one has even gotten a review of this to even start talking smack or pointing fingers. Let's be patient and see what November has to offer us, and if the R600s are faster and better than the G80s, then so be it.

Meditate with me; Ohhhhmmmmmm. Ohhhhhhmmmm
 

If I meditate hard enough, will your subliminal advertising overwhelm me?
 

HAHAHHAHA, I've linked a few friends to look VERY closely at your post :D

On track though: If all these 'official' reviews pan out... I'm very underwhelmed so far... I'm getting sick of the RMA game with my 7800GTX OC, and I can off it to a friend who doesn't care about fan noise, snatch some mid-range DX10 card, and change to Vista without too much trouble. At least ATI's Vista drivers aren't crap... I was hoping their DX10 hardware wouldn't be either...
 

Unfortunately I also like to include my own subliminal advertising. ;)
 

I had some friends who are members here quote your post in a message to see if they could figure it out because they couldn't find it otherwise, and one guy said it looked like a penis and balls... wtf?
 

lol, ok that is just weird. I've started putting small minimalistic messages in my posts just like Steve does. Unfortunately, I can't resist anymore even in this post. :p It is fun though! :D
 

again, nice.
 
Yeah, hopefully if the drivers are optimized correctly (which they should be) then we'll see ATI win against Nvidia. It works like a pendulum: back and forth, back and forth.

Well it took them long enough to get the cards out, I hope the drivers work fully from the get-go; none of this nVidia non-SLI support bullcrap.
 
LOL at 8xAA being "max image quality" - I play 1600x1200 with 8xAF and 16xAA, and R600 had better go higher than that. 8xAA is OK, 4x is acceptable, anything lower means "wait for better hardware to come out"

When you have a 30" high-res monitor you don't need 8xAA to play
 
Eh? The larger the screen the greater the need for AA, since the image is clearer.

The large-screen argument is bogus, as follows:

In terms of resolution, no 30-incher is going to be much better in terms of pixels per inch than my 20.1 in 1600x1200:

20.1 in diagonal (4:3):
1600 pixels over 16.08 in width gives pixel width 0.01005 in
1200 pixels over 12.06 in height gives pixel height 0.01005 in

30 in diagonal (16:10):
25.44 in width, so for the same pixel width the horizontal res would need to be at least 2531
15.9 in height, so for the same pixel height the vertical res would need to be at least 1582

Taking the Apple 30 incher as an example, the res is 2560x1600 - this is barely better in terms of pixel size than my screen, which means any section the same size as my screen will look exactly the same as my screen does in itself, and any aliasing will be almost exactly as noticeable. I use 16xAA when possible on this screen, 8x when not feasible, and 4x as an absolute minimum, and would use exactly the same on such a screen.
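The arithmetic above can be checked with a short script; a minimal sketch, assuming panel width follows from the diagonal and the aspect ratio:

```python
import math

def pixel_pitch_in(diag_in, aspect_w, aspect_h, res_w):
    """Horizontal pixel pitch in inches for a panel of the given
    diagonal, aspect ratio, and horizontal resolution."""
    diag_units = math.hypot(aspect_w, aspect_h)  # diagonal in aspect units
    width_in = diag_in * aspect_w / diag_units   # physical panel width
    return width_in / res_w

# 20.1" 4:3 panel at 1600x1200
print(round(pixel_pitch_in(20.1, 4, 3, 1600), 5))   # 0.01005
# 30" 16:10 panel at 2560x1600 (e.g. the Apple 30-incher)
print(round(pixel_pitch_in(30, 16, 10, 2560), 5))   # 0.00994
```

The two pitches differ by about 1%, which matches the claim that aliasing on the 30" panel would be almost exactly as noticeable.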
 
For LCDs, yeah, that's true to some degree; for CRTs, not really: the increase in res can offset the amount of AA needed (smaller pixels)
 
from www.chiphell.com



http://translate.google.com/translate?u=http%3A%2F%2Fwww.chiphell.com%2Fviewthread.php%3Ftid%3D3407%26extra%3Dpage%253D1%26page%3D10&langpair=zh%7Cen&hl=en&newwindow=1&ie=UTF-8&oe=UTF-8&prev=%2Flanguage_tools
 
nice scores.

A CPU at 1600MHz gets 2110 for CPU marks? A stock E6600 at 2.40GHz gets roughly the same. Odd.

What drivers are being used, and is the card overclocked at all?
 
Who plays what game with 8xAA, and with what card? Since when has 8xAA been playable? Doesn't that bring most cards to a crawl?
 
all default. :p
8.37 beta driver, I think

cool, thanks. If this truly is the XT and not the XTX, I am feeling more confident about this card than I was last week. My CPU score with my overclocked 4800+ is 2165, so I should be able to expect similar '06 scores if that is the case. I just hope game performance translates the same. Fear MUST run A LOT better than on my X1900XTX for me to buy it.
 

porn bold?
 
really hoping the Radeon 2900 series is better, or ATi is going to be in trouble!
 
Well, it's not much of an incentive to upgrade, is it! They'd best have some working CF drivers at launch, otherwise there's not much point for me
 
So... double the speed isn't much of an incentive to upgrade...
What did you want, Jesus on PCB?
Regardless, 3dPenisMark means jack + shit.

Let's see what they do in real games, after the NDA lifts, reviewed by reputable sites.

Honestly, I don't need anything faster than my 1900xtx or 1950pro for anything on the machines I've got.

New games may change that.
 