Radeon HD 2900 XTX: Doomed from the Start

http://dailytech.com/article.aspx?newsid=7079&commentid=132926&threshhold=1&red=4432#comments

RE: Are you sure the XTX you have is the retail version?
By KristopherKubicki (blog) on 4/27/2007 12:35:32 AM, Rating: 6
I was curious about the lack of HDMI as well, though one partner told me on the XT the only HDMI capabilities would be via a dongle (so no sound obviously).

Someone "reached out and touched us" today and gave us 2 Radeon XT cards for use in Crossfire and a different driver (350MB ?!?!). Expect benchmarks this week.

Something is fishy about those drivers.
 
From that forum

Bum_JCRules

I really would love to comment on this stuff...

I understand that DT and Anand are separate but that is so childish. Derek was there and his cards got to his place of business before he returned home from Tunisia. That long board they have ... Not what Derek should have gotten in his delivery. That is all I will say before I go too far.

you guys going apeshit about these benchies are being silly.
 
Kombatant, an employee of AMD/ATI, has also said that the benchmarks are bunk and that the card Daily Tech tested was not the XTX, but rather an OEM XTX. He also said earlier on that the card will be well worth the wait, and hinted at some sort of surprise for ATI faithful. Not too sure what that is supposed to mean. ??
 
Sweet, looks like things are taking a turn for the better. Well, more specifically they were the same all along, but now the rumors are taking a turn for the better. :)

Loafer said:
Kombatant, an employee of AMD/ATI, has also said that the benchmarks are bunk and that the card Daily Tech tested was not the XTX, but rather an OEM XTX. He also said earlier on that the card will be well worth the wait, and hinted at some sort of surprise for ATI faithful. Not too sure what that is supposed to mean. ??

Hopefully all of the advance disappointment surrounding the 2900 XTX will turn out to be unfounded!
 
I bet they paid to can these benchmarks...

Make everyone disappointed and upset over the card, then deliver something that beats the current 8800's by 20% or so. A ploy to bolster everyone's opinion of ATI :p

The less you expect, the better it'll be.
 
Well, OEM cards are usually slower than the retail cards. Guess we'll see in a few days or weeks, depending on what ATI will do.
 
Hopefully this is just a marketing stunt, 'cause I don't remember ever before someone being as dramatic as saying DOOMED blah blah before a card is even released.
 
Kombatant, an employee of AMD/ATI, has also said that the benchmarks are bunk and that the card Daily Tech tested was not the XTX, but rather an OEM XTX. He also said earlier on that the card will be well worth the wait, and hinted at some sort of surprise for ATI faithful. Not too sure what that is supposed to mean. ??

Maybe the surprise is another Valve game tie-in. After paying so much for a top-of-the-line card it wouldn't hurt to throw in a free game.
 
Maybe the surprise is another Valve game tie-in. After paying so much for a top-of-the-line card it wouldn't hurt to throw in a free game.
HL2: Ep2?! :eek: :D

too bad you won't be able to buy one on May 2nd, would be a wonderful bday present ;).
 
Kombatant, an employee of AMD/ATI, has also said that the benchmarks are bunk and that the card Daily Tech tested was not the XTX, but rather an OEM XTX. He also said earlier on that the card will be well worth the wait, and hinted at some sort of surprise for ATI faithful. Not too sure what that is supposed to mean. ??
ATI faithful you say? Could they mean a HD AIW version?? That would make some sense considering the touted integrated HDMI sound.
 
Kombatant, an employee of AMD/ATI, has also said that the benchmarks are bunk and that the card Daily Tech tested was not the XTX, but rather an OEM XTX. He also said earlier on that the card will be well worth the wait, and hinted at some sort of surprise for ATI faithful. Not too sure what that is supposed to mean. ??
Surprise?
I'm with Apollo...maybe they lied about the AIW series getting cancelled? I've owned 3 of these cards and would love to see a 4th installed in my box.
Hmmm... AIW 2900 at XT speeds, a theater 650 chip, Dual Link DVI, HDMI out...I can only dream I guess.

Here is something else that Kombatant said over at Rage3D:
Kombatant said:
They christened the OEM version we've known and loved for quite a few months as an "XTX". That should tell you a lot about their credibility actually.
LOOK VERY CLOSELY AT THE CORE AND MEMORY CLOCK SPEEDS.
AMD ATI Radeon HD 2900 XT (745 MHz core, 800 MHz GDDR3)
AMD ATI Radeon HD 2900 XTX (750 MHz core, 1010 MHz GDDR4)

The core of the XTX is only 5, yes 5, MHz faster. Anyone else see something wrong with that?
What were the rumoured speeds again?
I've heard a lot of them around 800+ MHz core and 1.1 GHz (2.2 GHz effective) mem.
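To put numbers on it, here's a rough back-of-the-envelope sketch (Python). It assumes the rumored 512-bit bus on both boards and standard double-data-rate memory, and uses the leaked clocks above, so treat the output as speculation:

```python
# Rough comparison of the leaked XT vs. XTX clocks. Assumes the rumored
# 512-bit bus on both boards and double-data-rate memory (effective
# clock = 2x the base memory clock).
def mem_bw_gbps(bus_bits, mem_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer times effective rate."""
    return (bus_bits / 8) * (2 * mem_mhz) / 1000

xt_core, xtx_core = 745, 750
core_uplift = (xtx_core - xt_core) / xt_core * 100
print(f"core uplift: {core_uplift:.2f}%")               # ~0.67%
print(f"XT  bandwidth: {mem_bw_gbps(512, 800)} GB/s")   # 102.4
print(f"XTX bandwidth: {mem_bw_gbps(512, 1010)} GB/s")  # 129.28
```

If those clocks are real, basically all of the XTX's headroom is memory bandwidth (roughly 26% more), not core speed.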
~~~~~~~~~~~~~~~~~~~~~~~~
This is just a theory...

WHAT IF... The XTX DailyTech tested was actually the next FIREGL card? What if all these "OEM" cards floating around are the FireGL cards?

The FireGL Workstation cards are always clocked lower than their gaming counterparts. They are the exact same chip as their gaming counterparts, with different CMOS to optimize them for rendering (hence why you can flash a normal card to a FireGL). Since they are usually rendering for a much longer period than a Gaming card could ever hope for, their clocks are reduced to make sure they can stand the heat for a longer period, and a larger cooler and fan is added which also doubles as a brace for workstation cases.
Example:
Radeon X1900XT 512MB GDDR3
Core Clock: 625mhz
Mem Clock: 725mhz (1.45ghz effective)
FireGL V7300 512MB GDDR3
Core Clock: 600mhz
Mem Clock: 650mhz (1.3ghz effective)
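For what it's worth, the clock deltas in that example work out like this (quick Python sketch using only the spec numbers quoted above):

```python
# Percentage clock reduction from the gaming card to its FireGL twin,
# using the X1900XT / FireGL V7300 spec numbers quoted above.
def pct_drop(gaming_mhz, firegl_mhz):
    return (gaming_mhz - firegl_mhz) / gaming_mhz * 100

print(round(pct_drop(625, 600), 1))  # core:   4.0% lower on the V7300
print(round(pct_drop(725, 650), 1))  # memory: 10.3% lower
```

So a 4-10% downclock versus the gaming part is normal FireGL practice, which would fit the theory.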
Also, the "OEM" cards floating around are 12" Workstation Form Factor; most workstations I've seen using graphics cards that long were GRAPHICS workstations.

I also haven't seen too many OEM gaming computers (Dell XPS included) with the necessary hardware to hold onto those brackets that happen to be at the end of this 12" card, even though they could hold an 8800 GTX easily. ;)

I wonder why they didn't post a screenshot of Catalyst Control Center from the Retail Drivers they said they used? They could have easily pointed out the name of the card.
 
Just found a link to this over at Guru3D, from a post on Tomshardware forums:

"While I am required to follow the NDA, the stuff up on Daily Tech today is almost worthless. Yes Anandtech was present in Tunisia (signing Non-disclosure agreements like the Inquirer), why they are posting this stuff is beyond me because their numbers are off. They must be only using the XP drivers and OS because the numbers in CF vs the GTX are very much different. So until I can officially comment on the architecture and the performance.. hold all of this as useless until the rest of the world writes about it."


Source: http://forumz.tomshardware.com/hardware/HD2900-XTX-Benchftopic-234334-days0-orderasc-75.html

First thing I believe from this thread.
From what I have seen through the years, even The Inq has more credibility than DailyTech.
 
I bet they paid to can these benchmarks...

Make everyone disappointed and upset over the card, then deliver something that beats the current 8800's by 20% or so. A ploy to bolster everyone's opinion of ATI :p

The less you expect, the better it'll be.

But, from a marketing perspective and at this point in time, that doesn't make sense at all.
We've been discussing R600 rumors for the past 5-6 months and most of them, were not good at all.
And now, this close to the actual launch, they purposely "want" to show their card, as weak and helpless, against a 6 month old 8800 GTX ?

As it was discussed before, all this hype surrounding R600 just hurts it. Expectations are so high (being so late in the game) that it doesn't matter how well it performs, since people will expect more from it.
If anything, AMD/ATI should've come forth before and showed that all those "bad" rumors, were false. If you have a card that delivers what's expected, why wouldn't you want to show that it creams the competition ?

I still have my doubts about these benchmarks, since they contradict what others said. Basically, that the XT should be on par with the GTX. However, since they are all rumors and unconfirmed benchmarks, just like what that "supposed" AMD/ATI employee said, we have to take them with a grain of salt.
About the XTX, I strongly believe it will not be much faster than the XT. ATI has done this at least two times before, so I have no reason to believe it will be different this time around.
 
Looks to me like there are a lot of people here who are still in doubt regarding the power of the XTX...... :rolleyes:

Saying things like "Its to throw off Nvidia" or "It will be clocked 50Mhz higher in retail so it will beat the GTX" or "Maybe they will surprise us all when the REAL card is benched".

Hmmmm...

yeah right.

or maybe its just not the fastest card. End of story. No smoke without fire. If its not as fast as the GTX, then so be it. Go and buy a GTX instead. No need for the 'OMG it cannot be true. Lies damn lies I tell you.You wait and see. My mate who works at AMD says they tested the wrong card'
 
yeah right.

or maybe its just not the fastest card. End of story. No smoke without fire.

Or maybe everyone (myself included) just needs to wait until the NDA is lifted and some decent reviews are written.;)
But where's the fun in that?
 
I've seen posted on other forums that the huge OEM card DT has is an OEM XT, not the XTX.
 
I really doubt those XTX numbers are what we will end up seeing when it finally does ship. There's too much horsepower in the clock speed, memory, etc. to have such a low performance threshold. I cannot believe any engineer over at AMD/ATI is that stupid, to design something that powerful and yet barely outdo last gen. Just does not make any common sense..... But guess we will find out soon enough...



I would be willing to bet that DirectX 9 will be more relevant at least until 2009.

I would rather have a card that performs well in most games, instead of some games.

DirectX 10 will be mostly a marketing term and maybe an unsupported patch to a couple games for another year or so.

Look how long it took SM3.0 to really take hold. It came out with the NVIDIA 6 series, but no real games fully supported it until the 7 series.

It could be by the time R700\G90 cards are out that DX10 truly matters.


Except that is not what Epic and Crytek have been telling us, as both claim that their next game due out this year will make full use of DX10 cards. You also have other AAA titles like HL2: Ep2 and TF that supposedly will make use of some DX10 functionality. Now, to what extent is still unknown, but that's what they are saying.

Also, SM3.0 vs. DX10 is a huge difference. Almost everything you could do with SM3.0 you could do with SM2.0x. That is not the case this time with DX10. That, and the fact that Vista will force DX10 on us no matter what, means we will see a much quicker adoption of DX10 than we did SM3.0.
 
^^ speaketh the truth.


Look at the cards pictured. Look closely. Look at the connectors it has. Wondering a little?

This card, which is supposedly a "final 2900XTX board", is most likely some OEM 2900XT card (most likely considering the scores close to the XT), or some very early engineering sample 2900XTX. If you see what I see from the pictures and then compare it with the features that the final 2900XTX is supposed to have, then you know what I mean.

Can't believe that dailytech did not even question it....pretty bad journalism if you ask me.
 
^^ speaketh the truth.


Look at the cards pictured. Look closely. Look at the connectors it has. Wondering a little?

This card, which is supposedly a "final 2900XTX board", is most likely some OEM 2900XT card (most likely considering the scores close to the XT), or some very early engineering sample 2900XTX. If you see what I see from the pictures and then compare it with the features that the final 2900XTX is supposed to have, then you know what I mean.

Can't believe that dailytech did not even question it....pretty bad journalism if you ask me.

WTB hits!!! I'm sure they shot up a lot these last 2 weeks. I never visit that site, well, not until I saw these links ;)
 
^^ speaketh the truth.


Look at the cards pictured. Look closely. Look at the connectors it has. Wondering a little?

This card, which is supposedly a "final 2900XTX board", is most likely some OEM 2900XT card (most likely considering the scores close to the XT), or some very early engineering sample 2900XTX. If you see what I see from the pictures and then compare it with the features that the final 2900XTX is supposed to have, then you know what I mean.

Can't believe that dailytech did not even question it....pretty bad journalism if you ask me.

Hey Sherlock, try reading; they say it is an OEM XTX and not a final retail board.

It has the 1GB of GDDR4, so it is not an XT.

Here is what likely happened. There was meant to be a significant clock speed delta between the 1GB XTX and 512MB XT. Most likely 750MHz (or a bit higher, but still not enough to catch GTX) is the fastest they can run R600 reliably. It turns out this is only good enough to challenge the GTS. So the XT gets the 750 MHz speed and the XTX is back to the drawing board to await a respin with much higher clock speed or more execution units to try and catch GTX.
 
Hey Sherlock, try reading; they say it is an OEM XTX and not a final retail board.

It has the 1GB of GDDR4, so it is not an XT.

Here is what likely happened. There was meant to be a significant clock speed delta between the 1GB XTX and 512MB XT. Most likely 750MHz (or a bit higher, but still not enough to catch GTX) is the fastest they can run R600 reliably. It turns out this is only good enough to challenge the GTS. So the XT gets the 750 MHz speed and the XTX is back to the drawing board to await a respin with much higher clock speed or more execution units to try and catch GTX.


Hey Sherlock, try reading yourself a little. I never said "retail", I said "final board", which they're saying it is. It could be OEM, I don't doubt it. And what they say it is compared to what it really is are most likely two different things. That site is not very credible with this "review"... It just for sure is NOT the XTX anyone will buy in any store and therefore totally worthless as a comparison. The card that they "tested" (and I use this term very loosely, since I have a feeling that there really was not much testing involved, judging by the numbers and the "article") has neither the features that the XTX is supposed to have, nor the clock speeds... time to think for yourself and not to blindly believe what people write in some second-class "review"...
 
Argue while you still can, because in 5 more days we will definitely see more reliable reviews; until then, all of the benchmarks are meaningless to me.
 
Surprise?
I'm with Apollo...maybe they lied about the AIW series getting cancelled? I've owned 3 of these cards and would love to see a 4th installed in my box.
Hmmm... AIW 2900 at XT speeds, a theater 650 chip, Dual Link DVI, HDMI out...I can only dream I guess.
~~~~~~~~~~~~~~~~~~~~~~~~
This is just a theory...

WHAT IF... The XTX DailyTech tested was actually the next FIREGL card? What if all these "OEM" cards floating around are the FireGL cards?

Sounds like a plausible theory, personally at this point I'm just waiting to see the final benchmarks. As I think most people here are.

On another note, how are you getting four AIWs in the same box? Are two of them PCI, or some kind of PCIe AGP hybrid board? Sounds interesting, I'd be curious to see that running. I always kind of assumed that even if you could get them all working on the hardware level the driver would freak out. Pretty impressive though.
 
Sounds like a plausible theory, personally at this point I'm just waiting to see the final benchmarks. As I think most people here are.

On another note, how are you getting four AIWs in the same box? Are two of them PCI, or some kind of PCIe AGP hybrid board? Sounds interesting, I'd be curious to see that running. I always kind of assumed that even if you could get them all working on the hardware level the driver would freak out. Pretty impressive though.

I think he meant he's owned 3 AIW cards and will get the 2900 XT as the 4th one. Kinda like how I've owned a 9800 Pro AIW, X800 XT AIW, and perhaps the 2900 XT AIW if there is one :)
But I heard ATI is discontinuing the AIW series :(
 
I can buy an 8800 GTS that will overclock to around 600/1900 and performs at the same level as a stock 8800 GTX for around 350. Why would I buy the 2900 XT?

You wouldn't even be close.
You would have to get the GTS up to 2,160 MHz on the memory to equal the bandwidth of the GTX.
Then you would have to get the core up to around 770 MHz to get the same computational performance.
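That 2,160 MHz figure is straightforward to check (Python sketch, assuming the published bus widths: 384-bit on the GTX, 320-bit on the GTS, and the GTX's stock 900 MHz / 1800 MHz effective GDDR3):

```python
# Verify the 2,160 MHz claim: what effective memory clock would a
# 320-bit 8800 GTS need to match the 384-bit 8800 GTX's bandwidth?
def bandwidth_gbps(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bytes per clock times effective rate."""
    return (bus_bits / 8) * effective_mhz / 1000

gtx_bw = bandwidth_gbps(384, 1800)  # 86.4 GB/s on a stock GTX
needed = 384 * 1800 / 320           # effective MHz the narrower GTS bus needs
print(f"GTX bandwidth: {gtx_bw} GB/s")
print(f"GTS would need: {needed} MHz effective")  # 2160.0
```

So matching the GTX's bandwidth on the GTS's narrower bus really does take a 2,160 MHz effective memory clock, well beyond any realistic overclock.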
 
Hey Sherlock, try reading yourself a little. I never said "retail", I said "final board", which they're saying it is. It could be OEM, I don't doubt it. And what they say it is compared to what it really is are most likely two different things. That site is not very credible with this "review"... It just for sure is NOT the XTX anyone will buy in any store and therefore totally worthless as a comparison. The card that they "tested" (and I use this term very loosely, since I have a feeling that there really was not much testing involved, judging by the numbers and the "article") has neither the features that the XTX is supposed to have, nor the clock speeds... time to think for yourself and not to blindly believe what people write in some second-class "review"...

And why do you so blindly want to believe that the XTX will be better than that?

AFAIK the clock speeds for the XTX were never confirmed. Actually, nothing was ever really confirmed, except the 512-bit memory bus.
There were rumors about a 1 GHz clock, but that makes no sense for an 80 nm part. 750-800 MHz seems about right. And those 50 MHz more, if it's indeed 800 MHz, are not going to make a big difference in performance.
Other rumors mentioned poor yields, which, if confirmed, mean that you won't see very high clock/memory frequencies, which again supports no more than 750-800 MHz for an 80 nm part.

Also, the article clearly says that's NOT the card a consumer will buy, but rather the OEM version. Retail cards will probably ship with higher clock/memory frequencies. But that's also not going to mean much. If these benchmarks are true, there's some serious catching up to do for the XTX. In some cases it's over 40 fps, and it's not just higher clock and memory frequencies that are going to magically pull those numbers.
 
Even if the 8800 was stock, I'm betting it would still win.

Those aren't a few percentage points. That's a landslide! lol

:eek:
 
I believe the scores will improve with further driver optimization, but I don't know how much. This certainly doesn't look like a GTX killer so far. It's more like GTX fodder. :)

They will have to, or this card is dead unless it comes in at 30% less than the 8800 GTX. They're going to need one hell of a marketing team to sell a slower card for the same price.
 
Shame, shame, shame. I'm just glad I got an 8800 GTS 640MB. I didn't like the merger from the get-go.
 
It is a shame because Nvidia has no reason to drop their prices. Doesn't really affect me as I already overpaid on day one, bah, but it would have been nice to see another price war and some kick ass benchmarks. It seems we forgot how great the 8800s were when they released them. Those are some seriously powerful cards that can pretty much run anything you throw at them.

Unfortunately, those ATI frame rates are so far off the retail o/c 8800 GTX that it's not likely that any driver optimization is going to save them at this point. If they don't come into the market selling those cards at a hefty discount below the 8800s, their earnings in the next few quarters will get throttled. Time for a new marketing plan.

I am however feeling a bit of 'personal' satisfaction not having to ante up for the latest next gen of cards. hehe :p
 
Now all these benchmarks are spitting out frames of 100-150.
Now is it just me, or am I the only one who can't tell the difference as long as it's 50 FPS or more?
 
You wouldn't even be close.
You would have to get the GTS up to 2,160 MHz on the memory to equal the bandwidth of the GTX.
Then you would have to get the core up to around 770 MHz to get the same computational performance.

You're also wrong. IDK where the reviews are, but if you search you will see benches and see how wrong you are.
 
Now all these benchmarks are spitting out frames of 100-150.
Now is it just me, or am I the only one who can't tell the difference as long as it's 50 FPS or more?

LOL, it's not just you, guaranteed. ;)

It's debatable but the human eye can only discern a difference below 60-40 fps (most say 40).


OT,

I'm actually kinda glad some folks took the speculation a little heavy-handed and have come to the conclusion that these benchmarks (canned or not, rushed or not) may not be representative of the retail hardware or firmware. I would be quite worried and disappointed if this were the case. Some of the above logical conclusions have convinced me to withhold my judgment until the paper launch, rumored to be May 2nd.
 
You're also wrong. IDK where the reviews are, but if you search you will see benches and see how wrong you are.


Not to be an ass, as I don't doubt you (yet) but...

I make 2.5 million a year, I don't know where my check stubs or 1040's are, but if you look you'll know that. ;)
 
Now all these benchmarks are spitting out frames of 100-150.
Now is it just me, or am I the only one who can't tell the difference as long as it's 50 FPS or more?

They "spit out" 100-150 in today's games. It's good that they do; otherwise, in future games that will undoubtedly push the hardware even further, gaming would be a slideshow.
 
LOL, it's not just you, guaranteed. ;)

It's debatable but the human eye can only discern a difference below 60-40 fps (most say 40).


OT,

I'm actually kinda glad some folks took the speculation a little heavy-handed and have come to the conclusion that these benchmarks (canned or not, rushed or not) may not be representative of the retail hardware or firmware. I would be quite worried and disappointed if this were the case. Some of the above logical conclusions have convinced me to withhold my judgment until the paper launch, rumored to be May 2nd.

Yes.....but the human race is greedy. They always want more. More $s....more FPS. lol

These cards truly shine through if you're running insane resolutions like 2560x1600 or some obscure 3d graphics app. They can maintain fantastic frame rates at ultra high resolutions with settings maxed. A lone 8800 GTX can pretty much power anything 1920x1200 and below and in most cases, even 2560x1600, speaking from experience.

This news is not good for AMD though and based on the most recent headline where 'Intel Wipes Out AMD's 2006 Marketshare Gains in One Quarter', expect to see more of the same.
 
Shame, shame, shame. I'm just glad I got an 8800 GTS 640MB. I didn't like the merger from the get-go.
I never did either, but we were all hoping it was going to lead to a better market with more competition and superior products. As it stands right now, consumers could end up losing what little competition there was in the graphics and processor markets. I'm wondering more and more if the merger worsened the R600 and Barcelona delays. Compound that with an unforeseen glitch in the choice to use GDDR4 memory looking like it will cause yet another delay, and this year must rank as one of the worst in both AMD's and ATI's histories. Everyone, including users who purchased NVIDIA products, should hope to God these benchmarks are not representative of the final products, or we're all going to be spending a lot more for our future systems, and I'm not only talking about video cards.
 