ATI Radeon HD 2900 XT @ [H]

Motherboards do have an impact, but it is not generally significant. The best thing a motherboard can do for a video card is in the realm of higher overclocking. Brent and I have different motherboards for our testing rigs, and as such, we get slightly different figures when overclocking. Of course, the most important aspect of overclocking a video card is the video card itself. Some cards just won't do it, and some love it.

You can see the difference in the motherboards in the game benches in motherboard reviews. When the same processor is used, the difference is rarely more than 3% unless there is something wrong.

NVIDIA motherboards are supposed to give NVIDIA video cards a boost in performance with features like GPUEX and LinkBoost (which NVIDIA recently removed as a feature from the 680i SLI chipset). These optimizations are for their own video cards, and just because they exist doesn't mean that those boards sabotage the performance of ATI video cards.

Any PCIe 1.0a compliant motherboard will basically work about the same for all video cards within an acceptable margin of error. Due to ATI and NVIDIA driver lockouts, you can't compare apples to apples in an SLI/CrossFire test on the same board, so your platforms have to differ. On the same motherboard, regardless of chipset, the results should be about the same. In case anyone hasn't noticed yet, the so-called NVIDIA chipset optimizations for their video cards are basically marketing hype and pretty much worthless.
 
A comparison to the ATI X1950 XTX would have been nice, just to show where and how ATI/AMD improved over its previous generation. I guess we can still infer from the older 8800 vs. X1950 comparisons. I would have also liked to see at least one of the game comparisons repeated in Windows Vista. Maybe ATI kicks butt in Vista for some reason?

Personally, I keep away from the $$$ video cards and will be very interested to see how the AMD 2600 series, when it's released, compares to the NVIDIA 8600 series. Hopefully by that time both parties will have all their driver kinks worked out.
 
Kyle, what about the HDCP features of the card?

I've only seen Guru3D mention the HDCP stuff; supposedly the HD 2900 XT is also HDCP capable. Have you guys had any chance to test this? (Is there even a way to test this at the moment?)

We mentioned it in the evaluation; the HDCP keys are built into the ASIC on all R600 family GPUs.
 
Thank you to all involved for a great article. I have been waiting for this day for over a month, and now after a thorough read, my decision has been made. Thanks again for assisting with the upgrade process.
 
Some insight on what may have happened (from within the semiconductor industry but outside the company). It looks to me like the transistor targeting, in terms of doping, is way too fast. [H] sees this in the great overclockability, but also in the very high current draw. What happens is if you dope the transistors too heavily, you end up with very fast transistors, but they draw a lot of current. It appears the initial wafers out are targeted way too fast and hence draw a ton of current (but can be clocked super high; it appears 25%+ overclocks are pretty easy).

This should be fixed later to get the proper balance, but you have to remember that your first inline snapshot of where the transistors are running comes a couple of weeks after the implant steps. So you either build up the initial 2-3 weeks of chips or scrap it all, and then hope your inline correction works out, or you build them all into cards and wait for the 'improved' chips to come out and make a new card, which will allow both higher stock clocks and lower current draw. But at a 25% overclock (if you have the power to run it), the card should easily compete with the GTX.

If ATI gets the devices targeted properly, they could achieve the 20% higher clocks with the same or lower current draw than currently, which would make the card competitive right away. It's a balance that takes a while to get right, so a month or two from now we could see properly targeted transistors that make competitive cards. In the meantime, ATI would be foolish to just throw out all of the chips in line, so they initially don't look great, but we should see good performance later. I could be completely wrong, but the high current draw and insane clock potential scream improper transistor targeting.
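
To put rough numbers on the speed-versus-current tradeoff described above, here is a first-order sketch using generic textbook models (the alpha-power drive-current law and an exponential subthreshold leakage term) with made-up parameter values, not anything from AMD's actual process: lowering the threshold-voltage target buys a modest gain in drive current but costs an order of magnitude in leakage.

Code:
import math

# Illustrative first-order device models; all constants are hypothetical.
def drive_current(vdd, vt, k=1.0, alpha=1.3):
    # Alpha-power law: Ion ~ k * (Vdd - Vt)^alpha  (arbitrary units)
    return k * (vdd - vt) ** alpha

def leakage_current(vt, i0=1.0, n=1.5, vtherm=0.026):
    # Subthreshold leakage: Ioff ~ I0 * exp(-Vt / (n * kT/q))  (arbitrary units)
    return i0 * math.exp(-vt / (n * vtherm))

vdd = 1.1
for vt in (0.40, 0.35, 0.30):  # hypothetical threshold-voltage targets
    print(f"Vt={vt:.2f} V  Ion={drive_current(vdd, vt):.2f}  "
          f"Ioff={leakage_current(vt):.1e}")

Dropping Vt from 0.40 V to 0.30 V in this toy model raises drive current by only about 20%, while leakage climbs by more than 10x; that kind of imbalance is exactly what would show up as easy overclocking headroom paired with a huge power draw.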
 
They should have waited. Seriously, what would two more months have mattered? Now we have a bunch of useless cards sitting around that no one will buy.
 
@bobzdar

Would these different chips be identifiable by a different batch number?
 
[H]'s review is different from AnandTech's. AnandTech has the r600 performing on-par with the GTS 640.
Both are pretty good sources usually, not sure why they differ so much.
 
It is interesting that you claim to know what GPUs and when those GPUs will be released when I don't even know.

Nah, just looking at past history and the rumors around the web about when the R650 will come out (which they are saying is the end of summer or early fall), it would stand to reason that the NVIDIA answer will come around the same time. Especially with the first DX10 games (Crysis, Conan, Call of whatever, Lost Planet) coming out at that time, NVIDIA will also want a new card to sell for those games and for the holiday buying season.

The new FSAA modes may help on edges at high AA settings (which we noted) but they do blur everything else, so you have to decide if this is a tradeoff you are willing to make. I surely am not.

Seems like the review I linked to stated many positives: shimmering is gone, alpha FSAA is done as a side effect, and the reviewer claims it looks better than any NVIDIA mode available. Now of course this is subjective, but it sounds like there is a lot more going on than you talked about in your review.
 
[R600's] shimmering is gone
From what I can tell, the observed filtering capabilities are exactly the same with R600 as they were with R580, so I don't understand how this could work.
 
Nah, just looking at past history and the rumors around the web about when the R650 will come out (which they are saying is the end of summer or early fall), it would stand to reason that the NVIDIA answer will come around the same time. Especially with the first DX10 games (Crysis, Conan, Call of whatever, Lost Planet) coming out at that time, NVIDIA will also want a new card to sell for those games and for the holiday buying season.

I guess this is where we differ, you go on rumors and hearsay. I go on facts.

Seems like the review I linked to stated many positives: shimmering is gone, alpha FSAA is done as a side effect, and the reviewer claims it looks better than any NVIDIA mode available. Now of course this is subjective, but it sounds like there is a lot more going on than you talked about in your review.

Did you look at our AA screenshot comparison page?

From what I can tell, the observed filtering capabilities are exactly the same with R600 as they were with R580, so I don't understand how this could work.

Because it blurs textures. If you blur the textures, the "shimmering" appears to be reduced because everything is blurrier. Of course, now you have blurry textures. Maybe some people prefer blurry textures; I surely don't.
 
The Anandtech review shows the 2900 XT and 8800 GTS trading blows in certain games.

So it is safe to say they are about equal.
 
The outcome is not exactly unexpected. Any time you have a major merger, especially with significant cultural differences, you're going to have some internal friction. Internal friction can often generate adverse outcomes in the marketplace.

Barcelona is delayed, and the HD2900XT is lacking in performance and efficiency. Makes sense when you think about it, really.
 
I guess this is where we differ, you go on rumors and hearsay. I go on facts.

Well, I guess since there are no new ATI or NVIDIA parts announced after the GeForce 8800 Ultra and the Radeon HD 2900 XT, then ATI and NVIDIA will never make another high-end part again, because anything else is just rumors and hearsay. And of course, if there were ever to be a new part from either company, they wouldn't want to release it around the Christmas shopping season, because frankly there is no money to be made there.

Did you look at our AA screenshot comparison page?

I don't play screenshots.

Once again, from The Tech Report review:

This CFAA mode with 8 samples produces extremely clean edges and does an excellent job of resolving very fine geometry, like the tips of the spires on the cathedral. Even 16X CSAA can't match it. Also, have a look at the tree leaves in these shots. They use alpha transparency, and I don't have transparency AA enabled, so you see some jagged edges on the GeForce 8800. The wide tent filter's subtle blending takes care of these edges, even without transparency AA.

You may not be convinced yet, and I don't blame you. CFAA's tent filters may not be for everyone. I would encourage you to try them, though, before writing them off. There is ample theoretical backing for the effectiveness of tent filters, and as with any AA method, much of their effectiveness must be seen in full motion in order to be properly appreciated. I prefer the 4X MSAA + wide tent filter to anything Nvidia offers, in spite of myself. I've found that it looks great on the 30" wide-screen LCD attached to my GPU test rig. The reduction in high-frequency pixel noise is a good thing on a sharp LCD display; it adds a certain solidity to objects that just... works. Oblivion has never looked better on the PC than it does on the Radeon HD 2900 XT.
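
For what it's worth, the tent filters described above are conceptually just a wider resolve pass: instead of averaging only a pixel's own MSAA samples (a box filter), the resolve also pulls in samples from neighboring pixels with weights that fall off linearly with distance. A rough sketch of the idea, using a hypothetical samples[y][x] buffer of per-pixel grayscale sample lists (my own illustration, not ATI's actual resolve hardware):

Code:
def box_resolve(samples, x, y):
    # Standard MSAA resolve: average only this pixel's own samples.
    own = samples[y][x]
    return sum(own) / len(own)

def wide_tent_resolve(samples, x, y):
    # Tent-filter resolve: blend in neighboring pixels' samples as well,
    # weighted by a tent kernel that falls off linearly with distance.
    h, w = len(samples), len(samples[0])
    total, weight_sum = 0.0, 0.0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                weight = (2 - abs(dx)) * (2 - abs(dy))
                for sample in samples[ny][nx]:
                    total += weight * sample
                    weight_sum += weight
    return total / weight_sum

The same neighbor blending that cleans up alpha-test edges is also what softens texture detail, since every resolved pixel now includes a share of its neighbors' colors; that is the tradeoff both sides of this argument are describing.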
 
The Anandtech review shows the 2900 XT and 8800 GTS trading blows in certain games.

So it is safe to say they are about equal.

Possibly the 2900XT and GTS are about equal in other benchmarks across the web, but I am still confused why the numbers from the [H] review are so low. I mean, the [H] review numbers are lower than any that I have seen. I am not saying, in any way, shape or form, that [H] was bribed by NV and forced to show the R600 in a bad light. I am just saying that maybe the card was used by other sites first and somehow parts of it had degraded? Maybe it was one of the models with the supposed 256-bit bus instead of the 512-bit? Maybe I was just hoping for the R600 to be a great card (considering the impressive specs); I sure didn't want the card to be a flop. A flop is bad for ATI and bad for consumers, since the prices on competitor cards won't be lowered.

Nonetheless, I still liked reading [H]'s review, and if I get a reply acknowledging that the card was in mint condition during testing, then the numbers are most likely 100% credible. This in turn leads to me crying in a corner because the 8800GTS 640 isn't going to cost ~$250 anytime soon.
 
[H]'s review is different from AnandTech's. AnandTech has the r600 performing on-par with the GTS 640.
Both are pretty good sources usually, not sure why they differ so much.

I just read that review and I only saw that happen in some of the tests. There were plenty of tests where the HD 2900XT failed to match the performance of the 8800GTS 640MB.

Let's cover all the games, shall we? (Victories will be counted between the 8800GTS and the HD 2900XT.)

Oblivion Test 1-ATI
Oblivion Test 2 -NVIDIA
Prey Test 1 -ATI
Prey Test 2 -ATI
Rainbow Six Vegas Test 1 -ATI
STALKER -NVIDIA
Supreme Commander -NVIDIA (8800GTS 320MB)

ATI=4 NVIDIA=3

SLI/Crossfire Tests:

Battlefield 2 Test 1 -NVIDIA
Battlefield 2 Test 2 -NVIDIA
Oblivion Test 1 -ATI
Oblivion Test 2 -NVIDIA
Prey Test 1 -ATI
Prey Test 2 -ATI
Rainbow Six Vegas -NVIDIA
STALKER -NVIDIA

ATI=3 NVIDIA=5

Overall winner -NVIDIA

Both GPUs scored the same number of wins according to Anandtech.

This was taken from the Anandtech Review:

"Maybe that's a lot to digest, but the bottom line is that R600 is not perfect nor is it a failure. The HD 2900 XT competes well with the 640MB 8800 GTS, though the 8800 GTS 320MB does have a price/performance advantage over both in all but the highest resolutions and AA settings under most current games. "

Now this conclusion baffles me. Even if it did compete well against the 8800GTS 640MB (which it does less than half the time), it does so by using a LOT more power, generating more heat, requiring an 8-pin power connector for overclocking, and having a noisier fan. Even *IF* the card does improve by 10-15% in the next few months thanks to driver improvements, there is no changing the disadvantages the 2900XT has in terms of hardware when compared to the 8800GTS 640MB.

Factor all this in and I have to question the validity of some of the nicer reviews on the web. In my mind it's clear, even if the performance were the same, the card is still a loser for ATI when the other factors are considered. Anandtech either didn't factor in those other things or they are just being nice to ATI. I wouldn't quite call the card a failure because it performs ok and the price isn't too unreasonable, but I wouldn't call it a success either as it was late to the party and doesn't seem to match the competition on a number of levels.

Anyone who buys this card is either a hardcore AMD fanboy or hasn't thought their purchase out or done the necessary research before buying.
 
I'm not attacking Brent, it's the other way around.

See this post

http://www.hardforum.com/showpost.php?p=1031055413&postcount=555

Quote:
Originally Posted by gljvd
Some reviewers like the new FSAA modes. One even states it's better than NVIDIA's.

The new FSAA modes may help on edges at high AA settings (which we noted) but they do blur everything else, so you have to decide if this is a tradeoff you are willing to make. I surely am not.

Clearly the reviewer thought that the blurring was minor (and even in the screenshots it's minor; imagine when it's actually moving!)

Yet he continues to say it has blur.

So he is willing to sacrifice higher-quality FSAA over what every other site I've seen claims is minor blurring?
 
I think they need to take another look at their numbers, as SupCom apparently has better FPS when using 8xAA than 4xAA @ 2560x1600

HAHA, I guess that's what you get with 512-bit....

They probably loaded up the game and played it for about 2 minutes each... hardly going to get a good benchmark, or ANYTHING, from that...
 
Possibly the 2900XT and GTS are about equal in other benchmarks across the web, but I am still confused why the numbers from the [H] review are so low. I mean, the [H] review numbers are lower than any that I have seen. I am not saying, in any way, shape or form, that [H] was bribed by NV and forced to show the R600 in a bad light. I am just saying that maybe the card was used by other sites first and somehow parts of it had degraded? Maybe it was one of the models with the supposed 256-bit bus instead of the 512-bit? Maybe I was just hoping for the R600 to be a great card (considering the impressive specs); I sure didn't want the card to be a flop. A flop is bad for ATI and bad for consumers, since the prices on competitor cards won't be lowered.

Nonetheless, I still liked reading [H]'s review, and if I get a reply acknowledging that the card was in mint condition during testing, then the numbers are most likely 100% credible. This in turn leads to me crying in a corner because the 8800GTS 640 isn't going to cost ~$250 anytime soon.


HardOCP tests what they feel is playable; of course, they don't show us anything but what they feel is playable, which is where all this comes apart. We have to trust that X benchmark is only playable with X features on. Of course, it may be that you simply lose 2 fps and it's neglected to be mentioned.

There are some games where other sites (Anand, Tech Report, and others) got higher scores at higher FSAA settings and resolutions.

The numbers are definitely off here at HardOCP, because other sites show the R600 trading blows with the GTS 640; in one or two benchmarks across the web it's reaching into GTX performance, in some it's only at GTS 320 performance, and yet in others, which are obviously bugs in the driver, it's below the X1950 XTX.
 
Clearly the reviewer thought that the blurring was minor
Clearly Brent feels otherwise. Judging from what screenshots I've seen, I'd say I agree with him.

Yet he continues to say it has blur
That's because it does. One usually labels something as being true when it is, in fact, true.

every other site I've seen claims [tent filtering introduces] minor blurring
Every other site?

The numbers are definitely off here at HardOCP, because other sites show the R600 trading blows with the GTS 640; in one or two benchmarks across the web it's reaching into GTX performance, in some it's only at GTS 320 performance, and yet in others, which are obviously bugs in the driver, it's below the X1950 XTX.
HardOCP doesn't test in the manner that other sites do, if you haven't noticed. This means that their results will differ; an obvious given, a fundamental cause-and-effect relationship. The numbers aren't "off"; they are just different from the results other reviewers have observed.
 
This reminds me of when the 9700 came out.

That card was such a success not because of its DX9 performance (which was all speculative at the time) but because it was far and away the fastest thing for the then currently available DX8 games.

I don't care if the HD2900XT turns out faster in the games we'll be playing two years from now than the 8800GTS. What matters to me is which card is better for playing the games I have on my shelf, because those are the ones I'll be playing on it.
 
Well that's a let down. I know a lot of people (including myself) were expecting ATI to make some big leaps with this one. Oh well.. better luck next gen, ATI. (assuming this debacle affords you a next-gen release :rolleyes: )


Guess I'm glad I bought my 8800GTS though.
 
Well, I guess since there are no new ATI or NVIDIA parts announced after the GeForce 8800 Ultra and the Radeon HD 2900 XT, then ATI and NVIDIA will never make another high-end part again, because anything else is just rumors and hearsay. And of course, if there were ever to be a new part from either company, they wouldn't want to release it around the Christmas shopping season, because frankly there is no money to be made there.



I don't play screenshots.

Once again, from The Tech Report review:

I just looked at their review, and apparently you do indeed play screenshots, as that's all their AA review is. Not as in-depth as [H]'s either, just tiny comments. And it appears you play 3DMark06 as well... how is that, by the way? I haven't had a chance to beat it yet :rolleyes:
 
Factor all this in and I have to question the validity of some of the nicer reviews on the web. In my mind it's clear, even if the performance were the same, the card is still a loser for ATI when the other factors are considered. Anandtech either didn't factor in those other things or they are just being nice to ATI. I wouldn't quite call the card a failure because it performs ok and the price isn't too unreasonable, but I wouldn't call it a success either as it was late to the party and doesn't seem to match the competition on a number of levels.

Anyone who buys this card is either a hardcore AMD fanboy or hasn't thought their purchase out or done the necessary research before buying.

As you pointed out, it ties the GTX at Anandtech in terms of benchmarks (except Battlefield, but that's an obvious driver bug).

The ATI card comes with the Valve Black Box and starts at $400 (yes, some places are charging more, but that is common when a card first releases and the initial demand hits). With the Black Box we are looking at most likely a $30 value, so you're already down to $370-ish, which is right there with the GTS 640MB prices.

Then factor in HDMI audio for those who use their PCs for HTPC and other things.

A Folding client that will soon come out.

Physics capabilities.

The tessellator (sp?).

Avivo.

You don't have to be an AMD fanboy to pick it up. Also, this is the Radeon HD 2900 XT with release drivers vs. six months of more mature drivers on the GeForce 8800 GTS. Going forward, things can change in ATI's favor (or NVIDIA's), so with any driver release the Radeon could end up winning more benchmarks than it's losing.
 
Man, I feel bad for you guys at [H]; you try so hard to make decent reviews of what people will ACTUALLY experience, and get flamed almost every time for it. At least you know the majority of your readers enjoy them and will put them to good use. I bought my 8800 GTS 320 because of the review you guys did... happy I saved so much money over the 640. By the time 320MB is way too little, I'm sure I'll be able to upgrade.

Anyways, I just hope you guys keep it up and are able to take all the flak from those who don't agree with you (which you already handle really well). Keep the awesome reviews coming! +1 happy customer.
 
As you pointed out, it ties the GTX at Anandtech in terms of benchmarks (except Battlefield, but that's an obvious driver bug).

The ATI card comes with the Valve Black Box and starts at $400 (yes, some places are charging more, but that is common when a card first releases and the initial demand hits). With the Black Box we are looking at most likely a $30 value, so you're already down to $370-ish, which is right there with the GTS 640MB prices.

Then factor in HDMI audio for those who use their PCs for HTPC and other things.

A Folding client that will soon come out.

Physics capabilities.

The tessellator (sp?).

Avivo.

You don't have to be an AMD fanboy to pick it up. Also, this is the Radeon HD 2900 XT with release drivers vs. six months of more mature drivers on the GeForce 8800 GTS. Going forward, things can change in ATI's favor (or NVIDIA's), so with any driver release the Radeon could end up winning more benchmarks than it's losing.
Your advantages are mostly meaningless. The tessellator must be programmed into a game to be used.

Anyone who puts this card in an HTPC is asking for overheating, not to mention you'd need a damn big HTPC case for it...

Avivo... I've seen nothing special about it... got any links?

And I think they're working on a G80 Folding client; I'm not sure, but one of the head honchos for the FAH GPU client was here earlier.

Kyle linked $330 GTSes anyway, so figure you're losing 70 bucks until EP2 and everything else comes out...
 
The ATI card comes with the Valve Black Box and starts at $400 (yes, some places are charging more, but that is common when a card first releases and the initial demand hits). With the Black Box we are looking at most likely a $30 value, so you're already down to $370-ish, which is right there with the GTS 640MB prices.
You can get the 8800 640MB card for $329 w/MIR. Adding goodies doesn't make up for the lack of performance.
 
I'm not attacking Brent, it's the other way around.

See this post

http://www.hardforum.com/showpost.php?p=1031055413&postcount=555



Clearly the reviewer thought that the blurring was minor (and even in the screenshots it's minor; imagine when it's actually moving!)

Yet he continues to say it has blur.

So he is willing to sacrifice higher-quality FSAA over what every other site I've seen claims is minor blurring?

The blurriness may be purely subjective, but is running at 1/3 the speed when using these modes subjective too? Oh, sign me up for that!
 
Clearly Brent feels otherwise. Judging from what screenshots I've seen, I'd say I agree with him.

I'm sorry, can you take pictures of yourself holding the ATI Radeon HD 2900 XT that you used to view the FSAA modes?


That's because it does. One usually labels something as being true when it is, in fact, true.
Blurring something and blurring everything else on the screen, as claimed, are two different things.


Every other site?

Go and read the reviews. The reviewers are saying that in motion it looks great.



HardOCP doesn't test in the manner that other sites do, if you haven't noticed. This means that their results will differ; an obvious given, a fundamental cause-and-effect relationship. The numbers aren't "off"; they are just different from the results other reviewers have observed.


Yet they don't paint a true picture of what's going on. They are in direct contrast with many other sites on the net. Why? Because unlike other sites, they don't run the gambit of resolutions and settings.
 
As you pointed out, it ties the GTX at Anandtech in terms of benchmarks (except Battlefield, but that's an obvious driver bug).

The ATI card comes with the Valve Black Box and starts at $400 (yes, some places are charging more, but that is common when a card first releases and the initial demand hits). With the Black Box we are looking at most likely a $30 value, so you're already down to $370-ish, which is right there with the GTS 640MB prices.

Then factor in HDMI audio for those who use their PCs for HTPC and other things.

A Folding client that will soon come out.

Physics capabilities.

The tessellator (sp?).

Avivo.

You don't have to be an AMD fanboy to pick it up. Also, this is the Radeon HD 2900 XT with release drivers vs. six months of more mature drivers on the GeForce 8800 GTS. Going forward, things can change in ATI's favor (or NVIDIA's), so with any driver release the Radeon could end up winning more benchmarks than it's losing.

For one, you can't count bundled software against the price... that's all personal preference. Should we take the BFG 8800GTS and subtract $30 because it comes with a BFG t-shirt, and another $10 for the awesome stickers? Plus MIRs, and you're almost at 8800GTS 320MB prices :eek:

I don't think you have many agreeing with you, so I'll just leave it at that....
 
Possibly the 2900XT and GTS are about equal in other benchmarks across the web, but I am still confused why the numbers from the [H] review are so low. I mean, the [H] review numbers are lower than any that I have seen. I am not saying, in any way, shape or form, that [H] was bribed by NV and forced to show the R600 in a bad light. I am just saying that maybe the card was used by other sites first and somehow parts of it had degraded? Maybe it was one of the models with the supposed 256-bit bus instead of the 512-bit? Maybe I was just hoping for the R600 to be a great card (considering the impressive specs); I sure didn't want the card to be a flop. A flop is bad for ATI and bad for consumers, since the prices on competitor cards won't be lowered.

Nonetheless, I still liked reading [H]'s review, and if I get a reply acknowledging that the card was in mint condition during testing, then the numbers are most likely 100% credible. This in turn leads to me crying in a corner because the 8800GTS 640 isn't going to cost ~$250 anytime soon.

Our testing methodology seems to differ in a lot of areas, especially where video cards are concerned. This is true of motherboard reviews as well. We often get flak because our conclusions aren't the same as those of some other sites around the web. People are always crying "your hardware must be broken". I still get e-mail comments on the fact that the original Striker Extreme review I did had such poor overclocking results. I took an additional two weeks to revisit the issue and the results still stand. With a new BIOS and some slightly different test equipment the results were slightly better the second time around, but the board wasn't broken and it was still an overclocking dud. Testing with a different board also showed better overclocking results, but given all the factors we still basically slammed the board and never did recommend it vs. any other 680i board on the market.

Believe it or not, it's bad for business to write bad reviews of products on the web. HardOCP makes money off of advertising. Think about it: who wants to advertise on a site that slams your product? The answer is no one. I know Kyle has had more than one angry conversation with companies who get pissed off about the content of a review posted on the site. I respect the fact that Kyle sticks to his guns and posts the truth. This has led some companies to stop shipping samples to the [H] and to ignore us altogether. Engineers of some of these products have contacted Kyle about reviews and worked with him to try and resolve any errors that they saw in our testing and testing methods. Additionally, sometimes we'll contact a manufacturer to try and find a reason why things aren't working as well as they are supposed to, and they simply don't care to respond. The data is what it is, and there is nothing that will change it given the test methods that are currently the standard here at the [H]. You have to look at the testing methods, weigh the information you see here vs. the information on another site, and draw your conclusions from that. I'd always suggest reading 5 or 6 reviews about a given product and drawing your conclusions from those. But you seriously have to gauge the testing methods and look for the differences. I always advise looking at what you will be doing with the product in question and seeing which reviews address your concerns the most closely.

The point of this post? We stand by our results, but if someone disagrees and calls us out and it's warranted, the data will get examined again and the product re-tested if necessary. Sometimes the article will be revisited at another time when driver or software changes happen. The data is there, and there is no secret to how we achieved the results in a given review of any product. It's up to you to decide which reviews are more credible.

We have a job to do and it doesn't always make us popular.
 
In looking at the picture below, the 2900 XT's Core Clock is listed as 743MHz, whereas the higher-performing 8800 GTS/GTX have lower Core Clocks at 500MHz/575MHz. Is it odd that the GPUs with the lower Core Clock speeds dominated so much; could this be potential evidence of possible driver issues? The SP Clock speeds for the 8800 GTS/GTX are 1.2GHz/1.35GHz -- does this have something to do with the 8800 GTS/GTX outperforming the 2900 XT despite lower Core Clock speeds? Why isn't there an SP Clock speed listed for the 2900 XT? What is the "SP Clock" anyway, and why weren't the Core Clock speeds a bigger factor in performance between these cards?

[Attached image: clock speed comparison chart]



Sorry for the newbish questions! If someone could help clear this up for me, that would be awesome!!!

BTW... is there gonna be an XTX or not... I'm not sure I'm clear on this. If so, will it be coming out soon, or is there gonna be another ridiculous wait we'll have to endure? Frankly, in addition to being really disappointed with ATI at this point, I'm VERY sick of waiting. Unless there is a DAMN GOOD and VERY compelling reason to wait longer (either for XTX or for the XT to improve), I'm gonna get me a GTX!! SOON!!! I'm betting Nvidia is gonna be selling them like crazy over the next month because of all the peeps (like myself) who have waited for the ATI card before deciding -- the decision seems pretty clear now..
 
I just read that review and I only saw that happen in some of the tests. There were plenty of tests where the HD 2900XT failed to match the performance of the 8800GTS 640MB.

Let's cover all the games, shall we? (Victories will be counted between the 8800GTS and the HD 2900XT.)

Oblivion Test 1-ATI
Oblivion Test 2 -NVIDIA
Prey Test 1 -ATI
Prey Test 2 -ATI
Rainbow Six Vegas Test 1 -ATI
STALKER -NVIDIA
Supreme Commander -NVIDIA (8800GTS 320MB)

ATI=4 NVIDIA=3

SLI/Crossfire Tests:

Battlefield 2 Test 1 -NVIDIA
Battlefield 2 Test 2 -NVIDIA
Oblivion Test 1 -ATI
Oblivion Test 2 -NVIDIA
Prey Test 1 -ATI
Prey Test 2 -ATI
Rainbow Six Vegas -NVIDIA
STALKER -NVIDIA

ATI=3 NVIDIA=5

Overall winner -NVIDIA

Both GPUs scored the same number of wins according to Anandtech.

This was taken from the Anandtech Review:

"Maybe that's a lot to digest, but the bottom line is that R600 is not perfect nor is it a failure. The HD 2900 XT competes well with the 640MB 8800 GTS, though the 8800 GTS 320MB does have a price/performance advantage over both in all but the highest resolutions and AA settings under most current games. "

Now this conclusion baffles me. Even if it did compete well against the 8800GTS 640MB (which it does less than half the time), it does so by using a LOT more power, generating more heat, requiring an 8-pin power connector for overclocking, and having a noisier fan. Even *IF* the card does improve by 10-15% in the next few months thanks to driver improvements, there is no changing the disadvantages the 2900XT has in terms of hardware when compared to the 8800GTS 640MB.

Factor all this in and I have to question the validity of some of the nicer reviews on the web. In my mind it's clear, even if the performance were the same, the card is still a loser for ATI when the other factors are considered. Anandtech either didn't factor in those other things or they are just being nice to ATI. I wouldn't quite call the card a failure because it performs ok and the price isn't too unreasonable, but I wouldn't call it a success either as it was late to the party and doesn't seem to match the competition on a number of levels.

Anyone who buys this card is either a hardcore AMD fanboy or hasn't thought their purchase out or done the necessary research before buying.

This is annoying to me as well. The conclusions on many of these other reviews try to have it both ways--they note the same problems that [H] did, then turn right around and talk about "good value," "good architecture," and "good potential." It's like the reviewers just can't get the AMD marketing powerpoint slides out of their heads no matter how hard they try, and they end up talking out of both sides of their ass.

Meanwhile their split-personality conclusions give plenty of fuel to the trolls and zealots who can't accept reality. Of course there's another category--people on someone's payroll.
 
Well, I guess since there are no new ATI or NVIDIA parts announced after the GeForce 8800 Ultra and the Radeon HD 2900 XT, then ATI and NVIDIA will never make another high-end part again, because anything else is just rumors and hearsay. And of course, if there were ever to be a new part from either company, they wouldn't want to release it around the Christmas shopping season, because frankly there is no money to be made there.

I don't play screenshots.

Once again, from The Tech Report review:

Thank you for your opinions
 
I'm sorry, can you take pictures of yourself holding the ATI Radeon HD 2900 XT that you used to view the FSAA modes?
That's why I said "judging from the screenshots I've seen". I don't understand why you believe this to be ambiguous, or why you seem to have some difficulty understanding that statement. I suggest slowing down and thoroughly reading replies before responding.

Blurring something and blurring everything else on the screen, as claimed, are two different things.
One of the side-effects of Tent filtering seems to be blurring of all angled surfaces. Ergo, the entire scene, save for certain surfaces, will be "blurred".

Go and read the reviews. The reviewers are saying that in motion it looks great.
Great. That's a fair opinion to have. Brent does not share that opinion, which does not make his stance "wrong", nor does it invalidate the evaluation.

Yet they don't paint a true picture of what's going on... Why? Because unlike other sites, they don't run the gambit of resolutions and settings
The correct word here would be "gamut".

What is a "true picture"? Is there such a thing in this context? [H] shows things in a particular way that is different than the manner in which other sites portray these products. Thusly, their conclusions will be different. I thought this was a basic concept for one to grasp.
 