ATI Radeon HD 3870 X2 CrossFireX @ [H]

FrgMstr
Just Plain Mean · Staff member
Joined May 18, 1997 · Messages 55,727
ATI Radeon HD 3870 X2 CrossFireX - The power of 4 GPUs is now possible with AMD’s ATI CrossFireX technology. We evaluate two Radeon HD 3870 X2 cards in CrossFireX on an Intel Core 2 Quad system, powering Crysis, COD 4, and UT3, and compare the gameplay experience produced to that of a single Radeon HD 3870 X2 and a Radeon HD 3870.

Quad CrossFireX is appealing from a technical hardware enthusiast standpoint. However, it doesn’t exactly produce the gameplay experience advantages needed to live up to the price tag required. The one game that really needs it doesn’t utilize it very efficiently yet, and the other games that do already deliver a great gaming experience on just one Radeon HD 3870 X2.
 
Damn, that Crysis game... I think there is something wacko with their code...

Nice review. A lot of graphical power.
 
Being a triple display user, I might just jump onto the AMD platform now just for the multiple display support while still having Multiple GPU available.

Maybe this will light a fire under NVIDIA's butt to do something about that?
 
Very nice review, I especially appreciated the discussion on the Crysis AI issue etc. Anyway, is this the new testbed for games?
 
I can't wait for them to release this. My second X2 has been sitting on my desk for a month now, lol.

And I can definitely attest to the Crysis performance fluctuation. I finally got to the last level 2 nights ago, and the game is definitely unplayable. From what I noticed, when the alien exosuit is standing still on the deck of the ship, I can get a good 25-35 fps at 1920x1200 [everything high]. The minute you fire or engage it and it takes a step, the fps drops to about 15... it MUST be something in the game code. In fact, you don't even have to be looking at the exosuit to get the performance drop. You can go to the edge of the ship and look out at the water and get good frames, but if the thing takes a step, it drops.
 
I can't wait for them to release this. My second X2 has been sitting on my desk for a month now, lol.

And I can definitely attest to the Crysis performance fluctuation. I finally got to the last level 2 nights ago, and the game is definitely unplayable. From what I noticed, when the alien exosuit is standing still on the deck of the ship, I can get a good 25-35 fps at 1920x1200 [everything high]. The minute you fire or engage it and it takes a step, the fps drops to about 15... it MUST be something in the game code. In fact, you don't even have to be looking at the exosuit to get the performance drop. You can go to the edge of the ship and look out at the water and get good frames, but if the thing takes a step, it drops.


If it's game code, why is it NVIDIA cards don't suffer the same kind of performance drop? And don't give me the TWIMTBP BS line either. I've got the answer for ya: ATI cards can't handle the action that goes on in games. Hopefully the R7xx line will fix this issue, as the entire R6xx line has been a disaster.
 
I wonder how much the benchmarks will go up after some good driver updates...


Great article, guys.
 
Nice write-up.
I was surprised that you were surprised this "worked" the way it was supposed to.
I guess that was in reference to the older CrossFire setups.
It seems that for now, SLI and Triple-SLI are still the way to go. I have been able to run Crysis at 19x12 with all Very High settings on Triple-SLI since it was released. CrossFire may just need some time for the drivers to catch up, though.

I am hoping this technology will generate renewed competition and continued improvements in multi-GPU platforms.
 
If it's game code, why is it NVIDIA cards don't suffer the same kind of performance drop? And don't give me the TWIMTBP BS line either. I've got the answer for ya: ATI cards can't handle the action that goes on in games. Hopefully the R7xx line will fix this issue, as the entire R6xx line has been a disaster.

I don't pretend to know how a game is coded, so maybe you are right. But shouldn't game AI be independent of the GPU? I would think that's all done by the cpu. Either way, not trying to hijack this thread, just saying I experienced the same thing.
 
Excellent review as always and I like that you guys are including more apples to apples testing as well.

The only thing that was missing was putting a 3870 X2 and a normal 3870 in the Crossfire X and testing them.
 
I built a new rig 3 weeks ago (Intel C2D Wolfdale @ 4.0GHz, 4GB Dominator, etc.) and a spanking new 3870 X2, and after reading this review I just may go out and pick another one up.

I really think ATI is on the right track and putting out products that are worthy of enthusiast dollars again. I know it's not needed, but it would be nice to hear from the [H] writing staff that the 3870 X2 and CrossFire X really are a good step forward for a company that has had a lot of lows lately.

I love my 3870 X2 and I cannot wait to match it up with another one.
 
Being a triple display user, I might just jump onto the AMD platform now just for the multiple display support while still having Multiple GPU available.

Maybe this will light a fire under NVIDIA's butt to do something about that?

I ditched SLI and jumped on the 3870 X2 just because of this. It may seem trivial to toggle SLI on/off, but when you have to do it twice every single time you want to play a game, it becomes very frustrating. Then there are the times when ForceWare decides to change my desktop resolution during the toggles as well. I run 30" and 20" displays with all of my programs in windows, and this causes everything to be resized and repositioned.
 
Brent, this is the best video card review I have ever read. Well done.

As far as CrossFireX is concerned, uhmmm, if it was just a bit cheaper then I would ditch my GTX and get it. The performance is obviously there, and the fact that AMD is the king of multimon support makes me want to go AMD.

I wonder how these cards do with HiDef movies, and how it scales it to a 30" (2560x1600) monitor. Anyone out there with the gear to test this?
 
There are two ways to win a war....brute force in one big battle or slowly winning smaller fights. It seems that nVidia has chosen the first while ATI has chosen the second. The question is...who made the right choice?

Question for the crew @ [H], if Crysis would have scaled as well as it did in CoD4....would your conclusion been completely different?
 
The only major issue we have with two Radeon HD 3870 X2s in CrossFireX is the fact that each GPU only has access to 512MB of RAM. This just simply is not enough for the kind of pixel pushing power four GPUs are capable of. 512MB of RAM is limiting to this setup. We feel that a “2 GB card” i.e. 1 GB of RAM per GPU might allow higher AA settings in COD 4 and UT3.

The only issue here is that two "2GB" cards would essentially make a 32-bit OS (like most installs of XP or Vista) unusable, due to the 4GB memory limit. Even one 2GB card would cap your usable system RAM at less than 2GB, beginning the "robbing Peter to pay Paul" situation we're coming to with expanding GPU RAM.

(For those unfamiliar with how this works, Dan's Data has an excellent writeup here.)
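To make the address-space math above concrete, here is a minimal sketch. The reservation figures are illustrative assumptions (real numbers vary by chipset, BIOS, and other mapped devices); the point is that memory-mapped GPU RAM comes out of the same 4GB of 32-bit address space as system RAM.

```python
# Rough sketch (hypothetical numbers): how memory-mapped GPU RAM eats into
# the 4 GB address space of a 32-bit OS. Exact reservations vary by chipset
# and BIOS; these figures are illustrative only.

ADDRESS_SPACE_GB = 4.0   # 32-bit address limit
OTHER_MMIO_GB = 0.5      # assumed: other PCI devices, BIOS, APIC, etc.

def usable_system_ram(installed_ram_gb: float, gpu_ram_gb: float) -> float:
    """Usable system RAM after GPU apertures and other MMIO claim space."""
    reserved = gpu_ram_gb + OTHER_MMIO_GB
    return max(0.0, min(installed_ram_gb, ADDRESS_SPACE_GB - reserved))

# One hypothetical "2 GB card" with 4 GB installed:
print(usable_system_ram(4.0, 2.0))   # 1.5 -- under 2 GB, as the post notes
# Two such cards mapped at once:
print(usable_system_ram(4.0, 4.0))   # 0.0 -- essentially unusable
```

This is why the post argues that ever-larger GPU memory effectively forces a move to 64-bit operating systems.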
 
Excellent review as always and I like that you guys are including more apples to apples testing as well.

The only thing that was missing was putting a 3870 X2 and a normal 3870 in the Crossfire X and testing them.

yeah, it'll be helpful and interesting to see a triple card evaluation too
otherwise, very interesting review! all the multimon ppl are probably drooling now :rolleyes:
 
I wonder which setup is superior: 3870x2 with 16 lanes available (single slot), or 9600GT SLI with 8x8 lanes available. I just might make the switch back to Ati. I tire of SLI woes.
 
There are two ways to win a war....brute force in one big battle or slowly winning smaller fights. It seems that nVidia has chosen the first while ATI has chosen the second. The question is...who made the right choice?

Don't you mean the other way around? Brute force is what ATI is doing by trying to beat NVIDIA's single-GPU solutions with a dual-GPU solution.
It's been shown countless times now that G80 and its derivatives are much more efficient than R600 and its derivatives, so brute force is definitely not on NVIDIA's side...

Good review, btw! That "bug" with the water in CoD4 is indeed very weird. It would seem that with everything on, the X2 CrossFireX setup would manage to be very playable (considering its above-70-fps average), but the "bug" or whatever is happening really kills performance.
 
Maybe for Crysis you should have run it in DX9 mode, since you weren't even using any settings on Very High. But I like the addition of playable options at lower resolutions, since some of us can't run the super high resolutions in some of the reviews. Good review!
 
How did it play in any other game? WiC? Lotro? Witcher? Hellgate? Portal? HL:2?

Though the information presented is interesting in and of itself, I'd have liked to see some more games go through the wringer as well.

I know the evaluations take a lot of time, but it'd be nice to see some results from at least a couple of games which are not FPS.

Perhaps all other games simply bow to ATI's incredible CrossFireX might and aren't useful as proving grounds? :p
 
Fantastic article.
Very in depth with all the discussion of "other" playable settings.
I liked seeing the upgrade to the QX9650.
You really covered the bases with your discussion of individual settings and so on.
Not to sound overdramatic here, but I think this is a perfect example of how your "real world" testing method should be.

The Crysis result makes me wonder if Crysis really is the unrefined, unoptimized bug fest some people are claiming it is. I was left with one question: I wonder how DX9 mode would impact the Crysis results? Didn't I hear some discussion from the ATI/AMD team about how, for now, DX9 titles will scale better with multi GPUs than DX10 will?

Ah. http://www.pcper.com/article.php?aid=523
PCPER: Why does DX10 have more trouble working with 4 GPUs than DX9 does? I thought there was a DX9 limitation to frames rendered ahead originally?


AMD: The limitations in DX9 actually make it easier for an application to be AFR friendly. The biggest issue is DX10 has a lot more opportunities for persistent resources (resources rendered or updated in one frame and then read in subsequent frames). In DX9 we only had to handle texture render targets, which we have a good handle on in the DX10 driver. In addition to texture render targets, DX10 allows an application to render to IBs and VBs using stream out from the GS or as a traditional render target. An application can also update any resource with a copy blt operation, but in DX9 copy blt operations were restricted to offscreen plain surfaces and render targets. This additional flexibility makes it harder to maximize performance without impacting quality.



Another area that creates issues is constant buffers, which is new for DX10. Some applications update dynamic constant buffers every frame while other apps update them less frequently. So again we have to find the right balance that generally works for quality without impacting performance.



We are also seeing new software bottlenecks in DX10 that we continue to work through. These software bottlenecks are sometimes caused by interactions with the OS and the Vista driver model that did not exist for DX9, most likely due to the limited feature set. Software bottlenecks impact our multi-GPU performance more than single GPU and can be a contributing factor to limited scaling.
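AMD's point about persistent resources can be illustrated with a toy model (this is not real driver code, just a sketch of the scheduling problem). Under alternate-frame rendering (AFR), each frame is assigned round-robin to one of the four GPUs; a resource written in one frame and read in the next lives on a *different* GPU, forcing a cross-GPU copy that serializes the frames and kills scaling.

```python
# Toy model of the AFR persistent-resource problem AMD describes:
# with 4 GPUs, frame N is rendered on GPU (N % 4). A resource written
# in frame N but read in a later frame may sit on a different GPU,
# so the driver must copy it across before that frame can proceed.

NUM_GPUS = 4

def gpu_for_frame(frame: int) -> int:
    """Round-robin AFR assignment of frames to GPUs."""
    return frame % NUM_GPUS

def needs_cross_gpu_copy(write_frame: int, read_frame: int) -> bool:
    """True if a persistent resource forces an inter-GPU transfer."""
    return gpu_for_frame(write_frame) != gpu_for_frame(read_frame)

# A render target reused on the very next frame always crosses GPUs:
print(needs_cross_gpu_copy(0, 1))   # True
# Reused exactly NUM_GPUS frames later, it stays on the same GPU:
print(needs_cross_gpu_copy(0, 4))   # False
```

DX9's narrower set of persistent resources (texture render targets) keeps this copy traffic predictable; DX10's stream-out targets, blt-updated resources, and constant buffers multiply the cases the driver must detect, which fits the limited 4-GPU scaling seen in the review.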
 
Very nice review, I especially appreciated the discussion on the Crysis AI issue etc. Anyway, is this the new testbed for games?

Yes

Excellent review as always and I like that you guys are including more apples to apples testing as well.

The only thing that was missing was putting a 3870 X2 and a normal 3870 in the Crossfire X and testing them.

Born purely out of reader feedback.

Consider that we may be doing further CrossFireX testing in the future, so other comparisons may be made.

I built a new rig 3 weeks ago (Intel C2D Wolfdale @ 4.0GHz, 4GB Dominator, etc.) and a spanking new 3870 X2, and after reading this review I just may go out and pick another one up.

I really think ATI is on the right track and putting out products that are worthy of enthusiast dollars again. I know it's not needed, but it would be nice to hear from the [H] writing staff that the 3870 X2 and CrossFire X really are a good step forward for a company that has had a lot of lows lately.

I love my 3870 X2 and I cannot wait to match it up with another one.

I think CrossFireX is a VERY positive step forward for AMD right now. They were caught with their pants down when SLI was launched, but they have since not only caught up but surpassed SLI in many ways. I love the multi-display support; this is something I have been wanting to see on SLI for YEARS now.

Brent, this is the best video card review I have ever read. Well done.

Thanks! You are quite welcome.
 
Though the information presented is interesting in and of itself, I'd have liked to see some more games go through the wringer as well.

I know the evaluations take a lot of time, but it'd be nice to see some results from at least a couple of games which are not FPS.

Perhaps all other games simply bow to ATI's incredible CrossFireX might and aren't useful as proving grounds? :p
Most of the games mentioned would be highly playable on a single high end GPU today at high settings, so personally I was very satisfied by only the COD4, UT3 and Crysis results, since those are the "premier" titles right now.
However, it would be interesting to see how those other engines scale with multi GPUs. I'm sure somebody will explore that.
 
Most of the games mentioned would be highly playable on a single high end GPU today at high settings, so personally I was very satisfied by only the COD4, UT3 and Crysis results, since those are the "premier" titles right now.
However, it would be interesting to see how those engines scale with multi GPUs. I'm sure somebody will explore that.

for sure, that's what i'm interested in.

wic on a single gpu at high settings/res/aa is really pushing the single high end card, from what i've seen. it gets much better with the addition of a second... i'd love to see what happens when 4 are thrown at it.
 
This was an astounding review, guys! Truth be told, I think you should be linking back to this article more than your "Benchmarking the Benchmarks" article! This article, especially the Crysis page, and also showing the water bug in CoD4 so clearly and indisputably, showed how important it is to actually PLAY the games when writing a review/evaluation rather than just running a benchmark or running through the forest for 30 seconds with no AI. Great job, guys!
 
for sure, that's what i'm interested in.

wic on a single gpu at high settings/res/aa is really pushing the single high end card, from what i've seen. it gets much better with the addition of a second... i'd love to see what happens when 4 are thrown at it.
I think either Hothardware or Xbitlabs uses WiC in their standard test suite.
 
Great review guys!

Sadly I have a P35 board, which doesn't have the dual x16-bandwidth slots. T_T
 
As a concerned consumer, I would like to see the performance difference in this test between Windows XP and Vista. I know that you can access "Very High" settings in XP in Crysis. If you compared Crysis on XP to Vista on this rig, that would suffice as an initial baseline (but if you want to do more, that's awesome).

Consumers must know what kind of difference to expect between XP and Vista in such a configuration!
 
If its true that DX9 scales better I'd like to see some Crysis numbers for DX9 if you guys have em...
 
I'm looking forward to these new "X" drivers to see how my X2 + single 3870 handle things.

I only have 24" monitors but I'd like to see how things go with a full 1920x1200 resolution with AA.

Keep up the great work.

Always a pleasure,

10e
 
How did it play in any other game? WiC? Lotro? Witcher? Hellgate? Portal? HL:2?

None of those titles will really stress the gpu. There is no real point in comparing a pair of gpus that both get 100+ frames.

The Crysis result makes me wonder if Crysis really is the unrefined, unoptimized bug fest some people are claiming it is. I was left with one question: I wonder how DX9 mode would impact the Crysis results? Didn't I hear some discussion from the ATI/AMD team about how, for now, DX9 titles will scale better with multi GPUs than DX10 will?

Ah. http://www.pcper.com/article.php?aid=523

As stated before, if it were Crysis' fault that the fps stalls when there is action, then the NVIDIA cards would be having the same issue. It seems more like an AMD issue.

However, the lack of serious multi-GPU support would be Crysis' fault, though that has always been hit or miss between various games.

I did not notice any difference between DX9 and DX10. However, I did notice a 5 fps difference between 64-bit and 32-bit.
 
If its true that DX9 scales better I'd like to see some Crysis numbers for DX9 if you guys have em...

Here's a good question, though: you are going to spend $880 on graphics hardware; should you not expect to be able to run the latest games in all their DX10 graphics glory?
 
Appreciate the review.

Disappointed that even with 4 GPUs... Crysis is still a game you will not be able to fully enjoy at high res with all settings maxed for some time to come.

I don't plan on buying Crysis until it's 10 dollars a few years from now, when there should be hardware out that can actually play it with maxed-out settings.
 
Here's a good question, though: you are going to spend $880 on graphics hardware; should you not expect to be able to run the latest games in all their DX10 graphics glory?

You absolutely should, but if there is a bug or something that's hindering DX10 CrossFire performance but not DX9, I'd like to see what the possible DX10 numbers could be. Just a curiosity thing.
 
HardOCP, this is not a perfect review. I loved everything except that you drew the conclusion that this solution isn't good at all, while a GTX might be way too little graphics power before long.

Well, add up an SLI config at a similar price.

And for god's sake, don't judge a solution on 3 games. Only one of these titles performs well on these ATI cards, and in fact 2 of the 3 games got help from NVIDIA as well.

Bring in more games, compare the performance improvement from a 3870 to a 3870 X2, etc. Do it in more detail instead of making a short 6-page review, while you surely make a longer review for something not much more exciting, like 10 versions of the 8800 GT or, soon, any 9-series GeForce card, which I don't really care about. I loved the 9600 GT from Palit; it's something special (worth being long).

Bring some SLI vs. CrossFire.
 