GTX 480 and 460 SLI vs. 5870 CrossFireX Redux @ [H]

I am wondering the same..... I have two Matrix 2GB 5870s in my rig driving 3x22" 1680x1050 monitors... would one card provide the same or better performance? I gotta check that out later.

There is hope at the end of the rainbow though. It seems that the last few ATI driver releases are specifically focused on improving Crossfire. I've been playing around with hardware too long to expect any miracles, but a couple percent more FPS without any perceptible image quality loss is always welcome. :D
 
Eight (8) frames per second in BC2 is "OK"? You must work for AMD.
haha touche

I stand by my evaluation. Individual games here and there are bad, but then you'll come across a Mafia 2 that demonstrates what the hardware is capable of. On the whole, it ends up with an "OK" performance improvement when you add another card.

I remember buying Crysis specifically to see what my badass 3870 crossfire setup could do, only to find that Crossfire offered NO improvement. Then there was a hotfix. Then there was a new driver release. Then there was another driver release. Then I switched to nvidia.
 
thanks. at least that suggests it isn't a hardware problem if drivers can fix it
 
I have to question why Arma II ("notoriously cpu bound" right in the article) and Crysis were even in that? Crysis is a big pile of unoptimized mess.

Mafia II is a good choice though (and good game)
 
We wanted to try some different titles that were still topical in today's market. I am sorry we did not use titles you approve of.
 
Kyle- do you think there might be more to the story with Arma II since the 460's match the 480's? (since it seems like the 480's fall off almost as much as the 5870's)
 
I was wondering something along these lines. I'm wondering if you guys considered cranking the cpu up just a bit more to see where the bottleneck is.
 
I am wondering the same..... I have two Matrix 2GB 5870s in my rig driving 3x22" 1680x1050 monitors... would one card provide the same or better performance? I gotta check that out later.

Is it the same performance as a single card? Not with eye candy turned on across 3x22" 1680x1050 monitors. I'm sure it helps to have Crossfire for that. I think running a single card solution (including the 5970) is still the smartest way to go all around, but for those of us that want all options enabled and maxed with high fps, dual card solutions are still where it's at.
 
Yeah, on my rig (OC'd 5870) I get good experiences at 3x24", just not all the 'fruit'. I've been multi-monitor gaming for ages and the ATi, sorry, AMD solution is great. However, CFX at multi-monitor resolutions is so bad I sold my 5970 for a 5870, and I am never going back to CFX until AMD really improve their drivers.

But single card? Awesome :)

Agree with this. 6 months ago I tried crossfired 5870s and then a standard 5970, and gaming at 5760x1200 was poor, so I sold the extra cards. A couple of weeks ago I had a Sapphire 4GB 5970 (for a few days before it became faulty and I had to return it) and Eyefinity performance was a lot better. The problem I found is that you need certain drivers for certain games; for example, Bad Company 2 would only run decently on 10.5a, so you're in a bad position if you play Bad Company 2 alongside a newer game like StarCraft 2 where you would want a newer driver than 10.5a.

I've been driving 5760x1200 for nearly a year now on a single 5870 and for the games I've played (Bad Company 2, Mount & Blade: Warband, COD4, HAWX, Dirt 2, UT3, Far Cry 2) I have run them maxed out bar Bad Company 2 (mix of high and medium settings for online play).

If the 6870 provides a decent boost that will be my next upgrade.
 
Kyle- do you think there might be more to the story with Arma II since the 460's match the 480's? (since it seems like the 480's fall off almost as much as the 5870's)


Oh, I think there is surely something to the story. My guess is that it is CPU limited or so poorly optimized it just sucks as a game engine. This is why we do not use it anymore for regular reviews. We did, however, think it would be an interesting look for this article. But I am not going to spend a lot of money figuring things out with this game.
 
Have had SLI and found multi-card configs to be a bunch of crap; I spent more time trying to make it work right than on actual gameplay (or at least it felt like that). Am running a single 5870 now and am cool. I like reading about these things but doubt they will ever fix CFX/SLI issues.
 
Not trying to be an AMD fanboy, but it truly is saddening that AMD does not have better drivers to complement such good hardware. At least in this current generation, 5870s are much cooler and a helluva lot less power-hungry than the GTX 480. Multi-GPU scaling is the thing holding back the 5800 series, since they're usually neck and neck with a single GTX 480. Then CF....

A cool idea I just had would be three-way 5850 CF, as it would probably use less power than a GTX 480 SLI but still provide better performance. But with two-way scaling already where it is, fat chance of that happening. Nice article BTW, was very informative.
 
r.e. ARMA II, note that we saw separation at 5760x1200, so at that higher triple display resolution, it was certainly more gpu limited. I tried pushing it up as high as I could at 2560x1600, every setting was at its highest value except for Visibility, and turning that up higher only brought down the fps even more, which was already well under 30 fps, and made it impossible to get my run-through done. So, basically, something is bottlenecking the game at lower resolutions for sure. At 5760x1200 though, whatever was holding it back breaks free it seems, and well you see how that turned out.
 
Kyle, Brent, would it be possible for HardOCP to test out the latest drivers and application profiles with the previous-generation 4xxx ATI cards, to see if AMD really has been dropping the ball on Crossfire? I have this bad feeling that every new driver release has been making things worse for 4000 series owners, especially the Crossfire people.

If you look at the latest Steam hardware survey numbers, these older cards are still the most owned, so it's still very relevant to keep AMD on its toes so that the older card owners, who spent top dollar not so long ago, can still enjoy the latest games on high settings and older games on the highest settings. I know many people with 4850 CF and 4870 CF and we don't feel like upgrading just yet, and we sure feel neglected on the driver front.

Steam top 2:
ATI Radeon HD 4800 Series: 7.08%
NVIDIA GeForce 8800: 5.86%

Source: http://store.steampowered.com/hwsurvey
 
r.e. ARMA II, note that we saw separation at 5760x1200, so at that higher triple display resolution, it was certainly more gpu limited. I tried pushing it up as high as I could at 2560x1600, every setting was at its highest value except for Visibility, and turning that up higher only brought down the fps even more, which was already well under 30 fps, and made it impossible to get my run-through done. So, basically, something is bottlenecking the game at lower resolutions for sure. At 5760x1200 though, whatever was holding it back breaks free it seems, and well you see how that turned out.

Brent- still seems like something else odd is at work as even at the higher resolution, the 460 SLI matches the 480 SLI. The only difference is that the gap to the 5870 Xfire is a little bigger. From the chart I was looking at, it doesn't seem to be any less CPU limited than the first, except that crossfire seems to have a higher CPU overhead cost than SLI, and that the overhead cost of crossfire increases with resolution. If it was actually GPU limited, shouldn't the 480s roll away from the 460s rather than being slightly ahead, but really within the statistical variance of different runs of the test? If that test is really GPU limited, then the 480's have as much of a scaling problem in that particular game as the 5870 Crossfire. Am I looking at something wrong?
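For what it's worth, here is a rough way to picture that reasoning (a minimal sketch with made-up numbers, not anything from the article): if each frame costs roughly the longer of the CPU work and the GPU work, then once the CPU term dominates, a faster GPU barely moves the average fps, which is exactly the kind of clustering we see between 460 SLI and 480 SLI.

```cpp
#include <algorithm>
#include <cstdio>

// Toy bottleneck model: per-frame cost is roughly the slower of the
// CPU work and the GPU work (assuming little overlap). Every number
// below is invented for illustration, not a measured value.
static double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    const double cpu_ms    = 28.0;  // hypothetical CPU cost per frame
    const double gpu480_ms = 14.0;  // hypothetical 480 SLI GPU cost per frame
    const double gpu460_ms = 18.0;  // hypothetical 460 SLI GPU cost per frame

    // CPU-bound case: both setups collapse to the same ~36 fps.
    std::printf("single display: 480 SLI %.1f fps, 460 SLI %.1f fps\n",
                fps(cpu_ms, gpu480_ms), fps(cpu_ms, gpu460_ms));

    // Heavier GPU load (e.g. triple display): GPU cost grows, CPU cost
    // stays put, so the faster cards finally start to separate.
    std::printf("triple display: 480 SLI %.1f fps, 460 SLI %.1f fps\n",
                fps(cpu_ms, gpu480_ms * 2.5), fps(cpu_ms, gpu460_ms * 2.5));
    return 0;
}
```

And if Crossfire adds extra CPU overhead per frame on top of that, it would sink even further in the CPU-bound case, which is roughly what the charts look like.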
 
I'd like to see ATI implement an SLI-type bar graph to show just how well the games are scaling; it would be even more interesting with Eyefinity in the mix. Last I heard, though, reviewers have been asking for this for years and all ATI did was put that CrossFireX logo in to show that Crossfire is actually working. :(

And for a company that has a multi-GPU card at its top end, they really need to get on the ball with their performance in games. You'd think ATI would be leagues ahead of Nvidia because of this, but it seems they're lagging behind somewhat.
 
Borderlands should be thrown into the mix as well; it's another title that has slowly been dropping behind in performance since about 10.5. Considering there's a new DLC expansion due for this title soon, I'm pretty sure a lot of people will be picking it back up again.
 
Borderlands, Fallout 3/Oblivion and ARMA II were the most disappointing for my 4870 Crossfire when I recently added the second card for $100 :(
 
Could the first three of those be due to game engine issues? Borderlands, Fallout 3, and Oblivion were all done by either 2K Games or Bethesda Game Studios. (Well, ARMA II could be an engine issue as well, given the results from this article.) Maybe none of them really lend themselves well to AFR, or at least to AMD's implementation of it?
 
The Gamebryo engine stuff, maybe, but Borderlands is running on a newer build of UE3, and I've never heard of UE3 being problematic when it comes to multi-GPU scaling.
 
Game engine does have a lot to do with it; there is a lot game devs can do to optimize and make things work better with AFR. NV and AMD both have documentation, released at GDC and conferences like that, to help game devs code their games better for multi-GPU; there are specific things they can do in DX to help. So yes, that has a lot to do with it too. I've seen game patches that have improved multi-GPU performance.

Here is a little recent example I just found real quick; there's tons of stuff out there from AMD/ATI and NV - http://www.google.com/url?sa=t&sour...sg=AFQjCNFabis1Meg1FhwA09iP1pTqJFWoYg&cad=rja

AMD gives suggestions like what kinds of queries to make and such, to optimize performance in the game. NV also has technical docs for game devs to help make SLI work better in games.

I've seen some really technical stuff in years past, when SLI and CrossFire were newer, that told game devs exactly what code to use in DX to make it work better; they were very direct about what to do and what not to do to optimize performance.
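One commonly cited example of that kind of guidance (a rough sketch of my own, not taken from either vendor's docs) is to avoid stalling on query results in the same frame they're issued, since a synchronous readback forces the GPUs to serialize and breaks AFR scaling. In Direct3D 9 terms it looks roughly like this; the class and member names are just for illustration:

```cpp
#include <d3d9.h>

// Rough idea only: issue an occlusion query this frame, but never
// busy-wait on the result. Read it back a frame or two later and use
// the most recent answer available, so an AFR setup is never forced
// to sync its GPUs mid-frame.
class DeferredOcclusionQuery {
public:
    bool Init(IDirect3DDevice9* device) {
        return SUCCEEDED(device->CreateQuery(D3DQUERYTYPE_OCCLUSION, &m_query));
    }
    void Begin() { m_query->Issue(D3DISSUE_BEGIN); }
    void End()   { m_query->Issue(D3DISSUE_END); m_pending = true; }

    // Call once per frame. If the GPU isn't done yet we simply keep
    // last frame's visibility result instead of waiting.
    void Poll() {
        if (!m_pending) return;
        DWORD visiblePixels = 0;
        // Deliberately no D3DGETDATA_FLUSH and no spin loop here:
        // that stall is exactly what the vendor guidance warns about.
        if (m_query->GetData(&visiblePixels, sizeof(visiblePixels), 0) == S_OK) {
            m_lastVisiblePixels = visiblePixels;
            m_pending = false;
        }
    }
    DWORD LastResult() const { return m_lastVisiblePixels; }

private:
    IDirect3DQuery9* m_query = nullptr;
    DWORD m_lastVisiblePixels = ~0u;  // assume visible until told otherwise
    bool m_pending = false;
};
```

The same "don't make the CPU wait on the GPU mid-frame" advice applies to render-target locks and readbacks in general, which is presumably why patches that clean those up can improve multi-GPU scaling.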
 
I agree. Seeing almost identical numbers on high-end and mid-range cards makes me think there is something else going on. Driver optimization or something, not sure. But I do find this article slightly misleading. Not saying CrossFireX is all that, but it's obviously not what they are showing here.
 
With Fallout New Vegas coming out soon, keeping an eye on Fallout 3 and Oblivion performance could help with some people's buying decisions, and hopefully help AMD iron out any remaining bugs or add further optimizations for 4xxx and 5xxx cards on that specific game engine.
 
I agree. Seeing almost identical numbers on high-end and mid-range cards makes me think there is something else going on. Driver optimization or something, not sure. But I do find this article slightly misleading. Not saying CrossFireX is all that, but it's obviously not what they are showing here.

I don't really see how anything can be misleading when you show real-world results. If someone goes and picks up ARMA II to play, this is what they'll experience in the game. We are showing gamers how things currently compare in the game; it's as real and relevant as it gets.
 
Thanks for article [H]. It seems quite evident that AMD have some work ahead of them. Really makes me wonder how many of the recent games are affected other than the ones you all have tested.
 
I don't understand what the big issue is here with ATI's CFX. I'm looking at the numbers, and for a 12-month-old card to still stay close to GTX 480 SLI is not bad. All this about ATI dropping the ball is a bit over the top.
 
It's not the GTX 480 that's the problem. Much cheaper GTX 460s (the 1GB Galaxy version can be had for $179.99 with a $20 MIR from Gamestop), which don't have more impressive specs on paper, are matching or beating the performance of the 5870 CF setup (5870s are at least $320 each, making them almost $150 more expensive per card, and that's with a deal).

AMD has shown that its drivers haven't been on point recently (which is partly why they released the CrossFireX application profile download), and the benchmarks shown here are legit proof of this.

It's uncertain exactly what's going on, but from what the benchmarks and real-world play suggest, AMD's driver team needs to improve its driver set for Crossfire applications. I think Nvidia has really struck gold with the GTX 460: it's quiet, very overclock friendly, and uses less power (not a ton less, though) than its 480 counterpart. It's cheap as hell compared to its nearly 460-dollar big brother and really brings performance to the masses in a cheaper way than AMD can currently match. However, AMD has its 6000 series coming very soon, and that seems like it could provide solid competition to Nvidia.

The best part about all this is that we as gamers win: we get cheaper and faster products in the end.
 
Hi, I've been reading [H] for quite a while, just haven't posted yet. I thought I'd throw in my experience regarding Arma II, as I used to benchmark with that game.

First, the built in benchmark is extremely CPU-bound - especially at higher viewing distances. At 1920x1200 and 4xAA, anything faster than a 4870 was no use (CPU was a [email protected]). I spent a lot of time looking for a sequence where at least a HD5870 wasn't CPU bound (no chance with a HD5970).

Then, the game has some massive CF issues. The only way I was able to enable the second GPU under Windows 7 was to rename the .exe to crysis64.exe, and I had to force it to run in winxp mode (yes, under win7!). Crazy. Haven't had the opportunity to test SLI.

Oh, and by the way, this CF vs SLI series is just lovely - thanks [H]!
 
Nice review, but as someone who is involved in the ArmA2 community quite a bit, I'd have to say a couple things about it:

1. Really not a great game for GPU testing. You did mention how it was notoriously CPU-bound, which is true (also HDD I/O-bound) but I think it's more CPU-bound than you thought. Just looking at the results tells me that.

2. It's a game that's been fairly shown to be negatively-biased against ATi cards, especially in Crossfire.

Other than that, nice to see this. Surprising that ATi is dropping the ball on multi-screen configurations when they first adopted it.

EDIT: I just read the post right above mine and noticed that it says almost exactly the same thing. Hah! I guess that's just more proof. :cool:
 
Ati drivers..... *sigh*

Fool me once, shame on you. Fool me 1,013 times, shame on .. YOU!
 
Fascinating review.

I think you pumped the settings at 2560x1600 and 5760x1200 for Crysis and Arma 2 too high for the 1GB cards. I believe that rather than show scaling per se, you have instead shown that:

A) The GTX 460 handles an inadequately sized framebuffer (for the scenarios tested) better than the 5870 does,

B) There's something funky going on with Arma 2; a limiting factor that isn't the GPU. The settings are too high for 1GB per GPU, yes, but if that restriction were suddenly lifted there would still be a very tight grouping of results from three very different GPU setups.

I appreciate that 2GB 5870s are disgustingly expensive, that 2GB 460s are not even available in e-tail (or are they?), and that you don't have infinite funds or time to explore every tangent of thought.

Sorry this is brief and may seem a bit dismissive, I had written something much larger and more eloquent but lost it stupidly refreshing the page :/
 
We know ATI is saying, quote, "MY BAD" for all the bad driver releases, but hey, we've got 3 more months with these suckers, so let's have some more positive driver fixes :p
 
I wonder whether a CPU-bound game like ARMA 2 is performing worse with CFX because CFX has more CPU overhead than SLI.

Given that triple display resolutions show no gap between SLI 480 and SLI 460, it is pretty clear to me that at these settings the game is NOT GPU bound in any way, and the performance drop we are seeing simply reflects higher CPU usage at triple display resolutions.
 
ATI/AMD's drivers are of such poor quality that I have no doubt the main limiting factor is not the hardware.
 
480 SLI and 460 SLI seem so close together. Driver issue?

More like a game issue. It's unusual for sure, but that's how that game is.

Could it be that programmers have been doing too much console development and implementing FPS caps (even on PC exclusives) the same way that consoles have?

This is nothing new to PCs, but sometimes turning V-sync off may not always let the GPU render the max FPS possible. Then again, having to hack a config file (which I'm sure is commonplace for almost every user who posts here) can't possibly be good for standardizing results in a benchmark comparison; the results wouldn't reflect those of a retail-bought, stock-installed setup for a casual user.
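To make the cap idea concrete, a frame limiter is usually nothing more exotic than sleeping off the leftover frame budget, something like the generic sketch below (not how any particular game here actually does it). If a title ships with one of these locked to, say, 60, extra GPU horsepower just turns into longer sleeps instead of higher average fps.

```cpp
#include <chrono>
#include <thread>

// Generic frame cap: do the frame's work, then sleep away whatever is
// left of the budget. Faster hardware finishes its work sooner but
// still presents at the same capped rate.
void run_capped_loop(double cap_fps, int frames_to_run) {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / cap_fps));

    auto next_frame = clock::now() + budget;
    for (int i = 0; i < frames_to_run; ++i) {
        // update_and_render();  // the real game work would go here
        std::this_thread::sleep_until(next_frame);
        next_frame += budget;
    }
}
```

Disabling V-sync does nothing about a cap like this, which would line up with benchmarks clustering no matter how much GPU you throw at them.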
 
The 5870 has the same playing performance as the 480 except in the crappy ARMA game, where 480 SLI = 460 SLI.
What's with those whiners about the drivers? Do they just repeat the same BS like a parrot without using their brains?
 