Triple-GPU Scaling: AMD CrossFire Vs. Nvidia SLI

Wow... 1680x1050 3-way scaling lol. Anyone dropping that kind of cash on cards and still on a single 1680x1050 monitor has mixed-up priorities. 2560x1600 is valuable data though. Thanks for the link.
 
Kind of a "meh" article. The only res worth looking at was 2560x1600, and at that resolution the 570 was out of memory. It's silly to run Tri-Fire or Tri-SLI on anything less than 4MP. What's the point of looking at 1680x1050 or 1920x1080? It's just wasting money on a 3rd card at that res.
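For what it's worth, the 4MP cutoff is easy to check against the resolutions in question. A quick sketch (the 4MP threshold is just the rule of thumb above, nothing official):

```python
def megapixels(width, height):
    """Total pixels rendered per frame, in millions."""
    return width * height / 1_000_000

# Resolutions discussed in this thread.
for w, h in [(1680, 1050), (1920, 1080), (2560, 1600)]:
    print(f"{w}x{h}: {megapixels(w, h):.2f} MP")
```

Only the 30" panel (2560x1600 = 4.10 MP) clears the 4MP bar; 1920x1080 is barely half that.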
 
Stupid article. Like others have stated, 1680x1050 and 1920x1080 resolutions are worthless tests for three GPUs. The 570s were crippled in many tests by the lack of video memory. Anyone who buys a 30" would buy a card with more video memory on the nVidia side.

Although I do hope nVidia wakes up and realizes this stupidly small amount of VRAM on their reference cards is lame and fixes that with their next series.
 
If (when?) Nvidia or a decent board partner (c'mon EVGA!) introduces a 2.5GB 570, I'm in for SLI. If they think it'll cannibalize their sales of the Palit 560 Ti 2GB and the 580 3GB, I think they're out to lunch. I don't want Palit, I want something with an AAA warranty. I want more than the 560, and the 580 is a terrible value proposition regardless of how much VRAM it brings to the table. I won't buy either in the current market, never mind a market with a 570 sporting a decent amount of memory. Double the VRAM on the 570 and it's winner, winner Sheen dinner TIGER BLOOD.
 
Why the hell were they comparing the 6950 with the 570? Yes, I know it makes the AMD cards look even more awesome since it's a handicap, and you can unlock a lot of those cards, but why didn't they go for the 6970, which is priced about the same and performs about the same?
 
If (when?) Nvidia or a decent board partner (c'mon EVGA!) introduces a 2.5GB 570, I'm in for SLI. If they think it'll cannibalize their sales of the Palit 560 Ti 2GB and the 580 3GB, I think they're out to lunch. I don't want Palit, I want something with an AAA warranty. I want more than the 560, and the 580 is a terrible value proposition regardless of how much VRAM it brings to the table. I won't buy either in the current market, never mind a market with a 570 sporting a decent amount of memory. Double the VRAM on the 570 and it's winner, winner Sheen dinner TIGER BLOOD.

lol @ the Sheen reference
 
Nearly 250% scaling is pretty awesome, especially with how much cheaper the 6950s are.

P.S.: Kinda funny seeing people complaining about a review that exposes Nvidia's memory limitations, yet I didn't see any complaining when the Nvidia cards had the framebuffer advantage over AMD cards lol.
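For anyone wondering what "nearly 250% scaling" means in practice: it's just triple-card throughput as a percentage of single-card throughput. A minimal sketch with hypothetical FPS numbers (not figures from the article):

```python
def scaling_percent(single_gpu_fps, multi_gpu_fps):
    """Multi-GPU throughput as a percentage of a single card's throughput."""
    return 100 * multi_gpu_fps / single_gpu_fps

# Hypothetical: one card at 40 FPS, three cards at 99 FPS.
print(scaling_percent(40, 99))  # 247.5
```

At ~248%, each of the two extra cards is effectively adding ~74% of a card's worth of performance, which is very good for a 3-way setup.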
 
I just like the fact that the amd cards scaled so well in general.
 
Yeah, except for a few cases of no scaling at all, it really seems AMD has gotten their stuff together when it comes to CFX scaling, with 2x, 3x, and 4x GPUs alike. Very impressive.
 
2560x1600. Tri-6950 totally raping Tri-570.

Can't believe people are buying those 580s (vanilla 1.5GB) and 570s to play at 2560x1600 or higher resolutions. Not enough VRAM. I think we can clearly see it here.

So the card that costs the most $ is better at ''lower'' resolutions. LOL.
 
I'm thinking about going trifire (gaming on 1600p ZR30w). 6990 with one of my unlocked 6950's and selling the other one... Quick question though, do you guys think my XFX BE 850w psu can handle that? I just got it last month for my 6950's and it would be lame if it wasn't enough juice for trifire...
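Rough back-of-the-envelope for the PSU question, assuming ballpark board powers (~375W for a 6990, ~200W for a 6950, ~200W for the rest of the system; all of these are assumptions, so check your actual parts and overclocks):

```python
# Very rough PSU headroom estimate for 6990 + unlocked 6950 trifire.
# All wattages below are ballpark assumptions, not measured numbers.
gpu_watts = 375 + 200        # assumed 6990 board power + one 6950
rest_of_system = 200         # CPU, board, drives, fans - a generous guess
total = gpu_watts + rest_of_system

psu = 850                    # the XFX BE 850W in question
headroom = psu - total
print(total, headroom)       # ~775W estimated draw vs 850W rating
```

That margin is thin, especially with unlocked/overclocked cards, so it's worth measuring actual draw at the wall before committing.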
 
Why the hell were they comparing the 6950 with the 570? Yes, I know it makes the AMD cards look even more awesome since it's a handicap, and you can unlock a lot of those cards, but why didn't they go for the 6970, which is priced about the same and performs about the same?

cheapest cards that can go tri, actually.
 
2560x1600. Tri-6950 totally raping Tri-570.

Can't believe people are buying those 580s (vanilla 1.5GB) and 570s to play at 2560x1600 or higher resolutions. Not enough VRAM. I think we can clearly see it here.

So the card that costs the most $ is better at ''lower'' resolutions. LOL.

The 3GB 580s didn't come out for months after the stock cards, and I bought three at launch. I did a fair amount of scaling tests when I had 3 480s with JC2 and Crysis in 3D, however, and at least in 3D I got close to 100% scaling. I was running with no or low AA, as you can't go over 2xAA in 3D now anyway, and I think you're right that the VRAM is an issue there. I would have thought it would be an issue with 3D as well, but it doesn't appear to be the case from what I've seen.

At any rate, I had thought about going to 3GB cards, but they are hard to find, and since I'm already 4 months into these cards I'm just saving my money and waiting to do a total next-gen build with Ivy Bridge and Kepler.
 
Why the hell were they comparing the 6950 with the 570? Yes, I know it makes the AMD cards look even more awesome since it's a handicap, and you can unlock a lot of those cards, but why didn't they go for the 6970, which is priced about the same and performs about the same?

Like the above guy said, they wanted to show the cheapest route you could go from both sides as far as triple xfire/sli, which is the 6950 and GTX 570. Just proves you don't have to pay out of your ass to get an efficient triple card setup.
 
What I got from that article was how far CFX has come since the last few generations. Getting this kind of scaling is just incredible. What makes it even better is how efficient the ATI video cards are, as the electricity savings get compounded as you add more GPUs. Especially with the 6950s selling for $60-100 cheaper than the GTX 570s, this is an incredible bargain that makes the choice pretty clear for folks who actually need this kind of graphics firepower. This article almost makes me want to upgrade the 768p HDTV that my gaming rig is currently hooked up to. I have promised myself that I will not upgrade the HDTV until the next Xbox comes out, since that is what I do most of my gaming on.
 
Like the above guy said, they wanted to show the cheapest route you could go from both sides as far as triple xfire/sli, which is the 6950 and GTX 570. Just proves you don't have to pay out of your ass to get an efficient triple card setup.

That metric is completely ridiculous in light of the availability of a comparable card, though. Then again, for the most part so are single-screen triple card setups. :eek:
 
Kind of a "meh" article. The only res worth looking at was 2560x1600, and at that resolution the 570 was out of memory. It's silly to run Tri-Fire or Tri-SLI on anything less than 4MP. What's the point of looking at 1680x1050 or 1920x1080? It's just wasting money on a 3rd card at that res.

I think majority of gamers don't have 30'' 2560x1600 monitors...Gotta spread out the love for everyone:)

EDIT: Yeah, typo...
 
I dont think majority of gamers don't have 30'' 2560x1600 monitors...Gotta spread out the love for everyone:)

Double negative!



:p;)
 
I think majority of gamers don't have 30'' 2560x1600 monitors...Gotta spread out the love for everyone:)

EDIT: Yeah, typo...

The point was Tri-X is useless for 1 monitor. You would only use it for multi monitor setups and they only tested one res that was useful to look at.
 
I run xfire 6970's on one 1920*1200 display.

Well, I bought the cards to run Eyefinity, which is next for me; in the next month I will be getting two more LCDs.
 
The point was to finally put the nail in the coffin of all those Nvidia fanboys telling everyone that ''SLI scales better than Crossfire'' all over the internet multi-verse, like it's a fact.

And now we know it's the opposite. The AMD 6xxx series scales better than Nvidia Fermi. Fact.

Spread the word! Fight the green FUD! (just kidding).
 
The point was to finally put the nail in the coffin of all those Nvidia fanboys telling everyone that ''SLI scales better than Crossfire'' all over the internet multi-verse, like it's a fact.

And now we know it's the opposite. The AMD 6xxx series scales better than Nvidia Fermi. Fact.

Spread the word! Fight the green FUD! (just kidding).

I wouldn't make that proclamation just yet based off of one review. When the GTX 590 gets reviewed against the 6990 all over the net here in a week, I think the picture will be clearer.

Although I am expecting the GTX 590 to be just as loud and hot as the 6990.
 
And now we know it's the opposite. The AMD 6xxx series scales better than Nvidia Fermi. Fact.

Not really a fact. If this is a VRAM issue, which seems to be the case, then you have to look at cards with more than reference amounts of memory. It's not the Fermi GPU that's the issue but the VRAM.
 
Not really a fact. If this is a VRAM issue, which seems to be the case, then you have to look at cards with more than reference amounts of memory. It's not the Fermi GPU that's the issue but the VRAM.

Same thing. If there is a flaw/issue in Nvidia's design, and it's not scaling well, then it's not scaling well. That's all. So it's a fact.

They should have been less greedy and put more VRAM on their cards, just like AMD did. And the non-reference cards with 3GB are impossible to find/buy, made in Europe, and in crazy limited amounts. You can see a Palit 3GB on Newegg once every full moon, or even less often. So... do these non-reference cards really matter in the end? They made 20 for the whole planet. So: not enough VRAM, doesn't scale well. Fact.

The 590 will also be limited in VRAM, so it will have the same flaw/issue and won't scale well enough to beat the 6990. Palit or some other obscure manufacturer will probably make a non-reference 590 design with 3GB usable, and then put 5 on the market for the whole planet. Meh.
 
Well, I have these 2x 6970s sitting here; guess it's time to play with them. Anyone interested in some Black Ops GTX 580s?
 
Same thing. If there is a flaw/issue in Nvidia's design, and it's not scaling well, then it's not scaling well. That's all. So it's a fact.

You said "Fermi" doesn't scale well, which would imply that the GPU itself doesn't scale well. And there are 580 cards on the market, with more coming, that have 3GB VRAM, so you could say that reference cards under higher settings don't scale well, but you simply can't say Fermi. Not the same thing at all.
 
Ok. Thank you for the semantics 101 short course.

Here it is. The reference vanilla version of the Nvidia 570/580 is not scaling well, since Nvidia were too greedy and sold a $500 high-end card without enough VRAM. So if you want your Nvidia 570/580 to scale well, buy a non-reference 3GB from Palit in England or Europe, or from Newegg, if you are lucky enough to snatch one of the 3-4 cards they make every month. Oops, I almost forgot: Gainward was also talking about a 3GB card 3-4 months ago, but we've heard only crickets chirping since then. Probably coming *soon* in Europe.

Since Nvidia realized they totally fuck*ed up, and that AMD had a bright idea going with 2GB from day 1, Nvidia are panicking. They don't look good selling less VRAM for almost double the price, and all those reviews are starting to come out with the ''bad scaling compared to the AMD 6xxx series'', so they took the Bat-green-phone and called their good friends at eVGA. So those just announced they will, in a couple of months, also release a non-reference 3GB version, to save the day for their friends at Nvidia!

Is it better now?
 
You said "Fermi" doesn't scale well, which would imply that the GPU itself doesn't scale well. And there are 580 cards on the market, with more coming, that have 3GB VRAM, so you could say that reference cards under higher settings don't scale well, but you simply can't say Fermi. Not the same thing at all.

It is amusing that the nv boys keep whining about being treated unfairly because of the memory size, but fail to mention the 570 has a 320-bit bus width vs the 6950's 256-bit.

It was nvidia's design decision to use a non-standard, aggressive memory bus width to make up for the inefficient core design. It was their decision to make, and it was their decision that caused the memory limitation, so there is nothing wrong with saying that the design of Fermi limits its scalability. Just get over it.
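Side note on that bus-width point: width alone doesn't settle bandwidth, since the memory clocks differ between the two cards. A quick sketch using what I believe are the reference effective data rates (3.8 GT/s on the 570, 5.0 GT/s on the 6950; double-check against the spec sheets):

```python
def bandwidth_gbs(bus_width_bits, effective_rate_gtps):
    """Peak memory bandwidth in GB/s: (bits per transfer / 8) x transfers per second."""
    return bus_width_bits / 8 * effective_rate_gtps

print(bandwidth_gbs(320, 3.8))  # GTX 570: 152 GB/s
print(bandwidth_gbs(256, 5.0))  # HD 6950: 160 GB/s
```

So despite the 570's wider bus, the 6950's faster GDDR5 gives it slightly more peak bandwidth; the 320-bit bus mainly matters because it forces the odd 1.25GB framebuffer size.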
 
It was nvidia's design decision to use a non-standard, aggressive memory bus width to make up for the inefficient core design. It was their decision to make, and it was their decision that caused the memory limitation, so there is nothing wrong with saying that the design of Fermi limits its scalability. Just get over it.

Nothing to get over, really. There's nothing in the Fermi design that limits using more memory, otherwise there wouldn't be 3GB 580s, for instance. Plus, even if you look at these tests, the 69xx isn't always the best at scaling. And the main reason I went with 580s over 6900s is 3D support.

It's easy to look at a complex subject such as GPU scaling, take a few tests that involve low resolutions, i.e. single monitor, and say "nVidia Sux" or "AMD Sux". There's just a lot more to it than that.
 