Vega's SLI Scaling Thread

I think 4.8GHz on the CPU is way too little for quad SLI. Bump it to 5.5GHz or more and you'll see a tremendous difference. You should cherry-pick a CPU that can do it with acceptable voltage.

I have not seen any Gulftown on the net that can do 5.5GHz stable for gaming and torture tests on anything less than 1.59 volts. Remember, there is a huge difference between just loading Windows to take a CPU-Z screen shot and actually having the machine 100% stable under full load.

Do you mean no new information was loaded for this whole test? For example, you loaded the level up and you only looked at one place the whole time?

I don't really get your wording on that.

This was to get actual in-game FPS and to not rely on "benchmarks". The single-player game was loaded up, and the character was dumped into the world and not moved from that spot, for testing consistency. These values do not reflect the maximum, average, or minimum numbers in the game. They are just a baseline reference point to keep all things equal for testing.

I will not knock this build; by all measures it is "Elite".

But the problem for us mortals is that the price/performance ratio for any of the 580GTX multi-monitor setups is too far out of whack to truly consider. The only time the price/performance scale falls in Nvidia's favor is in 3x30" quad-card setups, because at that point ATI has nothing that comes close to the performance of four 3GB 580GTXs. For ANY other price/performance situation, ATI either comes out the clear winner or the lines blur to the point that price creates the true winner.

Here's my point.

Taking the current vid card prices: $630 for a 3GB 580GTX, $700 for a 6990, and $325 for a 6970.

$1250 will net you two 580GTX 3GB's, or one 6990 and one 6970 plus $225 left over. Clear performance win for ATI.
$1900 will net you three 580GTX 3GB's, or two 6990's plus $500 left over. A toss-up in performance, but with $500 in your pocket the final nod goes to ATI.
But at $2500 you will have four 580GTX 3GB's, and there is nothing in the ATI arsenal that comes close, so the winner has to be Nvidia by default.
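If you want to sanity-check those budget points (or plug in today's prices), the arithmetic is trivial to script. A quick Python sketch using the prices quoted above (street prices move around, so treat the numbers as illustrative):

```python
# Tally the total cost of each config from the comparison above.
# Prices are the ones quoted in this post; swap in your own.
PRICES = {"GTX 580 3GB": 630, "HD 6990": 700, "HD 6970": 325}

CONFIGS = {
    "2x GTX 580 3GB": {"GTX 580 3GB": 2},
    "1x 6990 + 1x 6970": {"HD 6990": 1, "HD 6970": 1},
    "3x GTX 580 3GB": {"GTX 580 3GB": 3},
    "2x 6990": {"HD 6990": 2},
    "4x GTX 580 3GB": {"GTX 580 3GB": 4},
}

for name, cards in CONFIGS.items():
    total = sum(PRICES[card] * n for card, n in cards.items())
    print(f"{name}: ${total}")
```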

Yes, it is all relative. You have to take the resolution being used into account. If you are not running 3x 30", the favor leans even more toward AMD. For your $1900 price point it would be close, but I would still give the nod to the 3x 3GB 580s versus 2x 6990s. Of course at a higher cost.

If, say, I was doing non-3D 3x 1080p screens, I would most likely get 3x 6970s and a 2600K and call it a day. The issue there, though, is that AMD screwed the pooch on the display connectors, so you run into VSync screen-tear issues with mixed display types. This forces you into the 4x DP 6990. I also hate DP as it cannot do long runs nearly as well as DVI-D can. I like to keep my computer far away from me even though it is nearly silent. ;)
 
Vega said:
I have not seen any Gulftown on the net that can do 5.5GHz stable for gaming and torture tests on anything less than 1.59 volts. ...

Vega said:
These values do not reflect the maximum, average, or minimum numbers in the game. They are just a baseline reference point to keep all things equal for testing. ...

Vega said:
If you are not running 3x 30", the favor leans even more toward AMD. ... AMD screwed the pooch on the display connectors, so you run into VSync screen-tear issues with mixed display types. ... I also hate DP as it cannot do long runs nearly as well as DVI-D can.

How do I say "bingo" to all three points? :p
 
This is totally true; however, the real point of 580s and 6970s is multi-monitor goodness. There really is little difference between the high-end cards and one step down at what I will refer to as "mortal" resolutions (i.e. a single screen 24" or less). So NV did screw the pooch on multi-monitor with that 1.5GB.

For 5760x1200, 1.5GB on my GTX 480s is perfect. Then again, I prefer to try and keep my FPS above 60 for the hardest scenes to render, like heavy smoke, etc. If you are willing to deal with lower FPS and boost the AA, the VRAM will become the limit. Since most people are going 5760x1080, I would imagine 1.5GB would be ideal; it's only really when you have 3x 27-30" monitors with higher resolutions that 2-3GB of VRAM becomes necessary.
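For a feel of why AA chews through VRAM so fast at these resolutions, here is a crude back-of-the-envelope sketch. Big assumption: it only counts color + depth render targets and ignores textures and engine buffers entirely, so read it as a floor, not a total:

```python
# Rough render-target memory for a multi-monitor resolution.
# ~4 B/px color + ~4 B/px depth/stencil, per MSAA sample, times the
# number of swap-chain buffers (triple buffering assumed).
def render_targets_mb(width, height, msaa=1, buffers=3):
    return width * height * (4 + 4) * msaa * buffers / 2**20

for msaa in (1, 4, 8):
    mb = render_targets_mb(5760, 1200, msaa)
    print(f"5760x1200 @ {msaa}x MSAA: ~{mb:,.0f} MB")
```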
 
I think 4.8GHz on the CPU is way too little for quad SLI. Bump it to 5.5GHz or more and you'll see a tremendous difference. You should cherry-pick a CPU that can do it with acceptable voltage.

Um, I dunno about that. I doubt he will see any difference at all since he is running quad SLI @ 12MP; there shouldn't remotely be a CPU bottleneck at that resolution.
 
For 5760x1200, 1.5GB on my GTX 480s is perfect. Then again, I prefer to try and keep my FPS above 60 for the hardest scenes to render, like heavy smoke, etc. If you are willing to deal with lower FPS and boost the AA, the VRAM will become the limit. Since most people are going 5760x1080, I would imagine 1.5GB would be ideal; it's only really when you have 3x 27-30" monitors with higher resolutions that 2-3GB of VRAM becomes necessary.

It all depends on the type of games you plan on playing.

I built my system mainly for sims such as DCS A-10C, and the ARMA2 series. Those two titles alone at 5760x1200 + bezel correction easily go over 2GB when you start antialiasing. Due to the nature of those games you need to use high AA as well. I seriously considered going AMD, but I'd be running out of VRAM nearly immediately. With ARMA2 I was reaching 2.9GB, and A-10C around 2.5GB.

If you're just running crappy DX9 Xbox360 console ports, 3GB VRAM won't make much difference.
 
Lost Planet
A-10C
Batman
Metro2033

In all those games, Tri-SLI is ALREADY over 60 fps. The 4th card is useless. It's half the games in that "review".

Where's the misinformation in this? 8000fps is not better than 60 fps when playing a game. That's the reality.

Stop the personal attacks. It's getting old. Remember the other thread where I was right in the end, even after all the attacks?

Explain to me why Quad-SLI is better than Tri-SLI in those games ALREADY at 60fps or more. So around $1000 more for 3 games.

Stop being Vega's cheerleaders.

All the personal attacks will be reported to the moderators. Answer the question of my post, not the messenger.
 
Bottom line, the 4th card scales extremely well. What about situations that don't run at 60 FPS with 3 cards? What about S3D? S3D 3x1920x1080 is more demanding than even 3x2560x1600. And please stop trolling threads as the mods are aware of it and have already warned you about it.
 
Sounds like it's good that it gets over 60fps in today's games. Since we can show that it scales well, it will be good in tomorrow's games as well. Awesome build; that cooling setup is pretty boss!
 
LEVESQUE said:
In all those games, Tri-SLI is ALREADY over 60 fps. The 4th card is useless. ... Explain to me why Quad-SLI is better than Tri-SLI in those games ALREADY at 60fps or more. So around $1000 more for 3 games.

Dude, you really need to chill out. This thread isn't about whether you feel quad SLI is worthwhile over tri SLI (not to mention vanilla SLI in some cases) depending on a particular game or set of games. The scaling is excellent, period.

You say the fourth card is useless in those games since it is already over 60 fps. How do you know that isn't just the average? What about min FPS? That comes into play as well. What about achieving higher AA settings with the extra GPU? Are all the games tested at maximum graphical settings? Furthermore, this test only samples a handful of games; there may be others that also benefit from quad over tri. What if one enables stereo 3D as well? That extra GPU will be invaluable in that situation.

Even if it was just a handful of games that showed a noticeable improvement from tri to quad, that may be important to Vega or any other similar user who happens to play those games. If I played AvP a bunch, I would appreciate the 40% increase going from 40 to 56 fps @ 12MP with an extra GPU. So it is merely a matter of opinion: it may not be worth it to you to have a quad SLI setup, but it may definitely be worthwhile for someone else. Price to performance is a whole other topic altogether. No need to let fanboyism cloud one's judgement and prevent oneself from extolling praise where such praise is deserved.
 
LEVESQUE said:
In all those games, Tri-SLI is ALREADY over 60 fps. The 4th card is useless. ... Explain to me why Quad-SLI is better than Tri-SLI in those games ALREADY at 60fps or more. So around $1000 more for 3 games.

So when AMD cards are displaying over 60 fps it's a victory, but when an Nvidia card does it, it's considered useless? Having it both ways must be nice! :p

As kumquat mentioned already, trying to achieve a minimum of 60 fps across various games at that enormous a resolution with maxed-out graphics settings is an accomplishment in and of itself. Vega already showed the shortcomings of no fewer than FOUR 6970's, and not just in the area of performance, either. Do you really think another foray into that setup with newer drivers is going to change much?

The only person who needs to be reported is you. You're the only one who draws attention to yourself by making blind statements like "the 4th card is useless" when the results clearly show otherwise. You also don't seem to understand that at this high a resolution, the gains are obviously going to be smaller in comparison to a single-monitor setup like the one you're running. If you really think you're a messenger for truth and objectivity, then you really need to get over yourself, as you do not demonstrate that you even have a grasp on the topic at hand. :rolleyes:
 
LEVESQUE said:
In all those games, Tri-SLI is ALREADY over 60 fps. The 4th card is useless. ... Explain to me why Quad-SLI is better than Tri-SLI in those games ALREADY at 60fps or more. So around $1000 more for 3 games.
What you said is that the 4th card was useless. This is patently false, as the scaling from 3 to 4 cards was just as good as the scaling from 2 to 3 cards. This proves the card is being used and therefore is useful. Is it NECESSARY in all of his games? Maybe not, but it's certainly being utilized.
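If you want to put a number on "just as good," here's a trivial sanity check. The fps values are made-up placeholders, not Vega's measurements; drop in the real numbers from the graphs to see the actual per-card gains:

```python
# Incremental fps gain from each added GPU (placeholder data, NOT
# Vega's results -- substitute the measured values).
fps = [30, 58, 84, 110]  # fps with 1, 2, 3, 4 GPUs

for n in range(1, len(fps)):
    gain = (fps[n] - fps[n - 1]) / fps[n - 1]
    print(f"{n} -> {n + 1} GPUs: +{gain:.0%}")
```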
 
What you said is that the 4th card was useless. This is patently false, as the scaling from 3 to 4 cards was just as good as the scaling from 2 to 3 cards. This proves the card is being used and therefore is useful. Is it NECESSARY in all of his games? Maybe not, but it's certainly being utilized.

You know, I figured when Vega started this project that having the 4th one would be worthless. That was just based off of what people were getting with quad SLI before, where the 4th card was giving anywhere between 5-10% extra performance. This proves that Nvidia has not been sleeping on the ultra-high-end crowd and came out with some damn good performance. Nvidia gets a +1 in this case, even with their recent blunder with DA2 (regardless of whether you find the game relevant or not; I don't, and still think NV could have been faster at releasing better drivers).
 
It all depends on the type of games you plan on playing.

I built my system mainly for sims such as DCS A-10C, and the ARMA2 series. Those two titles alone at 5760x1200 + bezel correction easily go over 2GB when you start antialiasing. Due to the nature of those games you need to use high AA as well. I seriously considered going AMD, but I'd be running out of VRAM nearly immediately. With ARMA2 I was reaching 2.9GB, and A-10C around 2.5GB.

If you're just running crappy DX9 Xbox360 console ports, 3GB VRAM won't make much difference.

Yes, some titles go crazy for VRAM; A-10C is one of them. I remember running into issues with the multi-6970 setup. There is no way to monitor VRAM usage with AMD cards, which is annoying; you can only guess whether you are hitting the VRAM limit. I could only imagine future games like Battlefield 3 using tons of VRAM.
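On the Nvidia side you can at least script the monitoring. A minimal sketch, assuming an nvidia-smi build new enough to support --query-gpu is on the PATH (older drivers only give you the plain "nvidia-smi -q" text dump, which you would have to parse yourself):

```python
# Poll per-GPU VRAM usage every 5 seconds via nvidia-smi.
import subprocess
import time

def vram_used_mb():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"])
    return [int(v) for v in out.decode().split()]

while True:
    print("VRAM used (MB) per GPU:", vram_used_mb())
    time.sleep(5)
```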

You know, I figured when Vega started this project that having the 4th one would be worthless. That was just based off of what people were getting with quad SLI before, where the 4th card was giving anywhere between 5-10% extra performance. This proves that Nvidia has not been sleeping on the ultra-high-end crowd and came out with some damn good performance. Nvidia gets a +1 in this case, even with their recent blunder with DA2 (regardless of whether you find the game relevant or not; I don't, and still think NV could have been faster at releasing better drivers).

I also noticed different driver versions can have a huge impact on the performance differences of something as "quirky" as Quad-SLI on 3x 30" in portrait. I remember when I first started the tests I was using an earlier nVidia driver, and the 4th card was actually slower in most titles than 3! I was like, oh crap, everyone was right! I was a bit pissed off! ;)

Then I tried a different driver version and everything was scaling properly again, but portrait bezel correction was broken. I still do not think Crysis 2 is working properly with SLI, even on 270.51. I know a lot of people do not care for Crysis 2, but it does look pretty cool @ 12+ megapixels.

And do not be fooled by some of the posted numbers. They are reference numbers to show scaling differences, not the greatest demands of the individual games. An example would be the 120 FPS Quad-SLI Metro2033 number. In that particular scene it might be 120 FPS, but go around the corner and get into a fight and it might drop to 60 FPS. The thing is, Quad-SLI helps a lot with minimum FPS, where SLI and Tri-SLI would start to stumble under such high stress.
 
I just tried that; nothing on my HD6870's GPU memory... will try with my HD6990 tomorrow.
 
I'm a diehard AMD fan, but I can give much respect to the setup Vega has. And while I will most likely never purchase an Nvidia card, I don't see where all the animosity comes from on both sides. Don't hate, appreciate. Even for things you can't afford, like Vega's rig.
 
I'm a diehard AMD fan, but I can give much respect to the setup Vega has. And while I will most likely never purchase an Nvidia card, I don't see where all the animosity comes from on both sides. Don't hate, appreciate. Even for things you can't afford, like Vega's rig.

I'm a die-hard best-card-for-my-money fan, which has meant Nvidia for my past three cards (8800GTS 640MB, GTX260 C216, GTX570 SC), but AMD/ATi for the two cards before that (Radeon 9700, Radeon X800 Pro). For this generation though, like many including Vega above have said, AMD has it in $/performance when scaling is concerned, up until you run out of VRAM.

In my case, at 1x30" so 4MP, I found my GTX570 to be lacking at 1280MB. For the price, the best I can do is an HD6950 2GB, considering that I'll be adding a second card to complement the first when I upgrade my core components to Sandy Bridge.
 
This is the absolute, unequivocal, top-end maximum-resolution config in gaming today. Every other build that comes close simply lacks the physical capability to accomplish what our dear Vega accomplishes here. Which is why it is, in fact, so damn impressive.

I see similar scaling up to 5x1, almost across-the-board 99% usage in every title tested. Quad-GPU scaling is no longer the frivolous waste of yesteryears past. 12 megapixels... let me revise: 10 megapixels and above FTW!
 
LEVESQUE said:
Where's the misinformation in this? 8000fps is not better than 60 fps when playing a game. That's the reality. Explain to me why Quad-SLI is better than Tri-SLI in those games ALREADY at 60fps or more. So around $1000 more for 3 games.

Hey Levesque, I'll try and answer your question. There are a lot of new monitors out there that support 120Hz. It's arguable whether the human eye can detect the difference in smoothness between 60Hz and 120Hz; especially with TVs, a lot of people tend to fail the double-blind tests. However, reaching 120Hz might make a bit of a difference in smoothness for some people.

In addition, a lot of those people with 120Hz monitors will use them for '3D', in which 120Hz will feel like 60fps, because you are generating twice as many frames but each eye only sees half of them. As a result, 120Hz can be important for 3D. I don't know if Vega plans to use 3D in the near future, but if he chooses to, he has the luxury/power in his PC.
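To put numbers on that, the frame budget works out like this (frame-sequential 3D halves the per-eye rate):

```python
# Frame time and per-eye rate at 60 Hz vs 120 Hz.
for hz in (60, 120):
    print(f"{hz} Hz: {1000 / hz:.1f} ms per frame, "
          f"{hz // 2} fps per eye in frame-sequential 3D")
```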

Keep in mind too, his PC is at least future-proof. When the GTX680 comes out, I strongly suspect his quad 580s will be about equivalent to SLI GTX680s, which a lot of people on this board would feel 'lucky' to have. Heck, a lot of people with 5970s and 5870s still feel relatively happy against 6970s, to the point they feel it's not worth the money to upgrade.

Although his PC is getting perhaps 120fps, which may arguably be 60 more than he needs, the future-proofing in those numbers means that games coming out this year like Battlefield 3, Deus Ex: Human Revolution (probably a weak console port, but meh), Brink, etc. will all do >60fps. In addition, probably a lot of games next year and the year after will do >60fps.

As a result, he may have invested $2500 in his PC's graphics cards this year, but he should be good for three years. Is that any different from those who spend $1000 a year, every year, to replace their 2x5870 with 2x6970s or their 2x480s with 2x580s? In the long run, it might be cheaper for him.

The real tragedy is that poor Vega is going to have the cops visiting his house every few weeks, trying to find the grow-op they'll assume he must be running in his basement, given his power usage levels.
 
Thank you WBurchnall.

I did calibrate some HP ZR30Ws with my i1 Pro, and those LCDs Vega is using are 60Hz. And they are really bad OOTB. You need to calibrate them fully to appreciate what they can really do. Calibrated, those are really nice 30'' LCDs; some of the best out there after a good calibration.

I don't understand the "future-proofing" thing, since the AMD 7xxx series and Nvidia Kepler are just around the corner. With those LCDs, Tri-SLI is enough for 60fps, except for 3 games. And as for all the new games coming, Kepler and the AMD 7xxx series will be out for those. So...

I just can't understand paying $1000 more (and probably a lot more than $1000 for air conditioning, watercooling parts, etc.) for a 4th card when it will be obsolete in 6-8 months, when the AMD 7xxx series and Nvidia Kepler are out. I could be wrong, but knowing Vega, he will upgrade the first minute those are out. Remember these words in 6-8 months. :) Hey, I will do it the first minute they are out (Kepler or AMD 7xxx), and he's like me. So he will upgrade fast.

He has a nice set-up, I totally agree. But I'm trying to analyze this logically, and I don't see any numbers warranting the use of the 4th card here. Tri-SLI would have done the same job.

My opinion only. And I'm entitled to it.
 
It would be nice if a single GPU was also benched.

Also... the "single player static entry" could possibly be too kind to multi-GPU setups, because of the low CPU utilization, and because it's an oversimplified rendering case.
 
It would be nice if a single GPU was also benched.

Also... the "single player static entry" could possibly be too kind to multi-GPU setups, because of the low CPU utilization.

As others have said, you can't run a single nVidia GPU with multi-monitor.
 
Good read to start the day. Nicely done.

On a side note, why does everyone think Battlefield 3 is gonna be this big breakthrough game? (Don't get me wrong, I played BF2 from the start for a long time.) So far, after watching all the videos, it almost looks the same as MOH MP with just a few extras. I have this big feeling that FB2 is just FB 1.21.
 
Sheer awesomeness. I want Quad-SLI. *checks wallet* No cash... *grabs his 9mm* Okay, I'm going to the bank now.
 
Good read to start the day. Nicely done.

On a side note, why does everyone think Battlefield 3 is gonna be this big breakthrough game? (Don't get me wrong, I played BF2 from the start for a long time.) So far, after watching all the videos, it almost looks the same as MOH MP with just a few extras. I have this big feeling that FB2 is just FB 1.21.

BF3 is going to have 64-player maps that are the largest Dice has ever produced. Add that to really nice graphics and there is no way this game isn't going to use tons of VRAM.

I guess I'll just bring Ma Deuce with me, then. :D Then I'd just have to find a place that has 3GB GTX 580's.

Amazon has them, so better get to the robbing. I think it would be easier to rob the Amazon warehouse. :D
 
WBurchnall said:
There are a lot of new monitors out there that support 120Hz. It's arguable whether the human eye can detect the difference in smoothness between 60Hz and 120Hz... However, reaching 120Hz might make a bit of a difference in smoothness for some people.

Man, I hate 120Hz; it looks like everything is in fast forward, and I can make out the difference very clearly. I'll stick to the Panasonic Viera plasma and my 60Hz monitors. The 120 looks good, it just bothers me.
 
Man, I hate 120Hz; it looks like everything is in fast forward, and I can make out the difference very clearly. I'll stick to the Panasonic Viera plasma and my 60Hz monitors. The 120 looks good, it just bothers me.

Probably something that takes more getting used to. Your hate for 120Hz is more toward monitors than TVs, I suppose? With TVs, it's artificially generating 'new' frames, so it's not really 120Hz at all. It's more like 60 frames with each frame being shown twice, or a mathematical combination of frames 1 and 3 merged together and shown in between.

I wonder if you felt that way moving from regular TV to digital cable/HDTV, or DVD to Blu-ray? Remember, those old CRTs were usually displaying NTSC signals at roughly 30fps (24fps for film content) versus 60fps on some digital content. Did everything there feel like it was in 'fast forward' mode also? I know for me, I can tell the difference in smoothness easily from 30 to 60fps (i.e. playing on my laptop versus desktop).
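A toy sketch of that "merged frame" idea, grossly simplified (real TVs do motion estimation rather than a plain average, so this only shows the basic principle):

```python
# Synthesize an in-between frame as the average of its neighbors.
def midframe(frame_a, frame_b):
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

print(midframe([0, 100, 200], [50, 150, 250]))  # -> [25.0, 125.0, 225.0]
```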
 
I am going to re-run these benchmarks in Quad-SLI but change the CPU to 4.0GHz and 5.2GHz to see if it changes the results much. I am quite curious. ;)
 
Vega: would you be able to test to see if Quad-SLI works with Fallout: New Vegas? Please! :)
 
Well, I finished up the CPU speed difference benchmarking. I was going to make a bar graph, but it would not be very interesting. I ran all of the same benchmarks, but this time with the 990X @ 4.0GHz and @ 5.2GHz, with 4.83GHz being the baseline.

On average, dropping down to 4.0GHz lowered FPS by 10-12%. Increasing the CPU to 5.2GHz had a minimal impact, averaging a 1-2% gain. One game, F1 2010, didn't care what the CPU was set at and pulled the same numbers.

Considering I can run 4.83GHz full IBT stable at a relatively comfortable 1.45 vcore, I think it would be pretty insane to increase vcore to 1.56 just to get a 1-2% FPS increase. A 4.83GHz (210MHz bclock) 990X @ 1.45 vcore and 1020MHz-core GTX 580s @ 1.15v seem to be a pretty balanced 24/7 high-performance setup.
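For anyone curious just how lopsided that trade is, the classic f*V^2 rule of thumb for dynamic CPU power makes it obvious. A rough sketch, not a measurement (leakage and temperature only make it worse):

```python
# Estimate the extra dynamic power bought by the 5.2 GHz overclock,
# using the clock/vcore figures from the post above.
f0, v0 = 4.83, 1.45  # baseline GHz and vcore
f1, v1 = 5.20, 1.56

extra = (f1 * v1**2) / (f0 * v0**2) - 1
print(f"~{extra:.0%} more dynamic CPU power for a 1-2% fps gain")
```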
 
Vega: would you be able to test to see if Quad-SLI works with Fallout: New Vegas? Please! :)

Sorry, I do not have that game. Is that the MMORPG?

Vega, are you a millionaire? :p

Naw, this stuff really isn't that expensive when you consider what a lot of people spend on cars. Just the engine for the car I am building is more expensive than every component remotely associated with my computer setup.
 