3-Way SLI - Multi-Card Setups: Bah Humbug

That's pretty badass with the liquid cooling, but I couldn't help noticing how close the nozzle is on the bottom-most card. Is it sticking out the bottom of the case?

Those water blocks are kind of a pain in regard to routing tubes. I didn't like messing with that in my rig. When it came time to add a 3rd video card to the loop I gave up and pulled all that stuff out of there.
 
That's pretty badass with the liquid cooling, but I couldn't help noticing how close the nozzle is on the bottom-most card. Is it sticking out the bottom of the case?
Yes, it now has a mini on there which fits fine with the tubing. Oh, I have not bled all the air from the loop, so don't nag... lol. It is not the best situation, but it works. We are going to need something TOTALLY different for SkullTrail, though...


 
A couple of points.

First, my understanding is that the die shrink allows Nvidia to make the same product more efficient and cheaper to manufacture. So it makes sense that Nvidia is refining its 8 series. A lot of gamers are already benefiting from the 8800GT which has opened the door to performance previously reserved for the high-end. Not a bad thing.

As a higher-end enthusiast, I am indifferent about the lack of next-gen cards. Normally I would be upset, but this time around something is different. The games are different. Almost all the "AAA" titles, with the exception of Crysis and perhaps a few others, were co-developed for game consoles. Call of Duty 4, Orange Box, Unreal Tournament 3, Gears of War, and BioShock have all been released over the past few months, but those are not GeForce 8-generation titles, or "next gen" titles by PC gamer standards. In fact, they are Xbox 360- and PS3-generation titles that got a PC version.

Something else is different. Call of Duty 4, Unreal Tournament 3, Gears of War, and BioShock have either omitted anti-aliasing support or limited it. In these games, anti-aliasing is essentially on or off, or at most 4x. Thus, unless you move to a higher resolution/larger monitor, there is little or no advantage to owning anything faster than an 8800GTX. With an overclocked 8800GTX, I can max those games out at 1680x1050 with the highest levels of anti-aliasing available and get frame rates that are very smooth.

Releasing a next-generation card right now would be like releasing a new console that only has one game lined up. It doesn't make sense. The big game developers do not want to develop high-end PC titles that cannot be ported to consoles. They want to develop console games that can be ported to PC. This is probably why Nvidia and Crytek appear to be working so closely together - they need each other to give their products a reason for existing.

Nvidia may very well be "sitting on" new tech, but trends in the PC game industry and among game developers are what is causing it. The same trends will stifle the implementation of more advanced physics in PC games and the utilization of multi-core CPUs. Considering that games targeting niche markets of enthusiasts, like Crysis, are not selling well, I would say that PC gaming is in a bad way from an enthusiast's point of view. We are used to PC graphics improving significantly on a yearly basis, or at least every other year, but there is every indication that the major game franchises from this point on will optimize their games for consoles and then use the PC as an additional outlet for those games.

I fully expect, for example, Battlefield 3 to be closer in system requirements and image quality to Bad Company (console) than Crysis.

I agree with [H], as usual. The new high-end offerings and SLI setups really only benefit those with large monitors and high native resolutions. If you want to run all your games at 1080p on your HDTV or on one of the larger PC monitors, you might appreciate the Nvidia roadmap. In fact, as larger, higher-resolution displays become more popular, the GX2 or multi-card SLI might seem more reasonable. Such cards might be necessary for optimal performance at those resolutions. But for those of us with more conventional monitors looking for a boost in Crysis, this roadmap doesn't offer much. It isn't a practical approach, even by enthusiast standards.

The expense, heat, space, and power requirements, the relatively small market for large, HD+ monitors, and, most importantly, the lack of hardware-pushing games all justify the conclusion that most [H] readers would not be interested in 3-way SLI.
 
ATI has left a power vacuum by not competing directly with the now-aging 8800GTX, and I agree this has not given Nvidia much incentive to move the high end to a newer technology. At the same time, if they really have spent the R&D on a significant upgrade from G80 (I consider G92 a much smaller evolution), they can't afford to sit on it forever - they need to recoup their money.

As for saying the current gaming scene doesn't need a new high-end card - I can't agree. If the $250 G92 provides a performance level of X, I bet the market would happily buy a $500 single-core card that performed at 2X straight up, both in the usual benchmarks and in real-world, GPU-starved scenarios like Crysis. And Crysis is not even close to being the only game demanding it. With 24" 1920x1200 monitors now routinely available for under $400, lots of gamers would like to be able to run any game at full resolution with details maxed out and high IQ settings like AA and AF cranked. The 8800GT is good, but not good enough. Heck, even the cheap-ass $200-range 22" TN-film monitors are 1680x1050, and people should be able to play at that resolution with full AA/AF and settings maxed.

Look at the 1920x1200 4xAA and 16xAF results from the 8800GT reviews - even older games like Oblivion (yes, it's an unoptimized pig) and Stalker get crappy frame rates at those settings. And with things stagnating, you're not going to see more games like Crysis (with better gameplay) coming out - do you really think that's a good thing? If a single card with 2x 8800GT performance (in a single GPU chip) came out now at the high end, more devs would target it during current development, expecting it to be midrange in about 12 to 18 months, and the lower-midrange performance cards a year from now would make sure that a large segment of the PC gaming population could partake in such a game. Console ports are usually not well optimized, and what ran smoothly on a console at 720p (or upscaled ~640p for games like Halo 3, CoD4, and many others) will really tax a PC port running at 1080p or 1920x1200 with AA and AF.
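The resolution gap in that last point is easy to quantify with simple pixel arithmetic. A quick sketch - the ~1152x640 internal resolution is the commonly reported figure for Halo 3, used here as an assumption:

```python
# Ratio of pixels a PC port pushes versus the console original.
console_720p = 1280 * 720    # native 720p console render
sub_720p     = 1152 * 640    # assumed sub-720p internal res (e.g. Halo 3), upscaled
pc           = 1920 * 1200   # common 24" PC native resolution

print(f"720p  -> 1920x1200: {pc / console_720p:.1f}x the pixels")
print(f"~640p -> 1920x1200: {pc / sub_720p:.1f}x the pixels")
# 2.5x and 3.1x respectively - before even counting the cost of AA and AF.
```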
 
The only people I know stoked about Tri-SLI are "focus group members" who get some of their hardware free, and folks who work for XFX/EVGA and likely get nice discounts.
 
The only people I know stoked about Tri-SLI are "focus group members" who get some of their hardware free, and folks who work for XFX/EVGA and likely get nice discounts.

Well I didn't get any of my hardware for free outside of the motherboard. I've purchased enough motherboards this year (680i SLI boards) for that not to even count and I liked 3-Way SLI enough to use it in my own machine.

(BTW I don't get any discounts on the cards either.)
 
My n00b opinion: SLI really differentiates itself by maintaining frame rate and quality at higher resolutions. At lower resolutions, most games become CPU-bound before reaching the limits of a single G80 or G92, let alone two or three of them.

However, from the benchies I've seen, the "sweet spot" for SLI is 2-way at 1920x1200 or 1080p HDTV - basically the 2-megapixel range.

The next jump up would be 2560x1600, where presumably 3-way SLI would be keen... but due to non-linear scaling, lack of game optimization, or whatever failing you choose to cite, 4 megapixels (e.g. my 30" Dell) is just a bit much for the current generation of GPUs.

So really, the reason 3-way SLI isn't compelling: there are no displays in the 2.5 - 3 megapixel range, where the 3rd card would further differentiate the gaming experience. For now, I'll just have to "settle" for 1080p gaming on a 52" HDTV until the "next generation" GPUs become available ;)
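The megapixel tiers above are simple pixel arithmetic; a quick sketch (the resolutions are standard, the grouping into tiers is this post's own framing):

```python
# Pixel counts behind the "sweet spot" megapixel figures above.
resolutions = {
    "1680x1050":             (1680, 1050),
    "1080p (1920x1080)":     (1920, 1080),
    "1920x1200":             (1920, 1200),
    "2560x1600 (30-inch)":   (2560, 1600),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")
# 1920x1200 lands at ~2.3 MP and 2560x1600 at ~4.1 MP, matching the
# "2 megapixel" and "4 megapixel" tiers in the post, with nothing
# mainstream shipping in the 2.5-3 MP gap between them.
```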
 
Well I didn't get any of my hardware for free outside of the motherboard. I've purchased enough motherboards this year (680i SLI boards) for that not to even count and I liked 3-Way SLI enough to use it in my own machine.

(BTW I don't get any discounts on the cards either.)

That's cool, Dan,

Enjoy your gear. I did not mean to sound like a hater.

I have an overclocked 8800GTX in my main rig and love it. I really wish nVidia would release another single beastly GPU.

Personally, I am not going to spend close to $500 or more on graphics solutions that lose half their muscle in some cases. It really doesn't matter if it's a GX2 card or traditional SLI|Crossfire.

If you're wondering why I bothered with HD 3850 cards in this rig, it's because I was curious how far Crossfire had come, and they were supposed to be placeholders for HD 3870 X2 Crossfire. They were going to be in an Agena 9600 rig, but the Agena overclocked so poorly that I changed my mind.

I think I am going to skip HD 3870 X2 Crossfire, though. I really don't like the idea of spending close to $900-1000 for two of those cards and sometimes being stuck with less performance than a single HD 3870. That would not work well on the 30" display I want to get.

I'll probably lose the 3850s and just get an 8800GTS 512MB for this rig. I'll overclock the heck out of it, of course. Sometimes I don't learn my lesson the first time around.
 
I think I am going to skip HD 3870 X2 Crossfire, though. I really don't like the idea of spending close to $900-1000 for two of those cards and sometimes being stuck with less performance than a single HD 3870. That would not work well on the 30" display I want to get.
If you wish to game on a 30" monitor at full native resolution, a multi-card solution is almost a must, unless you don't mind turning down a lot of the IQ settings like AA. It's either that or get a display with a good scaler chip to play games at lower-than-native resolution, but a single card is not something I'd recommend for 30", especially considering the newer games.
 
Personally, I am not going to spend close to $500 or more on graphics solutions that lose half their muscle in some cases. It really doesn't matter if it's a GX2 card or traditional SLI|Crossfire.

I don't know why people think this happens. It did sometimes with the 7950GX2, but not with the 8800GTX SLI setup. Granted, not every game has perfect performance with it, but they don't lose half their muscle. At worst, you need to switch beta drivers every time a new game comes out to get the most performance out of it. Crysis is the exception to this rule.

I think I am going to skip HD 3870 X2 Crossfire, though. I really don't like the idea of spending close to $900-1000 for two of those cards and sometimes being stuck with less performance than a single HD 3870. That would not work well on the 30" display I want to get.

Don't count the 3870 X2 out just yet.

If you wish to game on a 30" monitor at full native resolution, a multi-card solution is almost a must, unless you don't mind turning down a lot of the IQ settings like AA. It's either that or get a display with a good scaler chip to play games at lower-than-native resolution, but a single card is not something I'd recommend for 30", especially considering the newer games.

Agreed.
 
I don't know why people think this happens. It did sometimes with the 7950GX2, but not with the 8800GTX SLI setup. Granted, not every game has perfect performance with it, but they don't lose half their muscle.

What I meant was that there are some games that simply do not work with SLI|Crossfire. I suppose I expressed it poorly.

I am putting the purchase of a 30" monitor on hold indefinitely. Right now I use 1680x monitors on my rigs with moderate filtering. I really don't need uber graphics-card setups for that.
 
What I meant was that there are some games that simply do not work with SLI|Crossfire. I suppose I expressed it poorly.

No, there aren't. I play all the mainstream games out there for the most part, and they have all worked flawlessly with SLI or Crossfire. Even Crysis gets a benefit from SLI, though a small one.

If you can provide a list of games that do not work at all with SLI and get absolutely ZERO benefit from it, I'll listen, but otherwise I call BS on this.
 
No, there aren't. I play all the mainstream games out there for the most part, and they have all worked flawlessly with SLI or Crossfire. Even Crysis gets a benefit from SLI, though a small one.

If you can provide a list of games that do not work at all with SLI and get absolutely ZERO benefit from it, I'll listen, but otherwise I call BS on this.

Well, playing devil's advocate... some games do lose performance or show no benefit when they are first released, before drivers are released for them.
 
Just kinda wanted to point out that a few people have been tossing around WoW as a game that doesn't work with SLI. Think about it, though: do you really need SLI for WoW?
 
Finally running Tri-Sli here.
And ooooh my gawd do i luv it :D
COD4 "All Ghillied Up" at 2560x1600, 2xAA, 16xAF = steady 60fps (vsync) :eek:
Can't wait for a trio with G100
 
Finally running Tri-Sli here.
And ooooh my gawd do i luv it :D
COD4 "All Ghillied Up" at 2560x1600, 2xAA, 16xAF = steady 60fps (vsync) :eek:
Can't wait for a trio with G100

I've got COD4 running at 2560x1600 with 4x AA and 16x AF with V-Sync and it rarely drops below 60FPS.
 
I have just begun my Triple-SLI goodness. Saving for the 30" monitor, but I have to admit, I'm replaying Crysis in DX10 at very high settings... the first time through was in XP... at 1920x1200 it's like a different game. The visuals are stunning... I just wish the game overall would improve as much as the visuals did.

Next up... UT III. :D
 
I've got COD4 running at 2560x1600 with 4x AA and 16x AF with V-Sync and it rarely drops below 60FPS.
Awesome. Too bad none of my motherboards support tri-SLI for my 30" display.
 
I have just begun my Triple-SLI goodness. Saving for the 30" monitor, but I have to admit, I'm replaying Crysis in DX10 at very high settings... the first time through was in XP... at 1920x1200 it's like a different game. The visuals are stunning... I just wish the game overall would improve as much as the visuals did.

Next up... UT III. :D

UT3 is the same for me regardless of whether I'm using 2-Way SLI or 3-Way SLI.
 
Oh well, it's still good, right?

It runs pretty well. The game is severely CPU-limited, though.

I could do 3.2GHz or 3.4GHz on my 680i SLI reference board. On my Striker Extreme I can't get past 2.83GHz, and that's not even reliable. Finally, I gave up on it and went back to my stock clocks until I replace the motherboard with something else. At 2.4GHz the performance dropped significantly. At 3.2GHz it was smooth as half-melted butter.
 
Well, I get my new 780i motherboard on Monday, and I've been toying with the idea of tri-SLI. Just wondering if my PC Power & Cooling 1000-watt SR will be enough to support tri-SLI?
 
Well, I get my new 780i motherboard on Monday, and I've been toying with the idea of tri-SLI. Just wondering if my PC Power & Cooling 1000-watt SR will be enough to support tri-SLI?

Your PC Power & Cooling 1kW unit is definitely powerful enough. I'm not anywhere near the 1kW draw mark under full load, even including my monitor.
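For anyone doing the same headroom math, here is a rough sketch; all the wattage figures are ballpark assumptions for illustration, not measurements:

```python
# Rough load estimate for a tri-SLI build against a 1 kW PSU.
GPU_W   = 175    # assumed worst-case draw per 8800-class card, watts
CPU_W   = 130    # assumed overclocked quad-core draw (assumption)
OTHER_W = 150    # motherboard, drives, fans, pump (assumption)
PSU_W   = 1000

total = 3 * GPU_W + CPU_W + OTHER_W
print(f"Estimated load: {total} W ({total * 100 // PSU_W}% of PSU capacity)")
# Roughly 805 W, leaving about 20% headroom on a quality 1 kW unit.
```

The exact numbers matter less than the structure: triple the per-card figure, add the rest of the system, and check the sum against the PSU's continuous rating.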
 
I've got COD4 running at 2560x1600 with 4x AA and 16x AF with V-Sync and it rarely drops below 60FPS.

Did we test the same level ("All Ghillied Up")?
With all that grass, water, and smoke it jumps between 40-50-60fps @ 4xAA.
Or it might be because you've got a quad.
Although a quad won't do much over a dual-core in gaming.
 
Did we test the same level ("All Ghillied Up")?
With all that grass, water, and smoke it jumps between 40-50-60fps @ 4xAA.
Or it might be because you've got a quad.
Although a quad won't do much over a dual-core in gaming.

I've got every option in the game maxed out. It slows down a little when there is a lot of smoke, but the FPS hardly ever drops below 60 otherwise.
 
I've got every option in the game maxed out. It slows down a little when there is a lot of smoke, but the FPS hardly ever drops below 60 otherwise.

Reinstalled Vista and the frame drops are gone here.
Runs very smoothly now @ 2560x1600, 4xAA, 16xAF (60fps+).
Seems like swapping my mainboard without reinstalling Vista wasn't such a good idea :D
 
Reinstalled Vista and the frame drops are gone here.
Runs very smoothly now @ 2560x1600, 4xAA, 16xAF (60fps+).
Seems like swapping my mainboard without reinstalling Vista wasn't such a good idea :D

It usually isn't.
 
Why exactly was my post asking about Crysis performance deleted? Is linking to AnandTech's results showing a tiny performance difference over two Ultras not allowed or something?
 
Transparency anti-aliasing set to supersampling is always a performance killer. It isn't really a practical way to go in most games. I have found that multisampling, however, provides an excellent quality/performance ratio.
 