[H] users SLi guide

As far as I know, the answer is no. There is no high-end part I would more strongly recommend against buying at this point than the GX2.
 
Except the 9800GX2 offers twice the Folding@home performance of even the GTX280. That's probably not relevant for most consumers, but it is to me :)

I could get a GTX280 to go along with the 9800GX2 if I wanted to, but I'm just not impressed with how it's handling Crysis Warhead or Stalker. And since Vista 64 won't let you mix ATI and Nvidia drivers in the same system, I'm happy waiting for Nvidia's next generation of architecture (not the upcoming GT2xx shrink).
 
Got Crysis to work!

In 64-bit it takes 2 GB of RAM!

I guess I need 8 GB now :D

Avg fps is around 40-50 on all Very High, so nice!
 
Haven't read the thread (sorry), but any opinions on whether adding another 9800GTX in SLI to my system, for e.g. FEAR or GT Legends, would:

a) improve the frame rate noticeably
b) manifest any odd effects, e.g. "micro-stuttering" (whatever that is)

I insist on Vsync, and I use triple buffering (D3DOverrider), but I'm not sure whether that's possible or necessary with SLI.

Any thoughts?

XP Pro 32-bit
Abit IN9 32-Max
9800GTX+
Intel E6700 core 2 duo 2.67GHz
2G Corsair Twinx DDR2 6400C4
2x500G Seagate SATA2
X-fi Fatality
850W Enermax PSU
Dell 2407WFP (1920x1200x32)
No overclocking
 
I have the IN9-32X also, and I have two 8800 GTXs in SLI. I have benched with and without SLI on this motherboard with a similar processor, and the benefits are dismal. Without AA or AF you can expect an 8-10 percent increase in performance with SLI. Enabling AA and AF and maxing the quality features out yielded a 25% increase at best. I benched with the Far Cry 2 benchmark at 1600x1200 resolution. I also tested CPU utilization in Fallout 3 and it never went above 60%, so I think there is a bottleneck in the hardware somewhere. The new X58 motherboards at the same CPU speeds will net almost a 50% benefit from using SLI, so I'm planning to eventually migrate my video cards to an X58 board one of these days. If I were you I would get the best 280 card money could buy and use your other 9800GTX as a PhysX card. Oh, but wait, we would have to mod the new Nvidia cards to fit into the IN9-32X because of a misplaced capacitor!

Dave.
 
This is probably a silly question, but if you want to avoid microstuttering with the 9800GX2 while playing certain games like Crysis, can you just turn off SLI in software?

Not that I'm aware of, and even if you could disable SLI on a per-app basis, you wouldn't reap the benefits of the 9800GX2. It's SLI-on-a-stick, basically. Turn that off and you'd basically have an 8800GTS 512 with GT-like clocks.
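For what "micro-stuttering" actually is: with two GPUs alternating frames, the average frame rate can look fine while the gaps between frames alternate short/long, which is what you perceive as stutter. A toy sketch (my own illustration, not anything from NVIDIA's drivers):

```python
# Toy illustration of AFR "micro-stuttering": two GPUs each take 33 ms
# per frame, but if they start out of phase, frames arrive in uneven
# pairs even though the average fps is identical.
def frame_intervals(render_ms, offset_ms, n_frames):
    """Times between consecutive frame completions with 2-GPU AFR."""
    # GPU0 finishes at render, 2*render, ...; GPU1 lags by offset_ms.
    times = sorted([i * render_ms + render_ms for i in range(n_frames)] +
                   [i * render_ms + offset_ms + render_ms for i in range(n_frames)])
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# Ideal offset = render/2 gives even pacing; a small offset gives
# alternating short/long gaps at the same average frame rate.
print(frame_intervals(33.0, 16.5, 4))  # evenly paced: all 16.5 ms
print(frame_intervals(33.0, 5.0, 4))   # alternating 5 / 28 ms gaps
```

Both runs average the same fps; only the second one feels stuttery.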
 
Looking at a few benchmarks on the net, I get the impression the 9800GTX performed a little better than the 8800GTX in SLI, which I didn't expect given its memory limitation.

Generally, more difference (on their setups) at 1920x1200, and, as you say, with AA/AF.

Regarding the IN9's obstructive capacitor: have you considered a PCI Express socket extender/riser? E.g.

http://linitx.com/viewproduct.php?prodid=10538

It lifts the card up 0.8", but it wouldn't then line up with the slot holes; you'd have to modify the case.

Oh yeah... is 'micro-stuttering', or any other weirdness, something I should realistically be expecting? If it's there, I'll be the one to see it too. :rolleyes:
 
You know, I had thought of that but never knew where to look to buy one. Also, how do you bolt the card into the case so it doesn't sway or move back and forth? You'd need a really long screw...
 
Well, I'm not sure how much lateral stability a screw would provide, or what's needed, to be honest. I was thinking of some kind of "U" or "L" bracket. In my case (NPI), a "U", if I could get away with keeping the original mounting plate and just removing some metal on the other side of it for the TV-out socket and card air vents (a few holes would be hidden otherwise).

I'm in the UK and can't sign up to the US Abit forum. Maybe you could ask if anyone's got any ideas there?


What I really want to know from anyone is about issues with SLI. Is it true triple buffering doesn't work with SLI? I must have Vsync as I can't stand tearing. Is triple buffering needed with SLI... anyone?
 
You can't sign up for the Abit forum from the UK? Why? Have you tried using a proxy? About using an extender: I would probably also have to buy a flexible SLI bridge if I go the SLI route. But are you sure the card placement is shifted? Would the metal edge PCI fingers of the card still be able to slide down? Sounds like a pain. I think I would rather Dremel off a little of the offending plastic and take a chance that I wouldn't kill the card. Should be easy to do anyhow.

Dave.
 
Dave,
Apparently it's since the forum/servers moved to Taiwan; I don't receive the signup emails. Haven't tried a proxy yet.

With that riser/extender, the card would be in the same position looking down (perpendicular) on the mobo, just 0.8" further away from it. The card would follow the same path as it did into the original slot, but meet a new slot (the adapter) a bit earlier on the way down. Which means 0.8" of the card's output plate would sit behind the case, not lined up with the rear panel hole, unless you enlarge it for access. I certainly wouldn't chop bits off the card! It's worth more than the mobo, and I can't see the warranty surviving.
 
I mean, if you got a good deal on a GTX260 and didn't pay too much, I would chance it. Of course not with a $500 video card. I remember I once bonded heatsinks onto my ATI 9800 AIW card and it still works... but the riser is a better idea, I agree.

Dave.
 
Just a few notes on the discussion above. I had two 8800GTXs on a 680i (Gigabyte rev1 N680-SLI DQ6), and when I switched to my GTX260s, it was interesting that in 3DMark06 all of my bench tests were identical to the 8800GTX SLI setup except for the HDR test (Canyon Flight). I think that's because most of that processing happens on the cards and goes directly to the display, so it's not limited by the mobo's capabilities. So I think a pair of 8800GTX cards (or similar) does max out a 680i board in some ways. Granted, synthetic tests are not always a reliable indicator, but I thought it was pretty interesting nonetheless. It's possible my Core 2 @ 3.6GHz and memory at 800MHz are getting tapped out too; maybe more horsepower would tap into the SLI capabilities better. Anyway, I will be putting these cards on an X58 build soon, so I'll see what they're capable of then. Should be interesting.

As for the capacitor issue, the riser would be the best fix. Then use some long screws to anchor the cards, maybe with some plastic or foam parts to fill in the open space. That's what I would try. Good luck!
 
I run at 1440x900 with a 3GHz Core 2 E3110 (E8400). I have 8800GTS 512 SLI. I do gain with SLI even at my res, but I'm thinking of selling both cards and grabbing a GTX260 216 Black Edition (666MHz). What do you think?
 
That's going to be a tough call on any performance gain... it might be about the same depending on the game/app. Depending on how SLI is scaling on your machine, the GTX260 might be a tad faster, I would guess, for minimum FPS, which is very nice. Are you after better performance, or just wanting a better single-card solution rather than having to use SLI?
 
I honestly don't care about max FPS because it's all worthless fluff. I just don't want to see frame rate dip to where I notice.
 
This is what I'm pondering too (in reverse). I'm considering the cheaper option of putting another 9800GTX in SLI rather than a GTX280. But compared to my old 8800GTX, the single 9800GTX does exhibit a lot more stuttering when there's a lot on screen at my 1920x1200. E.g. in sim racing, with lots of cars on screen and bends where the scenery pans across fast, it gets very stuttery. And if this is due to having less video memory, is it still going to happen with two in SLI? If so, it's pointless. This is where most benchmark tests tell us nothing useful.

Does anyone around here actually understand SLI well enough to know whether video memory is still a limit when SLI'd? By the sound of the last poster's situation, it is.

In terms of average fps, FWIW, looking at benchmarks I'd say the GTX260 at my resolution (1920x1200) seemed to be worth around 10-20 fps over the 9800GTX, which I believe is basically a slightly faster 8800GTS(?). But it's the min fps which is the issue.
 
From what I've read, if you have two 512MB graphics cards in SLI, you still only have 512MB total.
 
Well, depending on your rendering method, that's essentially right. The most common rendering mode for SLI is AFR (Alternate Frame Rendering): each card is handed every other frame to render, which is the whole screen, and the master frame buffer (the card the monitor is plugged into) then assembles those frames for final viewing. So in essence there is no video memory gain from going SLI with AFR. Each card renders a whole frame each time, which requires it to allocate textures into its memory for that whole frame. That also puts additional strain on your machine sending two sets of textures (one to each card), so there is that to consider as well.

So if you are at 1920x1200 and up, going with a single card with 896MB+ would likely help with minimum FPS more than two cards with 512MB, but this depends on some factors, like the rest of the system, the in-game settings, etc. SLI can be very powerful on the right system, but it's not for everyone. Anyway, I hope this made sense... I am kind of wiped out from traveling all day. :p
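The memory point above can be sketched as a toy model (my own simplification, not an NVIDIA formula): under AFR every card renders complete frames, so the full texture set must live in each card's memory, and effective VRAM is the per-card amount, not the sum:

```python
# Rough model of SLI memory under AFR: textures are mirrored on every
# card, so the usable pool is limited by the smallest card, not the
# total across cards.
def effective_vram_mb(cards_mb, mode="AFR"):
    if mode == "AFR":          # each card holds the whole texture set
        return min(cards_mb)
    raise ValueError("only AFR modelled in this sketch")

print(effective_vram_mb([512, 512]))  # 512, not 1024
```

Which is why two 512MB cards can still hit the same texture-memory wall as one.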
 
Right now I'm running Fallout 3 at maximum detail with 4xAA and 15-sample AF and get 60FPS (vsync), but when I drop to just one card, it's around 40FPS in Fraps and just really choppy. SLI helps even at my resolution (1440x900) :)
 
Why does Vsync seem to mess up some games with SLI? I noticed terrible jitter in movement when vsync was enabled in a few games (L4D, Stalker: Clear Sky, Mirror's Edge).

It also seems to bring in some screen tearing as well. Once vsync is off, movement, especially mouse movement, is silky smooth and the screen tearing goes away (except in Mirror's Edge, which always shows the weird tearing/redraw line).
 
Hrm... I dunno, it always makes the screen way smoother for me. I get noticeable tearing unless I use vsync in all games... What kind of monitor do you have, and what res and refresh rate do you game at?
 
I have an LCD monitor running at 60Hz. I tried two monitors: an older one that does 1680x1050 tops and another that does 1920x1080.

Two games I'm playing a bunch are L4D and Clear Sky. With vsync on, they are terrible to play. The game runs smoothly, but as I move/look around there is a HUGE jump. As soon as I turn vsync off it's fine. It is very weird and hard to explain without seeing it.
 
Sounds like you're in the twilight zone. ;) Seriously, that is strange... it sounds like it's enabled when you disable it. Have you tried leaving the in-game setting off and forcing it on in the drivers? In the game's profile?
 
For anyone else, I saw this on the Steam forums. I haven't tried it yet, but it definitely looks like what my issue was:

"If you play with v-sync on (triple or double buffered), the game stutters in a very irritating fashion when turning the mouse left or right. Fix: set mouse filter (keyboard/mouse options) to DISABLED [confirmed]"

I'll have to look at this with other games as well.
 
Err, do we realise that with Vsync on, at least ordinarily, you have to force triple buffering on with extra software, e.g. D3DOverrider (RivaTuner)? Otherwise your fps will drop to about half in Direct3D games the moment you're below 60fps (or whatever your refresh rate is).

Actually, thinking about it, since I can't live without vsync (I hate tearing), if it's problematic in SLI then I'm not interested.
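The "drops to about half" behaviour described above is the classic double-buffered vsync quantisation: once a frame takes longer than one refresh interval, the GPU stalls for the next vblank, so the rate snaps down in steps. A rough sketch (illustrative numbers, assuming a strict swap-on-vblank model; triple buffering avoids the stall):

```python
import math

# With double buffering and vsync, a frame can only be shown on a
# vblank. If rendering takes longer than one refresh interval, the
# swap waits for the *next* vblank, so fps snaps to 60, 30, 20, 15...
def vsync_fps(raw_fps, refresh_hz=60):
    if raw_fps >= refresh_hz:
        return refresh_hz
    intervals = math.ceil(refresh_hz / raw_fps)  # vblanks per frame
    return refresh_hz / intervals

print(vsync_fps(75))  # 60 -- capped at the refresh rate
print(vsync_fps(55))  # 30 -- one missed vblank halves the rate
print(vsync_fps(25))  # 20
```

So a game rendering at 55 fps displays at 30 fps, which is the cliff the post is talking about.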
 
Right, but triple buffering does not work with SLI enabled... I have seen my fps go to 50 and 45 in LOTRO with SLI and vsync on; it doesn't just plunge to 30 when it goes under 60. I guess it's possible the fps meter is not accurate, but anyway, on my new system I have yet to see anything below 60, other than a disk-access "blip" or something.
 
I'm benching Far Cry 2 at 1920x1200 (native res). With SLI enabled I get a 33% boost over a single card. It doesn't matter if AF/AA is maxed out or not, I still get a 33 percent increase with SLI. Is this a CPU-limited event, or par for the course with SLI?

Thanks.
 
Hmmm... yeah, probably a little... my 8800s always felt a little held back on my 680i C2D system too. Some games I got only 30% increases on, and some I got more like 60%. What level of AA are you running? I hear that makes a huge impact on that game... you would think SLI would shine while using AA, but maybe it doesn't on this one? What driver and OS are you running?
 
I was thinking the same thing, about texture size maybe being a factor in SLI performance. I'll have to test it out tonight. But it didn't matter if AA/AF was maxed out or not; I still got a 33 percent increase with SLI on, though the AA/AF scores were lower than without AA/AF. But my CPU utilization was always maxed. So tonight I'm going to bench with the lower quality settings in the Far Cry 2 benchmark tool and see if the 33% boost increases. I was using the latest beta 182.05 drivers.

Anyhow, it was weird that the boost was an exact 33.3333333333% when I did the math, so I'm wondering what else could be contributing to these scores?

Thanks.
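One way to see why 33.333...% looks "too clean": it's exactly the gain you get from round frame-rate pairs in a 4:3 ratio, e.g. 40 vs 30 fps, which is the kind of number a vsync or CPU cap tends to pin results to. A quick check (illustrative figures, not the poster's actual results):

```python
# Percentage gain of an SLI result over a single-card result.
def sli_gain_pct(sli_fps, single_fps):
    return (sli_fps / single_fps - 1) * 100

# Any 4:3 pair of rates gives exactly 33.33...%, which is suspicious
# when it shows up regardless of quality settings.
print(sli_gain_pct(40, 30))  # 33.33...
print(sli_gain_pct(80, 60))  # same 33.33... at double the rates
```

A gain that stays pinned at one clean ratio across settings usually points at a cap (vsync, CPU, or platform) rather than the GPUs themselves.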
 
Crazy... when you see numbers like that, it does seem like you are hitting some limitation somewhere on the mobo. 768MB should be enough for 1920x1200-sized textures. The 182.05 drivers are good, so yeah, running those tests at various quality settings to see how that affects scaling would be good. If you get the same 33.33%, maybe try forcing the rendering mode to AFR1 or AFR2, and maybe try turning vsync off? I dunno... I would be extremely careful about attempting a chipset driver update, as I have totally fubar'd two OSes updating 680i chipset drivers. The same thing happened to these people:

http://forums.nvidia.com/index.php?showtopic=77884

So I would be extremely careful before attempting that if you are thinking about it.
 
I tried changing to AFR1 and 2, but it was only a percent difference. I thought of the vsync issue and haven't had a chance to test that out; will do. As far as chipset drivers go, I already have the latest installed. Will do more tests tonight and report back. But it's got to be vsync, I'm betting; 33.333333% is too clean a number. I'm also going to install the original FEAR to eliminate the CPU as a limiting factor...


Thanks.
 
Just one other tidbit: when I upgraded my 8800s for 260s on the 680i mobo, I ran 3DMark06 and got the exact same score on the first two tests, but on the HDR Canyon Flight test my new cards blew the old ones out of the water. So I had for sure hit some limit on the mobo, I think, even with the 8800s... but the HDR stuff is so GPU-bound that it was able to run free, I am guessing. So this kind of points at something limiting on the mobo again, unless vsync has capped it off somehow.
 
Guru3D has a very interesting review of the 280s in SLI vs. no SLI against the 9800GTX. Lots of cool Flash-based bar graphs. He compared an X58 to a 680i board, and the differences were grand until you started getting to 1920x1200 and beyond.

Here's the link: http://www.guru3d.com/article/geforce-gtx-280-sli-triple-review-test/

So upgrading to a 280/285-based card would definitely shine, but that'd be limited by what platform you're on. On my 680i, 285 SLI would be a waste of money. My cards still have some more performance to be squeezed out of them, even more so once I go with an X58 board...

Anyhow, will do more bench tests if I don't go to the strip club first. Probably won't, as I'm saving for that X58.

Thanks.
 
I didn't see X58 there... were you thinking of this one?

http://www.guru3d.com/article/core-i7-multigpu-sli-crossfire-game-performance-review/10

(Leaping to the page with the FC2 benches)... yes, it would appear that as you get higher in res, the platform becomes less and less of a factor, so it's more GPU-bound. Which is pretty damn interesting, and would reinforce what we're thinking: the platform is holding the cards back from going gangbusters at the lower reses.

I have my old 680i rig being prepped to take over for my now-very-old 24/7 rig, and I'm tempted to drop my 260s in it just to run some numbers myself. If I do, it won't be for a couple of weeks, but now my curiosity is churning. :)
 
Yeah, that was the one. Very interesting. Upgrading your motherboard and CPU with the same GPUs in SLI could net you double the performance you're getting now. Anyhow, I'm benching right now. So far it seems like I get nearly the same score at 1280x1024 vs. 1920x1200, indicating a big CPU bottleneck. Will publish them here when done...
 
I did some benching and the results are just flat. Not sure what's going on here?

Going to set physics and trees to low setting and see what happens...


farcrybench.jpg
 
This should scare some of you....

This was my first attempt at SLI...
SLISideSmall.jpg


It was late '04...I just found this pic buried in Photobucket! LOL

SWEET!!
 