Why SLI sucks and is a waste of time:

dderidex

An interesting read on SLI.

Highlights:

Tomshardware said:
After several benchmark tests we noticed some relatively slow performance in two of our games. After turning on SLI HUD in the driver we saw that SLI was not active during these games. Even our efforts to force SLI operation through the expanded driver settings had no effect. After talking with NVIDIA we learned the cause: SLI is not available with some games. NVIDIA has so-called SLI Profiles for games that are defined in the driver. The driver recognizes the game via application detection and executes the SLI mode (split or balancing) designated for that profile. If no SLI profile exists for a game, there is no SLI rendering. It is not possible to force SLI mode or generate your own profile. According to NVIDIA however the driver already contains over 50 profiles for games running with SLI. For newer titles this therefore means that SLI system owners have to wait for a new driver. But even then there is no guarantee that SLI will be possible with a particular game.

According to NVIDIA there are games that are simply not compatible with SLI. Microsoft's Flight Simulator 9 and Novalogic's Joint Operations for example both cause problems. As of the test date we were unable to find out the precise reasons why. NVIDIA only talks about frame buffering techniques used in games of this sort that are problematic for SLI. Of the 10 games we included in this test, two of them were non-SLI-compatible.

That's bad news, IMHO.

SLI requires nVidia to code a specific profile for each game in order for it to work with it.
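Just to spell out what Tom's is describing - this is a totally made-up sketch on my part, not nVidia's actual driver code, and the game names are only examples, but the behavior they describe boils down to roughly this:

# Hypothetical sketch of the profile mechanism Tom's describes -- not nVidia's real code,
# and the game names here are just illustrative.
SLI_PROFILES = {
    "doom3.exe":  "AFR",   # alternate frame rendering
    "hl2.exe":    "SFR",   # split frame rendering
    "farcry.exe": "AFR",
    # ...nVidia says the driver ships with 50+ entries like these
}

def pick_render_mode(game_exe: str) -> str:
    # "Application detection": the driver matches the running executable against
    # its built-in list. You can't force SLI or add your own entry, so anything
    # not on the list falls back to a single GPU (with the SLI overhead still there).
    return SLI_PROFILES.get(game_exe.lower(), "SINGLE_GPU")

print(pick_render_mode("Doom3.exe"))            # AFR
print(pick_render_mode("PacificFighters.exe"))  # SINGLE_GPU - no profile, no SLI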

What do you think the odds are of nVidia bothering to SLI up older games like Grim Fandango or Monkey Island 4 or X-Wing Alliance?

Or what about modern graphics-intensive games that aren't mainstream. "Pacific Fighters", "Lock On: Modern Air Combat", "Dangerous Waters", etc?

Forgive me if I don't want to be bound to a manufacturer's proprietary standard to determine if a game I want to play will even run or not. (Alright, so I know it will run in non-SLI mode if it can't enable it....but, as Tom's Hardware noted...when run in non-SLI mode, you only have one card doing the work, but the SLI overhead is still there. So, in the end, it's *slower* than that one card would have been on its own. For a single 6800 Ultra, maybe 'a little slower than on its own' is fine - but what about users who want to SLI GeForce 6600GT cards? Or use that Gigabyte 2-core card? If the SLI profile for that game doesn't exist....suddenly performance in the game is going to SUCK!)

Amazingly, reading their article, Tom's Hardware noted that of the 10 games they tested, 2 didn't support SLI. And they were testing mainstream - MAINSTREAM! - games! If we can count on 20% of *mainstream* games not working with SLI....I shudder to think about backwards compatibility with it!

Seriously, if you think the case is misstated, take a look at the .inf file for the stereo drivers! 1200 game profiles. 1200, and, yet...the Sims and the Sims 2 are missing! X-Wing Alliance? Not there! The *demo* of it is, but, strangely, the driver released 3 years after the game only has the demo supported, but not the full game. X-Wing vs Tie Fighter? Nope. Longbow 2? Naddah. "Lock On" and the first "IL2" are in....the second IL2 (and third - "Pacific Fighters") are not.
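(If you want to eyeball the list yourself, a few lines of Python will do it - though the file name here is just a placeholder and the pattern is a guess at the format, so adjust both to whatever the stereo driver package actually ships:)

# Rough, hedged sketch for checking the profile list yourself. "nvstereo.inf" is a
# placeholder name and the regex is a guess at the format -- adjust both as needed.
import re

with open("nvstereo.inf", errors="ignore") as f:
    text = f.read()

titles = re.findall(r'"([^"]+)"', text)   # grab anything that looks like a quoted title
print(len(titles), "profile entries found")

for game in ("X-Wing Alliance", "The Sims 2", "Pacific Fighters", "Longbow 2"):
    hit = any(game.lower() in t.lower() for t in titles)
    print(game, "-", "present" if hit else "missing")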

The list of omissions goes on and on....and that's a driver that's been out and being updated for years (literally, at least 5 now). How much less support can we expect for something that has only just been released and is more complicated to configure, anyway?
 
dderidex said:
That's bad news, IMHO.

SLI requires nVidia to code a specific profile for each game in order for it to work with it.

What do you think the odds are of nVidia bothering to SLI up older games like Grim Fandango or Monkey Island 4 or X-Wing Alliance?

While I can see the problems that this will incur, I really don't see somebody buying a close to $800-$1000 video setup alone just to play a game that came out like 10 years ago :D

Even still, what, so now you're playing X-Wing Alliance at 200fps instead of 380fps? I think the user of that setup will somehow learn to live with only a measly 200fps
 
DemonDiablo said:
While I can see the problems that this will incur, I really don't see somebody buying a close to $800-$1000 video setup alone just to play a game that came out like 10 years ago :D

Even still, what, so now you're playing X-Wing Alliance at 200fps instead of 380fps? I think the user of that setup will somehow learn to live with only a measly 200fps

Forget the 6800 Ultra SLI setup, most users aren't going to do that, anyway.

The point is that SLI is being billed as an affordable "upgrade" solution. Buy one 6600GT now, buy another this summer for double the performance!

Or, worse, what about Gigabyte's card that uses SLI natively! Users buying that one don't get a choice.

With a 6600GT, you aren't getting '200 fps or 400 fps' in some of these games. With "IL2: Sturmovik Forgotten Battles" a single 6600GT at 1280x960 with 4xAA is getting 19.3 frames per second.

If SLI works with that game (doesn't yet) and doubles that....now you've got playable settings at 1280x960 with 4xAA. Awesome! If SLI *doesn't* have a profile for that game....now, you are stuck with less than 18fps!

See the problem? It's not *just* older games (and IL2 is starting to count as one), but less-than-mainstream modern ones, too. What do you bet "Pacific Fighters" never *gets* a profile?
 
It will also require Nvidia to release drivers more than three or four times a year...


I'm talking official drivers off their website.
 
What do you think the odds are of nVidia bothering to SLI up older games like Grim Fandango or Monkey Island 4 or X-Wing Alliance?
Little to none.. and frankly, they don't NEED to... Let's be perfectly blunt here..

If you are running an A8N-SLI you have got to be running, at LEAST, an A64 3000+.

Since games like those were written SO long ago.. The power of a single card (6600GT) + an A64 3000+ is going to FAR outpace what the game needs.. so.. the point of writing SLI profiles for something like that is just silly.

Plain english..
Old games don't need today's SLI.

Or what about modern graphics-intensive games that aren't mainstream. "Pacific Fighters", "Lock On: Modern Air Combat", "Dangerous Waters", etc?
My prediction is that game AUTHORS will be creating profiles and submitting them to nvidia for inclusion. Additionally, even though nvidia claims "making your own profile is not possible" I give it a month before someone's done exactly that.


Forgive me if I don't want to be bound to a manufacturer's proprietary standard to determine if a game I want to play will even run or not. (Alright, so I know it will run in non-SLI mode if it can't enable it....but, as Tom's Hardware noted...when run in non-SLI mode, you only have one card doing the work, but the SLI overhead is still there. So, in the end, it's *slower* than that one card would have been on its own. For a single 6800 Ultra, maybe 'a little slower than on its own' is fine - but what about users who want to SLI GeForce 6600GT cards? Or use that Gigabyte 2-core card? If the SLI profile for that game doesn't exist....suddenly performance in the game is going to SUCK!)
As I said.. -- It's not gonna suck.. Honestly, nothing out today needs SLI.. 1 card is more than enough for EVERYTHING. HalfLife2, Doom3, FarCry, and all these games that were so "intense" ran FINE on my lowly 9800NP.. The 6600s, x700s, 6800s, x800s.. are all quantum leaps over my card.. so WHAT if they don't run sli.
The 2 core card is still an SLI card, btw...

Amazingly, reading their article, Tom's Hardware noted that of the 10 games they tested, 2 didn't support SLI. And they were testing mainstream - MAINSTREAM! - games! If we can count on 20% of *mainstream* games not working with SLI....I shudder to think about backwards compatibility with it!

Seriously, if you think the case is misstated, take a look at the .inf file for the stereo drivers! 1200 game profiles. 1200, and, yet...the Sims and the Sims 2 are missing! X-Wing Alliance? Not there! The *demo* of it is, but, strangely, the driver released 3 years after the game only has the demo supported, but not the full game. X-Wing vs Tie Fighter? Nope. Longbow 2? Naddah. "Lock On" and the first "IL2" are in....the second IL2 (and third - "Pacific Fighters") are not.

The list of omissions goes on and on....and that's a driver that's been out and being updated for years (literally, at least 5 now). How much less support can we expect for something that has only just been released and is more complicated to configure, anyway?


As I said.. Xwing is ancient.. it doesn't NEED SLI....And the Sims? Sims 2? Does that even need a P4 or a 32MB video card? Jesus.. I have someone playing Sims 2 with a GF2 MX 64MB on an XP 1700+.

From my Sims 2 box:
Requirements: 800MHz (not even 1GHz) and a T&L-capable, 32MB video card


/edited for blonde moment.

How the hell did I start talking about P4's with this point?
 
Laforge said:
Little to none.. and frankly, they don't NEED to... Let's be perfectly blunt here..

If you are running an A8N-SLI you have got to be running, at LEAST, a Celeron D 2.53GHz (and I doubt many people running an SLI setup would be having such a crappy CPU.. so.. let's guess the "typical" user is going to have, say, a 2.8E LGA775).

Since games like those were written SO long ago.. The power of a single card (6600GT) + a P4 2.8E with 1MB cache is going to FAR outpace what the game needs.. so.. the point of writing SLI profiles for something like that is just silly.

Plain english..
Old games don't need today's SLI.


My prediction is that game AUTHORS will be creating profiles and submitting them to nvidia for inclusion. Additionally, even though nvidia claims "making your own profile is not possible" I give it a month before someone's done exactly that.


As I said.. -- It's not gonna suck.. Honestly, nothing out today needs SLI.. 1 card is more than enough for EVERYTHING. HalfLife2, Doom3, FarCry, and all these games that were so "intense" ran FINE on my lowly 9800NP.. The 6600s, x700s, 6800s, x800s.. are all quantum leaps over my card.. so WHAT if they don't run sli.
The 2 core card is still an SLI card, btw...




As I said.. Xwing is ancient.. it doesn't NEED SLI....And the Sims? Sims 2? Does that even need a P4 or a 32MB video card? Jesus.. I have someone playing Sims 2 with a GF2 MX 64MB on an XP 1700+.

From my Sims 2 box:
Requirements: 800MHz (not even 1GHz) and a T&L-capable, 32MB video card

Umm I agree that SLi users aren't typically going to care about older games they'd have trouble with on Windows XP anyway. But umm.... a Celeron D 2.53 won't work in a socket 939 A8N-SLi that was built for A64 chips. :confused:

Beyond that, SLi owners know that cutting edge technology has its problems. Eventually these issues will get resolved.
 
And so what happens when you have your brand-spanking new Gigabyte 3D1 (with its better-than-6800-Ultra performance) and are getting tired of pwning up Doom3 and decide to pick up "Stalker" on the day it's released?

Oops, no game profile, performance is suddenly slower than a GeForce 6600 non-GT.

I'm sure when you are running your $500 card at performance levels that a $150 card could provide you'll be happy with nVidia's SLI implementation, too.
 
that's a big assumption there though, thinking that nvidia will wait till a game is released before they start making an SLI profile for it. That wouldn't be very smart of nvidia, especially for a popular game.

remember that nvidia and ati work with companies to get good performance on their cards during the development process.
 
doh-nut said:
that's a big assumption there though, thinking that nvidia will wait till a game is released before they start making an SLI profile for it. That wouldn't be very smart of nvidia, especially for a popular game.

remember that nvidia and ati work with companies to get good performance on their cards during the development process.
I just pulled that game out of the air, though.

Or do you really think nVidia is going to work with EVERY game publisher out there on EVERY game currently in development, and make sure to have a new WHQL driver (hell, even a 'leaked beta') on their site before EVERY SINGLE GAME LAUNCH?

How many massively graphics intensive games are coming out this year? Dozens? You really think nVidia will hit all of them?

And, again, it's a pretty huge difference in performance we are talking about! Two SLI'd 6600GTs pull better performance than a single 6800 Ultra by quite some margin, while an SLI'd 6600GT pair running in "noncompatible" mode sits halfway between a regular 6600 and a 6600GT. That's an ENORMOUS gap in performance to rely on nVidia hitting every single game every gamer could want to play.
 
Title compatibility shouldn't even be an issue. Christ, even the Voodoo in SLI ran Doom3, a game that came out at least 8 years after the Voodoo cards that had SLI capability.
 
The voodoo comparison is meaningless since that SLI and this SLI are nothing alike other than the acronym, which is a marketing maneuver. On the other hand, it's absurd to fret over nVidia coding for SLI. It would be suicidal from their standpoint not to ensure that the best and most popular games work with SLI. If all you care about is a handful of obscure titles that also happen to be very hardware-demanding (that combo rarely happens) then yeah, maybe you're screwed, but then again you're in the extreme minority.
 
dderidex said:
An interesting read on SLI.
What do you think the odds are of nVidia bothering to SLI up older games like Grim Fandango or Monkey Island 4 or X-Wing Alliance?

Why in the hell would you need to use SLI on Grim Fandango? That thing runs as smooth as butter on my 400mhz Celeron laptop.
 
inotocracy said:
Why in the hell would you need to use SLI on Grim Fandango? That thing runs as smooth as butter on my 400mhz Celeron laptop.

"Need" doesn't matter. $1000 in video cards should give you the best performance possible with any game right now. They never mentioned anything about only having compatability with select titles when they were hyping the technology, so that's probably why we're all surprised.
 
robberbaron said:
"Need" doesn't matter. $1000 in video cards should give you the best performance possible with any game right now. They never mentioned anything about only having compatability with select titles when they were hyping the technology, so that's probably why we're all surprised.

I feel the same way... but Grim Fandango? Come on :rolleyes:
On a side note, I'm sure everyone thought it would be transparent to the application - an additional boost in everything rendered in OpenGL and D3D. It's too bad it's not, though; let's hope nVidia can fix this later on down the road.
 
it's new tech. of course there's issues. right now it's pretty much "look how much i can blow on hardware" lol i used blow and hardware in a sentence. so proud of myself. anyway that's all it is. this transitional period of new tech is confusing as hell!
 
dderidex said:
Forget the 6800 Ultra SLI setup, most users aren't going to do that, anyway.

The point is that SLI is being billed as an affordable "upgrade" solution. Buy one 6600GT now, buy another this summer for double the performance!

Or, worse, what about Gigabyte's card that uses SLI natively! Users buying that one don't get a choice.

With a 6600GT, you aren't getting '200 fps or 400 fps' in some of these games. With "IL2: Sturmovik Forgotten Battles" a single 6600GT at 1280x960 with 4xAA is getting 19.3 frames per second.

If SLI works with that game (doesn't yet) and doubles that....now you've got playable settings at 1280x960 with 4xAA. Awesome! If SLI *doesn't* have a profile for that game....now, you are stuck with less than 18fps!

See the problem? It's not *just* older games (and IL2 is starting to count as one), but less-than-mainstream modern ones, too. What do you bet "Pacific Fighters" never *gets* a profile?

After all this typing, you don't think Nvidia or Omega will be adding new games to that SLI list?
 
you're right...how dare nvidia go out of their way to give a performance boost in only the top games...(where it's needed)...what a bunch of assholes...

:rolleyes:
 
but let's be realistic, SLI is another gimmick to squeeze more money out of people who want to have "the best" when it's really unnecessary and excessive.
 
LOL... I'll only go SLI if NVidia can promise me 2,000 FPS in Grim Fandango. I won't settle for 1,000FPS!!! :mad:
 
^eMpTy^ said:
you're right...how dare nvidia go out of their way to give a performance boost in only the top games...(where it's needed)...what a bunch of assholes...

:rolleyes:

My biggest complaint is that all the evidence says nVidia is treating SLI as the upgrade path for all their current cores as part of their card cycle - and it really shouldn't be.

Is it a powerful gimmick that enables power-hungry top-end mainstream games to perform their best? Seems like, yeah.

But here are some more thoughts for you....what if SLI doesn't work out for nVidia in the big picture? Doesn't sell well enough, widespread motherboard compatibility problems tank it in the market, etc - something happens that makes the GeForce 6-series the last of nVidia's SLI parts.

Do you really think nVidia would keep the driver team working on SLI driver updates for the next year? Next two years? Next three years? When you spend $1200 on hardware, how long do you expect it to last? Cross your fingers SLI is a market success! If the driver team ends up stopping work on SLI this year, games coming out in 2006 and 2007 *aren't* going to work in SLI on these cards!

Really, nVidia should treat this more like Voodoo2 SLI was treated - indeed, that seems only logical given that Voodoo2 SLI had a larger compatibility base than this does! Voodoo2 SLI was only ever a gimmick - something the absurdly wealthy could afford to blow money on to get the best performance, but NEVER considered a realistic option for the mainstream gamer.

nVidia's apparent approach of pushing SLI as a mainstream option is a mistake. Unless they *drastically* change how they handle their implementation, this is a BIG step back for nVidia in the "compatibility" department - and that has been pretty much the only thing nVidia has REALLY had as an advantage over ATI for some time now. To throw that away....

The article at the Inq, especially, if it's to be believed, is worrying.
 
personally i think this is a flaw...this is big tech now but in a year or so it will all be old news...you will see. i say this all the time but it's like the first gen hard drive mp3 players...ancient now...don't jump on the bandwagon yet, wait for them to be like "ohh crap, this is an even better idea". the idea itself is excellent, they just went about it all wrong - it should run dual no matter what...end of story.
 
While Grim Fandango might not be the best example :) , having owned a Voodoo2 SLI setup back in the day, when it just worked with everything, if I were even thinking about an upgrade to a new SLI setup from Nvidia, it had better run everything, and not require drivers for each game to do it. Seems like 2 steps forward, 1 step back - the new SLI is faster, and it does 2D, buuut, it doesn't work with all games.
I mean, come on.
 
what would happen as time goes on and more games come out and MAYBE compatibility is added for some older games...won't the drivers become huge and bloated with all the profiles to play each game, or would they have to do something like "loadable profiles"? lame.
 
seasponge said:
what would happen as time goes on and more games come out and MAYBE compatibility is added for some older games...won't the drivers become huge and bloated with all the profiles to play each game, or would they have to do something like "loadable profiles"? lame.

Maybe they could release an SDK for people to create profiles for any old game that they want to make.
 
Sir-Fragalot said:
Umm I agree that SLi users aren't typically going to care about older games they'd have trouble with on Windows XP anyway. But umm.... a Celeron D 2.53 won't work in a socket 939 A8N-SLi that was built for A64 chips. :confused:

Beyond that, SLi owners know that cutting edge technology has its problems. Eventually these issues will get resolved.

Brain fart.. :)

For some reason I was looking at a lga775 motherboard and got confused.. /blond mode=off


I know I meant to say they'd be running at least an a64 3000+

gonna edit original post to clear up confusion.
 
I don't know about you guys, but I read up on the tech before I purchased my hardware, and realized that it relies on driver implementation for each game to properly utilize SLI. I have no worries whatsoever, though, because it would be completely retarded for them to not release profiles for new games coming out. And, not that I know much about what goes into making them, but it can't be very hard to write up an algorithm for determining the proper way to SLI a game.
 
robberbaron said:
Maybe they could release an SDK for people to create profiles for any old game that they want to make.

That's completely pointless though, because old games would run flawlessly on a new card, even if it's a single card with the overhead of SLI added on.
 
dderidex said:
I just pulled that game out of the air, though.

Or do you really think nVidia is going to work with EVERY game publisher out there on EVERY game currently in development, and make sure to have a new WHQL driver (hell, even a 'leaked beta') on their site before EVERY SINGLE GAME LAUNCH?

No..

I think the game authors aren't going to want their games to run like crap, so they'll be banging down ati and nvidia's doors to get plenty of insight on how to optimize for their hardware. Any game designer that does NOT have a close working relationship with nvidia, ati, intel, amd, and microsoft is doomed to fail (and EA wouldn't hurt, if you ever want the game to actually sell, that is...)
 
Dr. X said:
That's completely pointless though, because old games would run flawlessly on a new card, even if it's a single card with the overhead of SLI added on.

It would be addressing the compatibility issue though.
 
MAngelo said:
While Grim Fandango might not be the best example :) , having owned a Voodoo2 SLI setup back in the day, when it just worked w everything,

My dual voodoo2 setup didn't work with ZORK.. or kingdom of kroz... or Commander Keen 1.. or Leisure Suit Larry... Or Kings Quest.. or Bubble Bobble! Dammit! I wanted 60 FPS in Bubble bobble!


Don't you see how silly your statement is when I say something like that?
 
seasponge said:
what would happen as time goes on and more games come out and MAYBE compatibility is added for some older games...won't the drivers become huge and bloated with all the profiles to play each game, or would they have to do something like "loadable profiles"? lame.

I think you need to read up on what the profile is actually doing. From what I read, all it determines is whether or not to use AFR or SFR (which is alternate frame rendering or split frame rendering) and, in the split frame rendering case, the algorithm for determining where to split the frame would be the same between all games. So, no, the drivers wouldn't become bloated at all.
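To put that another way, the per-game part is basically just a flag, and the load balancing for split-frame mode is generic driver logic that applies to every game - something roughly like this (my own sketch, obviously, not the real driver):

# Hedged sketch of the point above, not actual driver code: a "profile" is little more
# than a mode flag, and the SFR load balancing is the same generic logic for every game.
from dataclasses import dataclass

@dataclass
class SliProfile:
    exe_name: str
    mode: str  # "AFR" or "SFR" is roughly all a profile needs to store

def sfr_split_row(gpu0_ms: float, gpu1_ms: float, frame_height: int) -> int:
    # Generic split-frame balancing: give more of the next frame to whichever GPU
    # finished its slice faster last frame. Nothing game-specific in here.
    total = gpu0_ms + gpu1_ms
    if total == 0:
        return frame_height // 2
    return int(frame_height * (gpu1_ms / total))

# Even a thousand entries like this would add kilobytes to the driver, not megabytes.
profiles = [SliProfile("doom3.exe", "AFR"), SliProfile("hl2.exe", "SFR")]
print(sfr_split_row(9.0, 11.0, 1024))  # GPU0 was faster last frame, so it gets the larger slice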
 
robberbaron said:
It would be addressing the compatibility issue though.

Compatibility in older or ancient games is not an "issue" at all, so no, there's nothing to be addressed.
 
I don't think it's garbage, or a waste of time. But with the lack of current games working in SLI, and the various problems it's having, I'm glad I took the wait-and-see attitude.

SLI isn't for everyone; you really have to research and make sure the games you play the most are SLI'able, if that's a word. If it's not, it is now!

I had the itch to SLI like most people, but I waited instead. With ATi's next card taped out and up and running, I'll wait to see how fast it is, and whether NV can fix the problems and get some drivers that support more games in SLI.
 
Dr. X said:
Compatibility in older or ancient games is not an "issue" at all, so no, there's nothing to be addressed.

Then why are people crying about it? There seems to be something that they want, and I figured that this would give them what they want. I know that 700FPS is enough for quake 2, but some people might want the 1200FPS.
 
reminds me of someone asking a billionaire if they would be ok with being just a half billionaire...
 
seasponge said:
reminds me of someone asking a billionaire if they would be ok with being just a half billionaire...
well what if a billionaire was only a billionaire in certain situations ;)
 
I think it may end up like Voodoo2 SLI where, 6 months to a year after it's released, along comes a single card (at the time it was the TNT) that beats the 2-card setup with a cheaper price tag!

Another scenario may be that this is Nvidia's new strategy. Get people to buy a 2nd card for faster performance instead of having to release a new card every 6 months.

Say the 6800 Ultra is comparable to the X800 Pro right now. While Ati madly pushes away with the X800PE, X850, etc. to take the performance crown, by the time Ati doubles the performance of the 6800 Ultra... Nvidia can just say.. oh, just buy another Ultra, and now they are neck and neck again (and cheaper by then). Beat Ati by the numbers, sweet!
 