NVIDIA's SLI Shortchanges Gamers?

^eMpTy^ said:
lower in price...not in core speed, number of pipelines, or memory bandwidth...

...the X800 XL can't get playable framerates in Doom 3 at 1600x1200...even the 6800NU can do that...
I don't think you can deal with the fact Doom 3 sucked and no one plays it anymore. Move on to a good game please.
 
R1ckCa1n said:
I don't think you can deal with the fact Doom 3 sucked and no one plays it anymore. Move on to a good game please.

haha...looks like you're the one having difficulty dealing...

the Doom 3 engine is going to be one of the most licensed engines ever...Quake 4 is already in production and sure to be a huge game...everyone else seems to understand this...I don't know why you're in denial, Rick....
 
Terra said:
Because people like me (turning 30 this year) still remember Doom and Doom II, and I had a BLAST going oldschool again...and as far as I know, Doom 3 didn't promise anything else...

Terra - How many of the ones who bitch over Doom]|[ played 1&2?

QFT...I liked the damn game!
 
And then there's the fact that Carmack makes game engines that are very moddable ;)
Look how far the Quake 3 engine has come ;)

Terra...
 
^eMpTy^ said:
haha...looks like you're the one having difficulty dealing...

the Doom 3 engine is going to be one of the most licensed engines ever...Quake 4 is already in production and sure to be a huge game...everyone else seems to understand this...I don't know why you're in denial, Rick....

I don't know of any games being developed that use the Doom 3 engine (save for Quake 4?), and I can't say I'm excited at all about Quake 4.

Deathmatch is dead.
 
The Doom 3 community is pretty much dead as well; one of the quotes from a modmaker that I remember was "the Doom 3 engine is only good for making games like Doom 3". id's support for modders apparently blows, and the only game I've heard that's gonna use the engine is Quake 4. The engine has a lot of potential obviously, but the netcode seems to suck royally as well, and that's even with its very limited implementation of physics.
 
Well, I have sat back and watched the thread, and I think the biggest gripe I'd have with SLI is the per-game driver profiles. I hope the tech matures; perhaps they can write a unified driver for it. If not, it could be the Achilles' heel of the product. Look at it this way: there are always game/HW/driver incompatibilities. Generally these get worked out (over time), but now we're adding a layer, one that has to be coded per game, and given their history with numerous driver/game issues I don't expect miracles. Also, if this layer fails, we won't see the typical crash; your video processing will just be cut in half.

I'm willing to give it time, but if this is a problem with every new game, I don't see how people will be willing to jump on the bandwagon so quickly. That being said, my next mobo will probably be SLI capable.

Terra, BTW that *is* a feminine name because of its Latin root. Nouns ending in -a/-ae (first declension Latin) are usually feminine. :)
 
It seems that the Unreal engine is currently the most widely used engine in games worldwide.
Moreover, it seems that NVIDIA cards handle the Unreal Engine 3.0 very well.
Of course, when that comes out, a single card won't be able to do 1600x1200 with 4xAA/8xAF etc.,
but an SLI 6800GT setup will certainly be able to play at 1600x1200.
The main good thing about SLI is that a year from now, when the games that need it are SLI capable, and NVIDIA is working hand in hand with game developers to make their games SLI capable (probably for free), it will double your graphics power.
SLI is overkill now, except for one game, that being EverQuest 2.
Other than that, it will only be needed for 1600x1200 4xAA/8xAF six months from now.
My 2 cents
 
Flashback to 1997 with Voodoo and PowerVR (PCX2). If a game didn't offer native support, then you always had to download a patch. Flash forward to now, and we've got the same thing, except now we need patches to play D3D and OGL instead of a proprietary API. :eek:
 
^eMpTy^ said:
haha...looks like you're the one having difficulty dealing...

the Doom 3 engine is going to be one of the most licensed engines ever...Quake 4 is already in production and sure to be a huge game...everyone else seems to understand this...I don't know why you're in denial, Rick....
Quake 4 is sure to be a huge game?

It's the first Quake game NOT developed by id.

It's going to be the Twisted Metal 3/Mortal Kombat 4 of video games: EXTREMELY overhyped, with a complete lack of innovation. If it weren't for the mod community, Quake 3 would have died out within a week. Think about a single Quake 3 match you have ever played without using SOME kind of mod. No such thing.

Quake 4 will suck.

edit: By the way...Quake 3 CTF sucked ass natively. Every single multiplayer gametype in that game had to be modded in. Carmack really took advantage of the mod community with that one, and I don't think they are ready to forgive him.
 
Chris_B said:
The Doom 3 community is pretty much dead as well; one of the quotes from a modmaker that I remember was "the Doom 3 engine is only good for making games like Doom 3". id's support for modders apparently blows, and the only game I've heard that's gonna use the engine is Quake 4. The engine has a lot of potential obviously, but the netcode seems to suck royally as well, and that's even with its very limited implementation of physics.
I don't know shit about making mods, but that quote sounds like a death knell for Doom 3 mods (or future games, for that matter).

If anyone has seen the screen previews of Quake 4...it looks just like Doom 3, so I would say there is probably some truth in that quote.

I wonder how much modmakers and such rely on the netcode? I guess if you are going to use the Doom 3 engine, you don't need the netcode in most cases, right?
 
Leggir said:
Flashback to 1997 with Voodoo and PowerVR (PCX2). If a game didn't offer native support, then you always had to download a patch. Flash forward to now, and we've got the same thing, except now we need patches to play D3D and OGL instead of a proprietary API. :eek:

I don't remember downloading any patches.
 
Impulse said:
You weren't around for the pre-Direct3D era then... Even during D3D's beginnings a lot of games still used Glide (3dfx's proprietary take on OGL), games required patches for 3D hardware support, and D3D itself was a huge mess. Lotsa users spent weeks installing and re-installing various D3D versions to get all their games working in order; not a pretty era at all.

We are talking about SLI support?

I never had to download a driver for SLI (not entirely true, they did have updates to the card driver).

And I am 45 years old; I was around when Glide was king, OpenGL a wannabe, and D3D a "huh? what's that?" ;)

Brent
 
Right you are; I didn't read all the posts above that one and got my wires crossed regarding the discussion, which is why I edited my post above before ya quoted me... 3dfx's SLI came at a time when their line was pretty mature, so it obviously didn't suffer from any of those problems.

That being said, it was also implemented very differently from the current "SLI", and despite all the whining in this thread (and the very technical explanations by some)... if NVIDIA somehow found a way to do SLI exactly like 3dfx did (which wouldn't be very feasible with current GPUs but hold your suspension of disbelief for a bit), most users would be criticizing it for its lack of performance rather than driver issues.
 
^eMpTy^ said:
I'm sure it's true...but is it a big deal? I don't really think so...

It's not to us; it just makes DIY look that much cheaper when compared to places like Alienware.
 
It's not like it's gonna boost the price of those rigs substantially, and they're getting something back from NVIDIA in the way of marketing and free advertising. When someone's buying a system upwards of $4,000, like many of these boutique rigs, $50 just gets rolled into the total cost and doesn't make a difference.
 
Has anyone tried playing around with forcing SLI in 'unsupported' games?

I'm trying to get a boost here in NFS:U2.

I can force SFR by setting the SLI rendering mode to 'multi-GPU rendering' as suggested by the article. However, nothing I do to the nvapps file seems to do anything at all... Quite bizarre...
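
For anyone wanting to poke at the same thing: the profiles live in nvapps.xml in the Windows directory, and the entries look roughly like the sketch below. I'm going from memory here, so treat the property name and value strings as approximations, not the exact syntax:

    <!-- Hypothetical profile entry; the exact Label/Value strings may differ -->
    <PROFILE Label="Need For Speed Underground 2">
        <APPLICATION Label="speed2.exe"/>
        <PROPERTY Label="multichip_rendering_mode" Value="MULTICHIP_SFR"/>
    </PROFILE>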

SFR is dodgy in NFS:U2 - the speed is less than with single GPU rendering mode, even though 'show GPU load balancing' clearly shows SFR is enabled and working as expected...

Anyone else tried any of this?

2x 6800GT
Asus A8N SLI
FX55
 
^^^^

Okay - the solution was simple - just restart :eek:

However, I'm *not* getting any performance increase from using SLI. In fact, there is a roughly 20% performance decrease from using either of the two modes :confused:

No rendering issues though...

What's going on?!
 
^eMpTy^ said:
I'm sure it's true...but is it a big deal? I don't really think so...

Well, that's your opinion. You may think otherwise if you bought a pre-built system and had to pay an extra $50 for some bogus tax. Their Ts&Cs are also borderline wacko. Having to give a PC to NV to check out isn't a big thing for Alienware or the like, but for a smaller builder the whole thing is going to hurt them.
 
Micro$oft has been doing essentially the same thing for many moons now. See where it got them...
 
Impulse said:
Right you are; I didn't read all the posts above that one and got my wires crossed regarding the discussion, which is why I edited my post above before ya quoted me... 3dfx's SLI came at a time when their line was pretty mature, so it obviously didn't suffer from any of those problems.

That being said, it was also implemented very differently from the current "SLI", and despite all the whining in this thread (and the very technical explanations by some)... if NVIDIA somehow found a way to do SLI exactly like 3dfx did (which wouldn't be very feasible with current GPUs but hold your suspension of disbelief for a bit), most users would be criticizing it for its lack of performance rather than driver issues.

1) The Voodoo II line was only their second dedicated 3D chipset when it was released with SLI. The chipset was announced in late 1997, and websites had testing samples in early '98, with Creative Labs selling the first cards in Feb '98. 3dfx did make sure the darn thing worked right before they shipped.

2) No updated drivers were required for Voodoo II SLI; it worked at a hardware level. You can play DOS games with Voodoo II SLI; if the program called for Glide, the hardware would run.

3) The only patches that I recall were the ones that turned non-accelerated games into Glide/OpenGL games, like Quake -> GLQuake.

4) NVIDIA does not own the 3dfx method of SLI; as far as I know it was not part of the buyout.

"which wouldn't be very feasible with current GPUs but hold your suspension of disbelief for a bit"

Please explain this? I find your non-technical statement, with nothing to back it up, dismissing 3dfx SLI tech rather amusing. Once textures are loaded into the video card's local memory, what bus the card is riding on becomes a moot point.

There have been a few sites that have done performance testing with real-world programs between 2x/4x/8x AGP modes, and the performance difference between them has always been somewhat negligible.

As 3dfx predicted years ago, fast local video card memory will always be more important than storing textures in main memory and transferring them across the AGP bus. This is one of the things that killed Intel's i740: they tried to force people to store textures in main memory and move data to the card as needed, with only small amounts of RAM on the card. Intel never guessed that RAM prices would drop out of the sky, making video cards with 16MB commonplace. A consumer video card with 256MB of RAM back in 1999 would have been thought of as the wildest fantasy, and now it's commonplace.
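
To put some very rough numbers on it (ballpark theoretical peaks, not benchmark results), a quick back-of-the-envelope in Python:

    # Ballpark peak bandwidth comparison - theoretical figures, not measurements.
    agp_8x_gb_s = 2.1      # AGP 8x theoretical peak, ~2.1 GB/s
    local_mem_gb_s = 35.2  # 6800 Ultra-class GDDR3 (256-bit @ 1100MHz effective)

    print(f"Local memory is roughly {local_mem_gb_s / agp_8x_gb_s:.0f}x faster than AGP 8x")
    # -> Local memory is roughly 17x faster than AGP 8x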
 
Arseface said:
However, I'm *not* getting any performance increase from using SLI. In fact, there is a roughly 20% performance decrease from using either of the two modes :confused:

Disable motion blur if you have it turned on and report back what happens.
 
gdonovan said:
1) The Voodoo II line was only their second dedicated 3D chipset when it was released with SLI. The chipset was announced in late 1997, and websites had testing samples in early '98, with Creative Labs selling the first cards in Feb '98. 3dfx did make sure the darn thing worked right before they shipped.

2) No updated drivers were required for Voodoo II SLI; it worked at a hardware level. You can play DOS games with Voodoo II SLI; if the program called for Glide, the hardware would run.

3) The only patches that I recall were the ones that turned non-accelerated games into Glide/OpenGL games, like Quake -> GLQuake.

4) NVIDIA does not own the 3dfx method of SLI; as far as I know it was not part of the buyout.

"which wouldn't be very feasible with current GPUs but hold your suspension of disbelief for a bit"

Please explain this? I find your non-technical statement, with nothing to back it up, dismissing 3dfx SLI tech rather amusing. Once textures are loaded into the video card's local memory, what bus the card is riding on becomes a moot point.

There have been a few sites that have done performance testing with real-world programs between 2x/4x/8x AGP modes, and the performance difference between them has always been somewhat negligible.

As 3dfx predicted years ago, fast local video card memory will always be more important than storing textures in main memory and transferring them across the AGP bus. This is one of the things that killed Intel's i740: they tried to force people to store textures in main memory and move data to the card as needed, with only small amounts of RAM on the card. Intel never guessed that RAM prices would drop out of the sky, making video cards with 16MB commonplace. A consumer video card with 256MB of RAM back in 1999 would have been thought of as the wildest fantasy, and now it's commonplace.

Ah, finally someone who has some brains.
 
gdonovan said:
Please explain this? I find your non-technical statement, with nothing to back it up, dismissing 3dfx SLI tech rather amusing. Once textures are loaded into the video card's local memory, what bus the card is riding on becomes a moot point.

The problem with 3dfx's technology is that it involved assigning alternate scanlines to each video card.

That simply wouldn't work with today's technology - the GPU does much more than it did back in the times of 3dfx.

An example - take AA. The colour of a pixel at the edge of a polygon is determined partly by the colours of the adjacent pixels, so you need to reference pixels below the one being rendered, as well as to the left and right of it. With 3dfx SLI you would need to reference pixels which are not being rendered by the same video card. With NVIDIA's SLI, even in SFR, there is only one line of pixels at the interface between the cards, and there it doesn't 'cost' much for both cards to render the extra line (a one-line redundancy) to avoid these problems.
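
To make that concrete, here's a toy sketch (hypothetical Python, nothing like how the hardware or driver actually works) counting how many rows have a vertical neighbour owned by the other card under each scheme:

    # Toy model: which card "owns" each row under the two splitting schemes.
    HEIGHT = 8

    def owner_scanline_interleave(row):
        # 3dfx-style SLI: the cards alternate scanlines
        return row % 2

    def owner_split_frame(row, split=HEIGHT // 2):
        # NVIDIA-style SFR: card 0 takes the top half, card 1 the bottom
        return 0 if row < split else 1

    def rows_needing_cross_card_reads(owner):
        """Count rows whose row above or below belongs to the other card."""
        count = 0
        for row in range(HEIGHT):
            for neighbour in (row - 1, row + 1):
                if 0 <= neighbour < HEIGHT and owner(neighbour) != owner(row):
                    count += 1
                    break
        return count

    print(rows_needing_cross_card_reads(owner_scanline_interleave))  # -> 8 (every row)
    print(rows_needing_cross_card_reads(owner_split_frame))          # -> 2 (just the seam)

With interleaving, every row of a filter kernel straddles the card boundary; with SFR only the seam does, which is why the one-line redundancy is cheap.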
 
Spank said:
Disable motion blur if you have it turned on and report back what happens.


Okay - I disabled motion blur. Didn't make a jot of difference, unfortunately (didn't even improve framerates at all).

Anyway, here is what I'm talking about. All shots were taken at 1600x1200 with 2xAA, at the same place (start of 'free run'):


This is single GPU mode (sorry about the crappy JPEG compression - hope you can still read my text; it's written out below)

[screenshot: singlegpu3lx.jpg]

"runs at the same speed whether single gpu mode is forced in the nvapps config file, or the game is just left to run as default"



This is Alternate Frame Rendering:

[screenshot: afr7wu.jpg]

"No noticable artifacts with or without motion blur - however performance is lower than with single GPU mode. Note that the load indicator is at zero. It remains here throughout the game, except at seemingly random places where it will jump up to about 1/4 load, very briefly."


This is Split Frame Rendering:

[screenshot: sfr9bl.jpg]

"Appears to behave 'properly' (although this screenshot obviously only shows output from GPU#1), with the load balancing bar moving around appropriately. However, performance is lower than both AFR and single GPU mode"


Quite irritating... I can't understand why AFR, if it has no artifacts, would be *slower* than single GPU mode... it doesn't make much sense to me. What is clear is that it isn't behaving properly, as the load bar simply doesn't increase.

Anyone got any ideas?!
 
Arseface said:
Quite irritating... I can't understand why AFR, if it has no artifacts, would be *slower* than single GPU mode... it doesn't make much sense to me. What is clear is that it isn't behaving properly, as the load bar simply doesn't increase.

Anyone got any ideas?!


What happens if you increase the AA and such? I didn't think NFSU2 was a CPU-intensive game, but that may be the reason.
Then again, I could be completely wrong, since WoW on SLI caps at 100fps.
 
MeanieMan said:
What happens if you increase the AA and such? I didn't think NFSU2 was a CPU-intensive game, but that may be the reason.
Then again, I could be completely wrong, since WoW on SLI caps at 100fps.

Nope - it's certainly not CPU limited.

I'm running on a 3.0GHz FX55, and increasing AA settings or altering the resolution gives the behaviour you would expect for a GPU-limited game (at 1280 and 1600... it seems to be CPU limited at 1024, though).

Also - the scaling acts in just the same way with SLI as with a single GPU. SLI AFR is always about 20% slower than single GPU mode, whatever the settings (at 1280 or 1600).
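
For anyone following along, that's the test I'm applying: if the framerate barely moves as resolution changes, you're CPU limited; if it scales with pixel count, you're GPU limited. A quick sketch with made-up numbers (not readings from my rig):

    # Crude bottleneck check: does fps track resolution, or stay flat?
    # The fps values here are invented purely for illustration.
    fps_by_res = {(1024, 768): 92, (1280, 960): 70, (1600, 1200): 45}

    by_pixels = sorted(fps_by_res, key=lambda r: r[0] * r[1])
    low_res_fps = fps_by_res[by_pixels[0]]
    high_res_fps = fps_by_res[by_pixels[-1]]

    if low_res_fps / high_res_fps < 1.2:
        print("fps barely changes with resolution -> likely CPU limited")
    else:
        print("fps scales with resolution -> likely GPU limited")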
 
Arseface said:
The problem with 3dfx's technology is that it involved assigning alternate scanlines to each video card.

That simply wouldn't work with today's technology - the GPU does much more than it did back in the times of 3dfx.

An example - take AA. The colour of a pixel at the edge of a polygon is determined partly by the colours of the adjacent pixels, so you need to reference pixels below the one being rendered, as well as to the left and right of it. With 3dfx SLI you would need to reference pixels which are not being rendered by the same video card. With NVIDIA's SLI, even in SFR, there is only one line of pixels at the interface between the cards, and there it doesn't 'cost' much for both cards to render the extra line (a one-line redundancy) to avoid these problems.

Hate to burst your bubble, but Quantum3D's "Mercury Subsystem" was used to "emulate" the 3dfx Voodoo 5 5500s before they were made, and it also happened to support AA.
 
STEvil said:
Hate to burst your bubble, but Quantum3D's "Mercury Subsystem" was used to "emulate" the 3dfx Voodoo 5 5500s before they were made, and it also happened to support AA.

It's not that it isn't possible - it's just that it would require much greater communication between the two cards. That, obviously, would introduce much more latency and decrease efficiency. You want as little talking between the cards as possible; the most efficient way to do that is to let each card do as much as possible on its own.

Also - IIRC the AA on the V5 5500 was full-scene supersampling. Multisampling would require more communication between the cards.
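
Rough intuition for the difference, as another toy sketch (hypothetical Python with made-up sample values): a supersampling resolve just averages subsamples the same card already rendered, so the combine step is purely local:

    # Toy 1D example: a 2x supersampling resolve is local to each pixel.
    # Each output pixel averages subsamples the SAME card produced, so no
    # cross-card traffic is needed at resolve time.
    subsamples = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 0.9, 0.7]  # two subsamples per pixel

    def resolve_2x(samples):
        """Average each adjacent pair of subsamples into one output pixel."""
        return [(samples[i] + samples[i + 1]) / 2 for i in range(0, len(samples), 2)]

    print(resolve_2x(subsamples))  # -> [0.1, 0.5, 0.9, 0.8]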
 
With the bridge they are using, I don't think communication is such a large deal, unless they were dumb when they designed it and the cards have crap for bandwidth on it.
 
Just a side note: SLI seems to be working in EQ2 in AFR with the 71.21 drivers. Note that I have the bloom filter off, though, and am running with AA + AF enabled. 1600x1200 with 4x FSAA is quite a sight to behold in that game, and it runs fairly well on top of that.

EQ2 appears to be largely CPU limited, though.

SFR causes some weird artifacts with that driver set, however.

I just managed to put my SLI rig together and am running it through its paces now. Doom 3 shows a nice healthy gain as expected, and HL2 performance is now through the roof.

There seems to be a serious issue with UT2004 right now though (at least on my rig); going to have to experiment a bit to try and get that to work properly.
 