9800 GX2 Quad-SLI review

Hm, who knows, maybe its power consumption while playing Crysis will kill the PC!

Haha
 
The final driver release in the coming weeks will probably blow this one out of the water. In some of the games there was no difference whatsoever. I'd still like to see a Tri-SLI 8800 Ultra OC vs. 9800 GX2 Quad-SLI vs. 3870 X2 Quad CrossFire review, though.
 
The final driver release in the coming weeks will probably blow this one out of the water. In some of the games there was no difference whatsoever. I'd still like to see a Tri-SLI 8800 Ultra OC vs. 9800 GX2 Quad-SLI vs. 3870 X2 Quad CrossFire review, though.


Actually...the UT3 benchmarks reminded me of the first 7950 GX2 benchmarks, when Quad-SLI was slower in some cases than 7900 GTX SLI or a single 7950 GX2.
 
I had a 7950GX2 QUAD-SLI setup and it sucked. I know that SLI is better now, and those Crysis numbers do look good! I'd like to see the CPU running at something a little more normal, closer to 3GHz.
 
Impressive, but at $1200 it seems like a waste of money, unless you've got the bucks to blow...
One would do fine for anyone :)
 
I had a 7950GX2 QUAD-SLI setup and it sucked. I know that SLI is better now, and those Crysis numbers do look good! I'd like to see the CPU running at something a little more normal, closer to 3GHz.



Exactly... I'd love to see that Quad-SLI setup benched at 3 GHz in those same tests.
 
This must draw so much electricity. I wish they'd put out power consumption numbers. It would probably add $120 or more to your monthly bill.
 
"1200 dollars" "120 dollars for electricity"
Bah. This is the best and newest out there. It's what we all dream about.
 
If paying more per month in electricity than the game costs is worth it to you... then hey, dig in.

IMO single-GPU solutions will always be best; just wait for the next big boy on the block. This is just NVIDIA's quick fix to answer the 3870 X2.
 
So basically, what I got from this review is that a pair of 9600 GTs for ~$300 is almost comparable to two 9800 GX2s for ~$1200.

Great value...:rolleyes:
 
So basically, what I got from this review is that a pair of 9600 GTs for ~$300 is almost comparable to two 9800 GX2s for ~$1200.

Great value...:rolleyes:

Look again. The 9800 GX2 SLI setup gave almost twice the framerate of 2x 9600 GT on High in Crysis. That is hardly "comparable".
 
Look again. The 9800 GX2 SLI setup gave almost twice the framerate of 2x 9600 GT on High in Crysis. That is hardly "comparable".

That's Crysis. Wow. I'm sick of this Crysis shit.

Look at the numbers man. Two cards for a fourth, A FOURTH of the cost can run games (except Crysis!!!) at comparable, yes comparable, framerates.

The 9800GX2 scaling is NOT good, and to top it off, the 9800GX2 is not a good value card to begin with (two GTS G92s can be bought for less). It does absolutely horribly here. Horribly.

I'm not just going off this review. Check out [H]'s also. Two GT's (which are only a bit better than 9600GT's) DO have comparable framerates to a 9800GX2. So, yes, 2 9800GX2's will be faster than 2 GT's, but at another $600? Come on now.

If the scaling on the GX2 was better (like if it scaled like a 9600!), I'd see a greater value in it. But it doesn't. Waste of money right now.

If prices go down a bit, and scaling becomes better, I would maybe consider it.
 
That's Crysis. Wow. I'm sick of this Crysis shit.

Look at the numbers man. Two cards for a fourth, A FOURTH of the cost can run UT3 (except Crysis!!!) at comparable, yes comparable, framerates.

Fixed.

Looking at the UT3 review, a single 9800GX2 runs faster than SLI'd GX2s... perhaps that's hinting at driver problems? I wouldn't get all worked up over this review if I were you. Two games (one of them problematic) and a 3DMark benchmark hardly constitute a review in my book.
 
I really would love to go Quad-SLI, but as it stands now, upgrading from my 8800 GTX SLI rig would be foolish; I can only speak for myself. I'd also have to buy another power supply with 8-pin connectors, since I don't want to take a chance running two of those monsters on 6-pin adapters. $1200 plus $389 for a 1200 W PSU... I always want the latest and greatest, but spring is here and I've got other plans. Crysis is the only game I can't max, and I'm not going to sweat it.
 
Actually, tylerr, Microcenter has a 750 W Cooler Master PSU on sale right now that is more than enough to run Quad-SLI.
 
We need to see a real review, not this half-assed one with two games tested and one of them only at 1600x1200. This is lame!
 
Fixed.

Looking at the UT3 review, a single 9800GX2 runs faster than SLI'd GX2.. perhaps its hinting that there are driver problems? I wouldn't get all worked up over this review if I were you, two games(one of them being problematic) and a 3dsmark benchmark hardly constitues as a review in my books.

Interesting that you would come to the conclusion I'm getting all worked up over this review, when I clearly mentioned [H]'s review also.
 
I get 40+ FPS and very smooth gameplay in Crysis at all very high settings, DX10, Vista 64-bit, with my Tri-SLI 8800 GTX setup.

Either these new cards are lame or the drivers are stinky. :eek:

I'd like to see a review pitting this Quad setup against Tri-SLI.
 
Interesting that you would come to the conclusion I'm getting all worked up over this review, when I clearly mentioned [H]'s review also.

lol, was that before the edit or after? Regardless, I did not see your mention of Hardocp's review. I still stand by my statement that you are getting worked up much too prematurely. Wait a month or two for better drivers, then we'll talk.
 
This must draw so much electricity. I wish they'd put out power consumption numbers. It would probably add $120 or more to your monthly bill.

More like six dollars a month. $120 a month would run a big-ass 1500 W window air conditioner 24 hours a day.
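For what it's worth, the math behind those two figures is easy to sketch. Assuming a ~500 W system draw while gaming and a ~$0.12/kWh electricity rate (both assumptions, not numbers from the review):

```python
def monthly_cost(watts, hours_per_day, rate_per_kwh=0.12, days=30):
    """Estimate the monthly electricity cost of a load, in dollars."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

# A Quad-SLI rig drawing ~500 W under load, gaming 4 hours a day:
print(round(monthly_cost(500, 4), 2))    # ~7.2 dollars/month

# Versus a 1500 W window air conditioner running 24/7:
print(round(monthly_cost(1500, 24), 2))  # ~129.6 dollars/month
```

So a $120/month hit really would take something like a big AC running around the clock, not a gaming PC used a few hours a day.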
 
Actually, tylerr, Microcenter has a 750 W Cooler Master PSU on sale right now that is more than enough to run Quad-SLI.

I would be extremely hesitant about putting Quad-SLI on only 750 W; I would go with at least 1 kW (and not just because I have a 1 kW PSU).
 
lol was that before the edit or after? Regardless, I did not see your mentioning of Hardocp's review. I still stand by my statement that you are getting to worked up much too prematurely. Wait a month or two for better drivers, then we talk.

My bad. I typed furiously to get that edit completed within moments after I posted. I wasn't fast enough.

Looking forward to getting worked up prematurely with you sometime in the future,

Z
 
I would be extremely hesitant about putting Quad-SLI on only 750 W; I would go with at least 1 kW (and not just because I have a 1 kW PSU).

A single GX2 only draws around 400 W of total system power under full load, and the card itself is rated at just 197 W. A solid 750 W PSU (mind you, the CM has two 8-pin and four 6-pin connectors) will supply more than enough amperage for Quad-SLI.

for reference.

http://www.microcenter.com/single_product_results.phtml?product_id=0265149

Although I'm leaning more towards this one, solely because it has a single large rail instead of the CM's three rails.

http://www.buy.com/prod/corsair-tx-750w-atx-12v-power-supply/q/loc/101/206178325.html
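The disagreement above comes down to simple addition, so here's a back-of-the-envelope power budget. The 197 W GX2 figure is from the post; the CPU and "everything else" wattages are illustrative guesses, not measurements:

```python
# Rough Quad-SLI power budget; non-GPU numbers are assumptions.
components = {
    "9800 GX2 #1": 197,
    "9800 GX2 #2": 197,
    "quad-core CPU": 130,
    "board, RAM, drives, fans": 100,
}

draw = sum(components.values())   # estimated peak system draw in watts
headroom = 750 - draw             # what a 750 W unit would have left
print(f"{draw} W draw, {headroom} W headroom")  # 624 W draw, 126 W headroom
```

On these assumptions a quality 750 W unit squeaks by with roughly 17% headroom, which explains why some people are comfortable with it and others want 1 kW of margin.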
 
What I hate about these reviews is they never mention just wtf they're testing. Did they get 40fps at 1600x1200 on Very High in the Crysis benchmark or during actual gameplay? If it's during a stupid fly-by where everyone gets 2-3 times their game frame rates then it means Crysis is still not playable using very high settings at high resolutions.
 
What I hate about these reviews is they never mention just wtf they're testing. Did they get 40fps at 1600x1200 on Very High in the Crysis benchmark or during actual gameplay? If it's during a stupid fly-by where everyone gets 2-3 times their game frame rates then it means Crysis is still not playable using very high settings at high resolutions.

Your concerns are exactly mine, which is why one of the only review sites I trust currently is Hardocp.
 
What I hate about these reviews is they never mention just wtf they're testing. Did they get 40fps at 1600x1200 on Very High in the Crysis benchmark or during actual gameplay? If it's during a stupid fly-by where everyone gets 2-3 times their game frame rates then it means Crysis is still not playable using very high settings at high resolutions.


Yeah, many sites, including [H], use custom benchmarks for their tests. I know they use custom demos they feel are more indicative of the true gameplay experience, but it makes it hard to compare the results to other reviews or to your own system.

Digit-life.com sometimes will link to config files for custom time demos they've used to test video card performance. I wish more sites would do stuff like that.
 
Your concerns are exactly mine, which is why one of the only review sites I trust currently is Hardocp.


Unless I'm wrong, [H] doesn't tell you exactly how they test games either. At least not in recent video card reviews using Crysis.
 
Unless I'm wrong, [H] doesn't tell you exactly how they test games either. At least not in recent video card reviews using Crysis.

[H] explained how they test their games:
http://enthusiast.hardocp.com/article.html?art=MTQ2MSwxLCxoZW50aHVzaWFzdA==

But yeah, you probably won't get the same results they do. However, I prefer the way they test, as I feel it's closer to real-world gameplay and therefore gives me a better idea of what I can really expect from my video card's performance.
 
I thought Quad-SLI was useless because of something in DirectX that can't render more than three frames ahead, or something. Isn't that why NVIDIA went to Tri-SLI?
 
Maybe when a GOOD game comes out that benefits from Quad-SLI, it will be more worth it. Crysis blows.
 
[H] explained how they test their games:
http://enthusiast.hardocp.com/article.html?art=MTQ2MSwxLCxoZW50aHVzaWFzdA==

But yeah, you probably won't get the same results they do. However, I prefer the way they test, as I feel it's closer to real-world gameplay and therefore gives me a better idea of what I can really expect from my video card's performance.


I was referring to revealing a more exact methodology when testing various games with video cards.

I already knew they present results the way they do just to get people talking about them and drive more readers to their site.

Well...I guess I just addressed my own concern, eh?
 
I was referring to revealing a more exact methodology when testing various games with video cards.

I already knew they present results the way they do just to get people talking about them and drive more readers to their site.

Well...I guess I just addressed my own concern, eh?

I am not sure what exactly you are requesting, but you might start by asking me specifically.
 
I was referring to revealing a more exact methodology when testing various games with video cards.

I already knew they present results the way they do just to get people talking about them and drive more readers to their site.

Well...I guess I just addressed my own concern, eh?

They test the way they do to drive more readers to their site? I don't quite think that's the case, bud. There are many better-known sites that run artificial benchmarks, and most people don't even prefer the way [H] tests. All of my friends still use Tom's Hardware because it offers graphs that can be easily deciphered and less reading (while [H] offers graphs too, they may be hard to understand for some, and the write-ups get wordy). I'm sure [H] takes pride in evaluating cards differently, but I doubt it's "just to get people talking about them and drive more readers to their site".

It seems you're more annoyed at the reasoning sites have for evaluating video cards certain ways than the actual methodology.

As for your more "exact methodology"... I'm not quite following. How exactly would you like an evaluation to read? For instance, would you like the evaluator to explain that he walked three steps north of the red building in CoD4 and then shifted the mouse 67 degrees?
 