XFX GeForce 9800 GX2 @ [H]

Yes it does; this card has a built-in SLI chip.

That's the biggest selling point.

It brings all the benefits of traditional SLI to non-nForce chipsets, without the trade-offs traditional SLI requires, namely the nForce chipset tax and its traps:

1. Price - 680i SLI motherboards cost $100 more, on average, than their P3x/X3x Intel-chipset counterparts; the differential jumps to $200 when you substitute 790i SLI. (And if you bring in G3x Intel chipsets, or even AMD-based IGP chipsets, most of which also include at least a single PCIe x16 slot, the price gap only widens.)

2. Stability - Kyle has commented long and hard on this issue, both in the reviews and here in the forums. The stability of Intel chipsets has continued to exceed that of nForce chipsets, even with the introduction of 790i.

3. Slot loss - Traditional SLI (and CrossFire too) costs you slot availability, due to the overhang and cooling requirements inherent in running two GPUs in two slots. Both the 3870X2 and the 9800GX2 obviate this by being single-card solutions; the GX2, however, not only performs better but is actually easier on the electric bill under load than the X2. The increased up-front cost of the GX2 vs. the 3870X2 can be made back in electricity savings in less than two years (see the quick sanity-check below). And you can use the slots you normally wouldn't have available under traditional SLI/CrossFire for things like sound cards, TV tuners, et alia.
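For what it's worth, that payback claim is easy to sanity-check with a few lines of Python. Every number below is an illustrative assumption (price gap, power gap, hours of gaming, electricity rate), not a measured figure:

Code:
# Back-of-the-envelope payback estimate for the GX2-vs-X2 electricity claim.
# All inputs are assumed for illustration; plug in your own numbers.
price_premium_usd = 100.0  # assumed street-price gap, GX2 over 3870X2
watts_saved_load = 40.0    # assumed load-power advantage of the GX2
hours_per_day = 4.0        # assumed daily hours of gaming at load
usd_per_kwh = 0.12         # assumed electricity rate

kwh_saved_per_year = watts_saved_load / 1000.0 * hours_per_day * 365
savings_per_year = kwh_saved_per_year * usd_per_kwh
print(f"~{kwh_saved_per_year:.0f} kWh/yr saved, ~${savings_per_year:.2f}/yr")
print(f"payback in ~{price_premium_usd / savings_per_year:.1f} years")

With those particular inputs the savings come to only about $7 a year, so whether the card ever pays for itself depends entirely on the real price gap, your hours at load, and your utility rate.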

Sounds like the 9800GX2 is one heck of an *investment*.
 

SLI is what will ensure that I don't buy this card... it might be its biggest selling point, but it is also what makes me steer clear of it. I run an Intel CPU on an Intel chipset because I like a bug-free rig. SLI introduces bugs, so even though I am in the target group, I have no interest... because of SLI.
 
Kyle has stated, rather plainly, his personal preference for Intel chipsets, despite their inability to support traditional SLI, because of their stability. AMD, despite the support for CrossFireX inherent in those same Intel chipsets, has failed to take advantage of it with a truly high-end performance heavyweight (I have no idea whether it's GPU issues or driver issues, but the failure is still there).

Enter the 9800GX2: it is SLI-in-a-box. It outperforms the 3870X2 (and performs identically to traditional SLI), uses less power than traditional SLI, uses less motherboard real estate than traditional SLI, and is not as restricted in terms of motherboards as traditional SLI. Yes, it's more expensive than traditional SLI if you look just at card costs; however, you make at least half of that back by being able to select an Intel (or even AMD) chipset-based motherboard (on average, Intel and AMD-chipset motherboards cost $100 less than 680i SLI, and as much as $200 less than 790i SLI). There's also the advantage, if you already have an nForce-based motherboard, of not being trapped into doing without other upgrades just because you want to step up from your existing single 8800GT/GTS or 9600GT.

The argument in favor of the GX2 is even more compelling if you own a single 8800GTX or 8800Ultra: the GX2 costs less than a pair of 8800GTXs (or a single 8800Ultra) and is easier on the electric bill than SLI'd GTXs (let alone SLI'd Ultras), so you can actually earn back the additional cost compared to the 8800GTX, and you save money outright compared to the 8800Ultra while keeping right on saving on electricity, not to mention getting higher performance than a single 8800Ultra.

According to Kyle (through several remarks), stability has a value all its own, and it's enough to outweigh the performance advantages of traditional SLI. The 9800GX2 is the only solution that can actually render that argument moot, because it brings the benefits of traditional SLI to every chipset platform that has lacked it.

Well said!! ;)

So if a GX2 has a built-in SLI chip, and two GX2s in Quad SLI connect via that external bridge, any chance Quad SLI could work on Intel-based mobos down the line?

I guess I don't understand what stops it from working if the GPUs have SLI chips on board...
 
SLI's Achilles' heel... drivers.
 
Thanks... Damn drivers! ;)

So actually, if NVIDIA and Intel would kiss and make up, Quad SLI could work on a Maximus with a signature on a contract and a driver update...

Come on boys, be team players!
 
Intel is planning the "Larrabee" GPU line... they will not kiss and make up, they are sending a message. AMD woke Intel up, and now they plan to take no prisoners.
 
Quote: So if a GX2 has a built-in SLI chip, and two GX2s in Quad SLI connect via that external bridge, any chance Quad SLI could work on Intel-based mobos down the line?

According to NVIDIA, that cannot happen. We received a message from NVIDIA on this same subject while working on the Skulltrail article. NVIDIA stated that SLI requires NVIDIA MCPs to function and that there is no getting around it. In fact, the D5400XS Skulltrail motherboard has dual nForce 100 MCPs integrated onboard. Interestingly enough, this is a pure hardware solution; no drivers are required for the nForce MCPs to be used properly. That lends some credibility to NVIDIA on this matter as far as I am concerned. From the 8-series on, this requirement seems to have always existed.

Quote: SLI's Achilles' heel... drivers.

It is but it isn't. The driver situation is almost never as bad as people think it is. SLI has a rep for being a pain in the ass, and it simply isn't all that bad.

Quote: So actually, if NVIDIA and Intel would kiss and make up, Quad SLI could work on a Maximus with a signature on a contract and a driver update...

As I said above, no fucking way. To add to that, NVIDIA knows that if they allow SLI on Intel chipsets, their own chipset business in the Intel-compatible motherboard market will die overnight. Without the requirement of nForce MCPs for SLI, NVIDIA simply couldn't sell chipsets like they do now, because their products in that market aren't up to par with Intel's by any stretch of the imagination. Sure, feature-wise and performance-wise they are close, but the quality control and drivers aren't anywhere fucking close.
 
I'd be willing to bet that even if NVIDIA's chipset business died as a result of making SLI work on Intel-based motherboards, NVIDIA would still come out on top, because they would make up all that lost revenue on boosted video card sales. There are TONS of people out there (myself included) who completely avoid SLI simply because of how bad their 680i experiences were. Those people would come back to the SLI table once it worked on stable Intel-based motherboards. Sure, Skulltrail exists, but most people think that is overkill.

Think about how many video cards they'd sell if Asus released two X48 motherboards next week with SLI + CrossFire support (one DDR2 and one DDR3). I will tell you right now that I would immediately buy two 9800GX2s, the new X48 with DDR3 support, and 4-8GB of DDR3. The only thing holding me back is the fact that I'd have to buy a 780i or 790i motherboard.
 
Well, most likely NVIDIA is going to cave in on SLI on Intel boards when Nehalem comes around. I say this because Intel is threatening not to give NVIDIA the rights to make Nehalem-compatible chipsets.
 
Quote: NVIDIA stated that SLI requires NVIDIA MCPs to function and that there is no getting around it.
Thanks for the info Dan!

Looks like I can let that idea go, and the possibility of 3rd party drivers too.

If it's a hardware requirement, then drivers won't solve that...

So for us Intel-based mobo guys, it's grab the fastest single card you can afford, and that will make you happy. I believe the GX2 is still the GPU for me! :)
 
Quote: I believe the GX2 is still the GPU's for me! :)

Fixed ;)
 
This is the first time I've heard of this. Anybody know of a link that shows the differences?

Crysis uses Parallax Occlusion Mapping to make flat textures appear to have volume and shape; it's not just simple bump mapping. It allows the engine to generate a 3D, occluded appearance of rocks, stones, pebbles, etc. out of flat ground. Unfortunately it doesn't work well with anisotropic filtering, so AF is automatically off, and you would not usually turn it on or use it with Crysis unless you want to detrimentally impact the graphical quality of the game. That's why it's not an option in the game's menu. You need to force POM off in the console in order to get visible AF.

Parallax Occlusion Mapping makes parallaxed ground textures look like this in Crysis. If you force it off in order to turn on AF, it would all be just a flat texture (though it would be sharper at distance and in perspective). AF does work to an extent with POM on, and on many of the Crysis ground textures, but not on parallaxed surfaces like rocky areas or areas with depressions, sand, pebbles, dirt, etc.

[POM texture screenshot: parallaxocclusionmappinmi2.jpg]


It is difficult to capture the difference in static screenshots. Try it yourself in game while moving around, because motion affects it the most. Go into the console and use r_usepom 0, alt-tab out (I love how well Crysis handles that), and force AF with RivaTuner to see the difference while moving around.

Here's an animated GIF from TweakGuides: [crysispomav8.gif]
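If it helps to picture what POM is actually doing, here is a minimal sketch of the core height-field ray march in Python (real implementations run this per pixel in the pixel shader; the function and parameter names here are purely illustrative):

Code:
# Minimal sketch of the parallax occlusion mapping ray march.
# heightmap(u, v) returns surface height in [0, 1]; view_dir is the
# tangent-space view vector with z pointing up out of the surface.
def pom_offset(heightmap, uv, view_dir, scale=0.05, steps=32):
    # Horizontal step per depth layer, projected from the view direction.
    du = -view_dir[0] / view_dir[2] * scale / steps
    dv = -view_dir[1] / view_dir[2] * scale / steps
    u, v = uv
    layer = 1.0  # start at the top of the height volume
    for _ in range(steps):
        if layer <= heightmap(u, v):
            break  # the ray has dipped below the surface
        u += du
        v += dv
        layer -= 1.0 / steps
    return u, v  # shifted texture coords produce the 3D look

That per-pixel shifting of the sample coordinates is presumably also why AF and POM don't get along: the shifted lookups no longer match the screen-space texture derivatives that anisotropic filtering keys off of.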
 
For the past few generations, NVIDIA's high-end cards have cost that much, and they've shown a great jump in performance. I looked to see how the [H] might think one 9800GX2 compares to two 8800GTXs in SLI, and since they only showed SLI performance for the 8800GT and 8800GTS 512, I looked at the reviews of those cards that did have 8800GTX comparisons; without any distinguishable difference (save for Crysis and high vs. medium shaders), an 8800GTX performs just like an 8800GTS 512. So I'm thinking that if A ~ B, and B ~ C, then A ~ C, which makes two 8800GTXs in SLI perform just as well as one 9800GX2, and that seems to keep NVIDIA's flagship GPU performance jump on track.

I have two 8800GTXs in SLI. I've compared my benchmarks to others with the same processor and mobo but a 9800 GX2, and the GTX setup is definitely faster at 1920x1200 with all the eye candy. The GX2s take a big hit when they run out of memory bandwidth (high res with AA and AF enabled).

In order to beat two GTXs you are going to have to run two GX2s, which I have been considering, but I think I am going to wait and see what the 9800GTX brings to the table, as a three-way solution with 9800GTXs might be the best of them all, given that 3 cards scale better than 4. If the 9800GTX really is faster than the Ultra, then 3x 9800GTX has the potential to be faster than 4x 8800GTS.

Too much speculation though... have to wait and see.
 
The 8800GT SLI setup seems like the winner, as expected. I figured this card would be overpriced for the performance it offers.
I didn't read through the other pages; sorry if this has been asked already.
What I'm wondering: since they were doing 'apples to apples', why not do a tri-fire or quad-fire review beside the 9800GX2, as those can be had for a similar price to the GX2?
 
Nice review, er, I mean evaluation. :) I'm surprised the 9800 GX2 doesn't really do much more than a pair of 8800 GTs. :( This 9800 is good for those w/ only 1 PCI-e video card slot, but is it worth it for $600? Of course, prices will drop quickly. This time next year, they'll be in the $350 range.
 

Why would this card be faster than two 8800GTs in SLI? It uses two G92 GPUs, each with a 256-bit bus and 512MB of GDDR3 memory. The clocks are slightly higher than the 8800GT's but lower than the 8800GTS 512MB's. So the performance is exactly what I'd have expected. In fact it is actually better than I expected in a number of games.
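A quick back-of-the-envelope from the reference clocks backs that up (clock figures quoted from memory, so double-check them against the spec sheets):

Code:
# Theoretical per-GPU memory bandwidth from reference clocks.
# Clock figures are from memory; verify before relying on them.
cards = {
    "8800GT":            (600, 900, 256),   # core MHz, mem MHz, bus bits
    "8800GTS 512":       (650, 970, 256),
    "9800GX2 (per GPU)": (600, 1000, 256),
}
for name, (core, mem, bus) in cards.items():
    gbps = 2 * mem * 1e6 * (bus / 8) / 1e9   # GDDR3 is double data rate
    print(f"{name:20s} core {core} MHz, ~{gbps:.1f} GB/s")

On paper each GX2 GPU pairs 8800GT-level core clocks with slightly faster memory, which is exactly where the benchmarks put it.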
 
Quote: I'm surprised the 9800 GX2 doesn't really do much more than a pair of 8800 GTs.

You're assuming (I think incorrectly) that the GX2 is aimed at the traditional buyer of NV PCIe graphics cards, who also has an nForce chipset and is thinking in terms of Quad SLI.

For once, I think that theory is all wet.

I actually think the 9800GX2 is aimed more at the non-nForce chipset customer: the person who may own a computer with (most likely) an Intel chipset, yet wants the strongest graphics solution for gaming. Quad SLI is not really a consideration (that would be a niche market for the GX2); the real competition (on the same chipsets) remains the 3870X2, and that's also why (in my own humble opinion) performance of the GX2 on Intel (or even AMD) chipsets matters more to NV than performance on nForce. Prices may fall in comparison to the 3870X2; however, I doubt they will fall by much, considering the spanking the GX2 puts on AMD's dual-GPU part in terms of both performance and savings on the electric bill.

For that same reason, I'd like to see both Intel and AMD-based motherboards in future reviews of the GX2.
 

The motherboard chipset will be virtually meaningless in a graphics card test. The only time a difference would be apparent is if you ran those tests at ultra-low resolutions, and that proves nothing.
 
Quote: Why would this card be faster than two 8800GTs in SLI? ... The performance is exactly what I'd have expected.

Granted, the GX2 only takes up one PCIe slot. However, I was hoping the "9" series would be an improvement over the "8" series, the way the "8" was far superior to the "7". I kind of knew it wouldn't be when the 9600 GT couldn't even beat the 8800 GT. (Yes, I realize the 2nd # is 6 and 8, respectively.)

As the evaluator said, if you have two PCIe slots, it's no contest: two 8800 GTs are a much better buy than a single 9800 GX2. Now if price were no object and you just want max performance, then obviously two GX2 cards will win.
 

I agree that I'd like to see an AMD included in these evaluations. I suspect that while Intel kicks AMD's ass, that won't happen. :( These days, serious gamers build Intel systems.
 
Granted, the GX2 only takes up one PCI-e slot. However, I was hoping the "9" series would be an improvement over the "8" series, like the way "8" was far superior to "7". I kind of knew it wouldn't when the 9600 GT couldn't even beat the 8800 GT. (Yes, I realize the 2nd # is 6 and 8, respectively.)

As the evaluator said, if you have 2 PCI-e slots, it's no contest: two 8800 GTs is a much better buy than a single 9600 GX2. Now if price were no object and you just want max performance, then obviously 2 GX2 cards will win.

Again, the added bonus of the 9800GX2 is that you can use it with Intel chipset-based boards. You can't do that with 8800GTs in SLI. That counts for a lot.
 

Are you serious? The 8-series cards didn't work in SLI on Intel motherboards? That doesn't seem right. :confused:

I have an AMD (Socket 939, yes, old school) motherboard, so I never even considered that part of it.
 

The 6-series and 7-series didn't work on Intel chipset motherboards after a certain driver revision. NVIDIA locked their drivers so that Intel chipset-based boards, or any other non-NVIDIA chipset-based board, would be incapable of SLI support. This is a widely known fact. 8-series and 9-series cards don't work on non-NVIDIA chipset-based boards either. The difference is that, according to NVIDIA, an nForce MCP is now required for SLI functionality; prior to the 8-series it was not.

NVIDIA's chipsets aren't as good as Intel's, and they probably know it. If they didn't lock out the drivers like that, no one would buy their boards instead of Intel chipset-based boards; it wouldn't make sense to do so. The price points are close, the overclocking is close, and the performance is close, but the drivers and stability of Intel chipset-based boards are unmatched.
 
I see, thanks for the explanation. I do have an NVIDIA chipset motherboard, so I didn't consider the Intel chipset mobos. Drivers are the root of all problems. :mad: Speaking of drivers, I hope that's why the 9800 GX2 failed Call of Duty 4 at 1920x1200 with 4X TRSS AA. :eek:
 

I'd bet that the drivers are the problem. I had the exact same issues with COD4 that Brent did on my own PC. I don't even think he was using 1920x1200 with 4x TRSS AA; I think he was using just 4xAA and 16xAF enabled in the game menu. I know I had problems at 2560x1600 with 4xAA and 16xAF. My dual 8800GTX SLI setup had no such issue. Given how 8800GTS 512MB SLI setups perform in that same game, I'm sure it is in fact a driver issue. Hopefully it will be solved soon.
 

Yes, that's what one would assume, that a driver update will fix it. NVIDIA is pretty good about patching games, from what I've read. If it's a more serious bug, then those who want to play COD4 on a big monitor with the bells and whistles will have to "downgrade" to a pair of 8800 GT cards.

LOL, I read your sig. I see you love 3D Mark 2006. :p
 

I just get sick of everyone treating 3DMark 2006 like it is actually some kind of indicator of system performance when it isn't. On my Quad-SLI rig, all I had to do was back the settings down to 2560x1600 with 2xAA and 16xAF to solve the problem. Hopefully they'll fix the drivers so that I can crank it back up to 4xAA. Crysis performance is amazing compared to what it used to be.
 
Is this card the best card for a resolution of 1680x1050? I heard that this card only makes a difference at higher resolutions, so if an 8800GTS performs identically to a GX2 at this res, there is no point looking at this card.
 

I see. All benchmarks have flaws. That's why [H] prefers to do real-life "evaluations" rather than benchmark reviews. I'm sure NVidia is working hard to fix that COD4 bug.
 

Dan, I'm building a new system based on an EVGA 790i Ultra and an E8400. What cards would you recommend to play Oblivion, UT3, BioShock, Half-Life 2, etc. at 2560x1600? I definitely want to use anti-aliasing.
 
Quote: Is this card the best card for a resolution of 1680x1050?

Yeah, I currently own the GTS 512MB and it runs all the games I have now pretty much maxed at 16x10 (yeah, I don't play Crysis yet...). I recently upgraded to a 24" monitor, however, and it no longer runs everything completely maxed at 1920x1200. It runs some games maxed that high, but others need to drop a couple of things down a notch here and there. I've got the new 780i board coming in (step-up from 680i via EVGA), and I'd been considering just going single card and selling the brand-new 780i. This would let me run an Intel chipset (and make Dan_D proud :p ). I was hoping to be able to watercool this summer and hit 4.0GHz with my Q6700. I know some have done it with their Q6600/Q6700s, and I also know it's rare, but I thought my chances would be better with an Intel board. And I figured the 9800GX2, with its "SLI in a single card" setup, would do the trick for the higher res on the new monitor. I am within the step-up window for the 8800GTS via EVGA as well.

Problem is, I wasn't expecting the $600 price tag... I can throw in a second 8800GTS 512, and overclock the crap out of the pair in SLI, and the second GTS will only cost me $200 because of all the sales right now!

This has been a tough couple of weeks for me, though. I've been googling and poring over reviews, tests, forums, etc. for weeks, and I haven't found a board that does any better with the quads than EVGA's 780i, which has actually done quite well for an NVIDIA board. I'm probably gonna stick with the 780i for the long term. With a 24" monitor you definitely see a benefit from the second card in an SLI setup.

I'm really holding out for the 9800GTX. If it's just a little bit faster than the current GTX/Ultra but scales like the 9600GT, watch out! That's gonna be a hot setup, and it should make the 9800GX2 pee its pants. NVIDIA's new cards are scaling so much better in SLI than in the past; even the 8800GTs and new GTSs are scaling really well in SLI in most games.
 

The fastest card you can buy right now is the 9800GX2, so at 2560x1600 that's the best option. You might even want to consider two of them in SLI. If you aren't going SLI, then I'd take the 790i Ultra SLI option off the table and go for an X48 board instead.
 
Dan, would a single GX2 suffice for most recent games at 2560x1600 with all IQ settings enabled? Which games would it not, excluding Crysis of course? TIA.
 

Yes, it would. Crysis is the one exception to that rule. Multi-GPU scaling past two GPUs practically makes the second card a waste of power; I've got two of them in Quad-SLI and the second one barely adds any performance. Right now 3-way and Quad-SLI are for the e-peen and little else.
 

Thanks a lot for the input. I'll definitely be going SLI. At this point I'm considering GX2 Quad SLI, 8800 GTX SLI, or 9800 GTX SLI. My primary concerns are the issues I've read about with the GX2 and SLI at high resolutions. I'm also considering 3-way SLI on the GTX boards.

Looking forward to your in-depth review of the 790i Ultra. It looks like the board for me, but I want to wait and see your results.
 