XFX GeForce 9800 GX2 @ [H]

I will just chime in on 2 things:

1) Why on earth would they call it a 9800 series? It is just the G80 shrunk down to "supposedly" increase performance and use less power. It's just a product refresh, not a new product, so to speak.

2) I will definitely be keeping my 8800 GTXs for another year or so. In SLI, I can carry 85+ FPS in WIC, Battlefield 2, UT3 and some other games. I chose not to play Crysis at this time as my two cards will not be able to push those graphics. I always have all the "eye candy" turned to the max, as well as screen resolution.

Shame on you Nvidia. I was hoping for another breakthrough product like when the G80s came out in 2006. I have a feeling that ATI will leave them in the dust at some point in the near future...
 
Quote:
Originally Posted by TalonMan
I do think nVidia hit their 30% minimum speed increase over the Ultra!


Quote: DwellerInTheDeeps
How do you figure that?
If the GX2 had a minimum of 30% over the Ultra, then why would Kyle have just told me that it would not be worth obtaining a GX2 if you already own a GTX?
I'd think that a minimum of a 30% over an Ultra, which is already a little faster than a GTX, would be enough to make the opposite recommendation, especially considering he said he's running a GTX as well.
Unless 30% doesn't really add up to all that much, in which case it wouldn't be worth spending that kind of money when you already have a GTX.


From the reviews...

Using this review's numbers: http://publish.it168.com/2008/0315/20080315000604.shtml
It still holds true that the 3870x2 does well in 3DMark06 but is slower in actual gameplay.

If my calculations are correct, and looking only at 1920 x 1200 res... (What I need)

---------------------------------------------- 3870x2 --------- 9800 GX2 --- %Diff
3DMark06 ------------------------------ 14,190 ---------- 14,167 ------ 23 points lower
Crysis ----------------------------------- 13FPS ---------- 22FPS ----- 69% faster
Lost Planet ---------------------------- 24FPS --------- 31FPS ----- 29% faster
Lost Planet 8xAA/16xAF--------- 11FPS --------- 13FPS ----- 18% faster
Company of Heroes ---------------- 56FPS --------- 77FPS ----- 37% faster
World in Conflict --------------------- 34FPS -------- 41FPS ------ 20% faster
World in Conflict 4xAA/16xAF - 14FPS -------- 26FPS ------ 85% faster
Half Life ------------------------------- 111FPS ------ 115FPS -------- 3% faster
Half Life 8xAA/16xAF ------------- 73FPS -------- 76FPS -------- 4% faster
NeedForSpeed ---------------------- 48FPS ------- 102FPS ----- 112% faster
NeedForSpeed 4xAA ------------- 33FPS --------- 91FPS ----- 175% faster
Unreal Tournament -------------- 112FPS ------- 109FPS -------- 2% slower
Quake4 ------------------------------ 115FPS ------- 120FPS -------- 4% faster
Lost Planet -------------------------- 11FPS --------- 13FPS ----- 18% faster
FEAR 4xAA/16xAF --------------- 87FPS -------- 79FPS ------- 10% slower

On average, the GX2 is currently 41% faster than the 3870x2 at 1920 x 1200 res.
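For anyone wanting to double-check, the per-game percentages and the quoted average can be reproduced with a quick script (FPS pairs copied from the table above; the 3DMark06 score is left out since it isn't a framerate):

```python
# FPS pairs from the it168 numbers at 1920 x 1200: (3870x2, 9800 GX2)
results = [
    (13, 22), (24, 31), (11, 13), (56, 77), (34, 41), (14, 26),
    (111, 115), (73, 76), (48, 102), (33, 91), (112, 109),
    (115, 120), (11, 13), (87, 79),
]

# Per-game relative difference: positive means the GX2 is faster.
diffs = [(gx2 - x2) / x2 for x2, gx2 in results]
average = sum(diffs) / len(diffs)

print(f"GX2 vs 3870x2, average of per-game differences: {average:.1%}")
```

This lands right around the ~41% quoted, give or take the rounding of the individual per-game percentages.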

I still think the GX2 is the best single card to have right now...



Using the anandtech review's numbers: http://anandtech.com/video/showdoc.aspx?i=3266&p=1

Putting the 8800 Ultra against the 9800 GX2...

If my calculations are correct, and looking only at 1920 x 1200 res... (What I need)

---------------------------------------------- 8800 Ultra --------- 9800 GX2 --- %Diff
Call of Duty ---------------------------- 70FPS -------------- 110FPS --------- 57% faster
Call of Duty 4xAA--------------------- 55.7FPS ------------- 83.1FPS ------ 49% faster
Crysis High Quality----------------- 25.8FPS -------------- 39.4FPS ------ 52% faster
Oblivion --------------------------------- 46FPS ----------------- 84.3FPS ----- 83% faster
Oblivion 4XAA ------------------------ 36.1FPS -------------- 63.9FPS ---- 77% faster
Quake Wars -------------------------- 85.2FPS ------------ 120.4FPS ---- 41% faster
Stalker ---------------------------------- 48.8FPS ------------- 73.3FPS ----- 50% faster
World in Conflict --------------------- 30FPS --------------- 33FPS ------- 10% faster

On average, the GX2 is currently 52% faster than the 8800 Ultra at 1920 x 1200 res.
 
What about SLI numbers with the 9800 gx2? Even though you are talking about $1200 for the setup, I would like to know how well it would scale in games.
 
Hmmm... sooo tempted to step up my 8800GTX to the GX2... EVGA's site is getting murdered right now. I've only got a day or two to decide; my 120-day window is about to run out. I do game at 2560x1600 on my Dell 30'' and it looks like the GX2 does quite a good job at that res.
 
Haven't read through the replies so I dunno if this has been mentioned, but the review didn't seem to mention anything about load temps or noise output from the card?
 
Quote: TheGooch69
So this single card is a significant increase over SLI 8800 GTS 512MB? Cool.


No, the same...

The XFX GeForce 9800 GX2 games extremely well, allowing a very high level of gameplay experience in Crysis and the shader-intensive game Jericho. The GX2's performance matched that of GeForce 8800 GTS 512MB SLI. The real advantage to the GeForce 9800 GX2 is going to be its ability to allow non-SLI motherboard owners a better performing alternative than the current 8800 GTX.
 
Thanks Kyle for testing with more cards, more settings and more resolutions, just like I suggested before. This is great, but the data presentation could be better: a bar graph perhaps, or maybe one big table instead of many smaller ones.
 
Again, I thought it was a great review. Let me help out the people who obviously can't read the review:

1)
Performance expectations
SLI GT < 9800GX2 < SLI GTS

2)
Price/Performance
Decent to good single-card option if you DON'T own an SLI mobo.

3)
9800 GX2 expected to be the fastest solution available. Obviously, as such, it is not at a good price point.
 
In reading the DriverHeaven review, they noticed a nasty little bug where frame rates drop through the floor in Gears of War. Those with 8800 SLI setups and/or a 3870X2 who play Gears of War: is this a noticeable or annoying problem?

I just bought an XFX 9800 GX2 and I play GoW and just wanted to know if this is going to be a problem. Thanks!
 
Will the nVidia System Tools run on non-nVidia motherboards? I like the GPU overclocking features; they look very clean and simple. I was wondering if the nVidia System Tools will work with non-nVidia boards and allow overclocking of nVidia GPUs. Thanks!
 
You want to avoid the 680i SLI and 780i SLI motherboards if at all possible. That alone makes the 9800GX2 an attractive choice over two 8800GTs in SLI.

Ok, I see some hostility around here most of the time towards certain hardware (for whatever reasons), but coming from the HardOCP mainboard dude this quite shocked me.

What is it that makes the 780i so terrible then? Is it the stability, life expectancy, amount of DOAs or overclockability? Would anyone who isn't going to overclock at all have to 'avoid' the 780i motherboards? Is there any reason for someone looking at a 780i+8800GT SLI setup to not buy it, when he doesn't care about overclocking?
 
RE: test on non-sli Intel board. Yes. Please.

I'm sick of nv chipsets. I would bail from my 680i bugfarm ASAP.
 
Not that I think Tom's Hardware is the best site out there (kind of dislike them overall, in fact), but they did review the 9800 GX2 they got on an ASUS P5E3 board, an X38 chipset board. A place called T-Break used an old Blitz Extreme board, which uses the P35 chipset. I'll be lazy and make everyone Google for the reviews, but just glancing at them, the results looked similar to the ones posted here and other places where they used NVidia chipset boards.

Sorry if someone already posted this info, haven't read very many posts in this thread, but just in case someone hasn't yet, I did want to add that.
 
Nice comprehensive review, and ready for the hard launch too.
I just wish you would have compared the 9800GX2 to the GT and GTS as single cards and not in SLI only. That would have given us a more in-depth look at dual-GPU scaling with non-SLI boards.
Most of the games used in the review don't scale well in SLI at all, and that might give single GT/GTS owners some more discretion to upgrade. But I did like the VS comparisons, very well written.

As the 8800 GTs and GTSs have already been tested, you only have to go look at what's already been reviewed. If two 8800 GTs in SLI give quite similar performance, then it follows that a single GPU would be slower. Now, whether or not you should upgrade, that my friend, is your decision.
 
I sure hope that Nvidia can do more with the 790i and DDR3, and while they're at it, fix some of the issues that have plagued their other products, i.e. poor PWM circuitry, heat and of course add-on board goodness. Then it might be worth looking at seriously.
 
One of our major beefs with SLI right now is the lack of multi-monitor support with SLI enabled. We give praise to AMD for providing multi-monitor support with CrossFireX enabled to end users. NVIDIA needs to fix this. The shoe is on the other foot and SLI now looks broken compared to CrossFireX in terms of features. NVIDIA have informed us that a driver update is all that is needed to include this feature. We want them to provide this support immediately. SLI without multi-monitor support is SLI’s Achilles heel. There are some of us here at HardOCP that refuse to use SLI because of this missing feature alone.

That's me.
 
This will probably get buried in all the hubbub, but when GX2 Quad-SLI is tested, would it be possible to squeeze in LoTRO DX10 performance?

That would be the only reason I'd go Quad. Eye candy in Middle-earth!
 
The performance doesn't justify the price, IMO. Even for non-sli motherboard owners. Now, if they came out with a 9800GT2 (8800GT in SLI on 1 card) for around $399 that'd be something I'd jump on.


And if it had decent cooling?
 
Decent performance but I was expecting the price to be closer to 450-500 bucks than 600. Guess I'll just have to make do with my GTS right now and wait for the next gen. Hopefully, something a little bit more cost effective will come out.
 
You can't look at percentages because they are very misleading. They use those for marketing so it looks like a significant upgrade, yet 30% could be 10fps to 13fps or something (I'm not doing the math).

Look at the FPS ratings themselves. If it goes from 30 to 60, that's good. If it goes from 90 to 120, whoopee. 15 to 18? Who cares, it's unplayable as is.
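The point about relative versus absolute gains is easy to illustrate with a toy sketch (the function name is made up for illustration):

```python
def describe_gain(before_fps, after_fps):
    """Report both the relative and the absolute framerate change."""
    pct = (after_fps - before_fps) / before_fps
    return f"{pct:.0%} faster, but only {after_fps - before_fps} more FPS"

# The same 30% means very different things at different framerates:
print(describe_gain(10, 13))   # a 3 FPS bump - still unplayable
print(describe_gain(90, 117))  # a 27 FPS bump - smooth either way
```

Both calls report "30% faster", which is exactly why a marketing percentage alone tells you nothing about playability.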
 
For those wondering, I have had SLI GTS512s since mid-December and have played through CoD4, Crysis and Gears of War and had zero issues with framerates bottoming out. This is at 1920x1200, so I'm not sure what these supposed "bugs" are about SLI and GoW. I think I'm going to go ahead and bite the bullet and go quad since I'll get credit for 369 bucks per GTS512. Now I need to find a new PSU... Something tells me the HX520 isn't gonna cut it.
 
Also, PSUs may be a deciding upgrade factor (unless of course you just picked up one of those CM Stacker/PSU combos). But on a side note, from what I gather, unless I see appreciable improvements over a typical SLI setup, it may just very well be out of reach for most, as $1200 for Quad SLI is really a B@LL breaker, just to play Crysis at maybe 30% more FPS.
 
Not a bad showing overall. Better than I expected, in fact, but the very thought of SLi still makes me cringe. Maybe I'm just stuck in the past, but I feel like NVIDIA still has a great deal to prove to customers regarding SLi, especially given that we've seen some less-than-pleasant happenings in the past with the GX2 moniker.

I'd like to see some single card vs. GX2 evaluations at some point in the future, but this article actually tells me all I need to make my 'no' decision. Maybe another evaluation will be able to turn that 'no' into a 'maybe start considering thinking about it' (if that isn't too optimistic ;) )

Still, this isn't the card NVIDIA should have launched. It's severely disappointing in that respect.
 
It's interesting to see the complaints about the price of this card. It's $600 at release, which isn't that bad, I think. It's by no means the most expensive high-end part launch. If you have an SLI motherboard and aren't planning to go quad, there are much cheaper options. I don't have an SLI motherboard and don't want one now, so the 9800GX2 is the fastest thing for people like me. It's definitely performing better than the rumors, and it's significantly faster and cheaper than the 8800 Ultra of ten months ago.
 
How does it stack up against a pair of 8800GTXs, though, since those are going for around 300 each used and 350 each new? That would be a good comparison. Someone posted earlier a site that did just that, but they only showed the GX2 benchmarks in detail; all it showed was that the 8800GTX SLI setup was faster with everything on medium. But what about higher settings?
 
Ohhh, this is not what I was hoping to see. :( On the CoD4 chart, with the exception of the water bug, it performed almost exactly on par with the X2, and even at 2560x1600 the GTS SLI was able to use AA/AF. Looks like I will still be getting dual GTSs for my next build soon unless something better comes out. :rolleyes:
 

That link didn't work, but I went to that site and checked out their review; they didn't test the GTX in SLI. I am sure that there will be some tri-SLI vs quad-SLI vs CrossFireX reviews out when the drivers get released next week, but I hope they throw in regular SLI just for kicks.

Also, has anyone else noticed that some sites are getting around 40fps with the GX2 while others are getting almost 50fps with the same settings? I wonder if some sites are using dx9 while others are using dx10?
 
What is the noise level of this card compared to a g92 GTS?

I doubt there is a waterblock anytime soon for it.
 
From the reviews...

Using this review's numbers: http://publish.it168.com/2008/0315/20080315000604.shtml
On average, the GX2 is currently 41% faster than the 3870x2 at 1920 x 1200 res.

I still think the GX2 is the best single card to have right now...

Using the anandtech review's numbers: http://anandtech.com/video/showdoc.aspx?i=3266&p=1

Putting the 8800 Ultra against the 9800 GX2...

On average, the GX2 is currently 52% faster than the 8800 Ultra at 1920 x 1200 res.

41% faster than the 3870x2
52% faster than the 8800 Ultra
was able to compete with 2x HD 3870, 8800GT SLI, 8800GTS 512MB SLI, and 9600GT SLI
and still not enough! ... what can I say ... enthusiasts

It's Nvidia's fault; it had to be 200% faster than any single card and 100% faster than any SLI/Crossfire setup, and cost less than $500.

Okay, okay, we get it, SLI is the better bang/buck deal. Go pick those SLI mobos; catch you at the "my mobo is bottlenecking my OC, I fried my mobo" section :D

edit:
thanks
 
i'm most excited by the release of the 790i ultra.


yep, i don't mind my 650i chipsets. it'd be nice to slam the 8800s into a new SLI board worth its Ultra moniker... nice shiny new quad, some DDR3... *wakes up*

i'll stay on top of my bills for a few more months and be happy with my current graphics setup for now. besides, i think i'm due for a monitor upgrade first and foremost.

great review, exciting product for intel chipset users.
 
Quote:
Originally Posted by TalonMan
I do think nVidia hit their 30% minimum speed increase over the Ultra!


[... full benchmark tables quoted earlier in the thread ...]

On average, the GX2 is currently 41% faster than the 3870x2 at 1920 x 1200 res.
On average, the GX2 is currently 52% faster than the 8800 Ultra at 1920 x 1200 res.

I still think the GX2 is the best single card to have right now...

Yo, I think you're missing one valid point: the 8800 GTX/Ultras have a much wider 384-bit bus compared to 256-bit, and all that juicy 768 vs. 512 MB of VRAM still isn't being utilized optimally. Its ability to stay and game with anything right now speaks wonders of Nvidia's "things done right" philosophy, and I only look forward to when the real games that Nvidia has been collaborating on with programmers really start to get optimized. Remember, it takes a couple of YEARS for software to really catch up with hardware.
 
Another Great review compared to the other sites'.

I have one comment, though, about the Crysis benchies. I was under the impression that the only significant graphical differences between DX10 and DX9 in Crysis occurred when VERY HIGH was selected. Since only HIGH and MEDIUM settings were used in Crysis running in DX10 for this review, why weren't DX9 benchies included? Wouldn't a person in real life run it in DX9 mode when using HIGH and MEDIUM only, since Crysis runs much faster in DX9 mode?
Benchies for Crysis in DX9 would be much more useful for a user today since no VERY HIGH settings were used in DX10 mode. DX10 benchies will be useful when comparing the completely redesigned next-gen cards that *can* actually run Crysis in DX10 mode with VERY HIGH settings.

I also agree with Michaelius' point that one of the ATI participants in the comparo should also have been a tri-fire setup with a 3870 + 3870x2, since they would be more in the price range of the 9800gx2. Since SLI 8800s were used, I can't see why dual-card tri-fire couldn't have been included as well for fairness. If this setup was also pwned (which would most likely be the case, but not as badly with just one 3870x2), then there would be absolutely no doubt of NVidia's superiority.
 
Zomg. Also, btw Kyle, I like the new layout of the separate "Playable at <resolution>" sections. It makes much more sense than the previous layout, now that I think about it, since people tend to have a fixed resolution they will be playing at and do not want to go "non-native" for the sake of details.


As do I. The new layout is worlds better than before.

Excellent review, one of the best. I think the price is justified here with this new card, more so once you factor in NOT having to buy an SLI mobo, and the convenience that offers, etc.
But I know that many will never be satisfied with anything, so it doesn't matter in the end.

I may pick one up, but likely won't, as I am sure something considerably faster will hit the market later this year. Besides, my current card is plenty fast. I understand the name, but don't think it's warranted given what it is.
 
A lot of people would argue to no end about the GX2 being a "single card"; I'd use the term single "display adapter".

Technically, it is one video card, but it does use two PCBs.

I will just chime in on 2 things:

1) Why on earth would they call it a 9800 series? It is just the G80 shrunk down to "supposedly" increase performance and use less power. It's just a product refresh, not a new product, so to speak.

I have no idea. That's marketing people for you.

Shame on you Nvidia. I was hoping for another breakthrough product like when the G80s came out in 2006. I have a feeling that ATI will leave them in the dust at some point in the near future...

Get over it. MOST of the time, the performance increase from one generation to the next is only about as good as we are seeing today, if we are lucky. The gap between the 7950GX2 and the 8800GTX was a freak occurrence that happens only once in a while. As for ATI leaving them in the dust, I doubt it will happen anytime soon considering how far ATI currently is behind NVIDIA.

The drivers are not available for Quad SLI until the 25th. So we'll know then.

Exactly.

I wish someone would test it on a non-SLI Intel board....

Why? Do you think that would honestly make that much difference? The performance differences between Intel and NVIDIA chipset-based boards aren't that much, and this is especially true when GPU testing. Only at low resolutions would the difference be apparent.

Ok, I see some hostility around here most of the time towards certain hardware (for whatever reasons), but coming from the HardOCP mainboard dude this quite shocked me.

What is it that makes the 780i so terrible then? Is it the stability, life expectancy, amount of DOAs or overclockability? Would anyone who isn't going to overclock at all have to 'avoid' the 780i motherboards? Is there any reason for someone looking at a 780i+8800GT SLI setup to not buy it, when he doesn't care about overclocking?

The 680i SLI reference boards are pretty terrible, or at least a very large number of them are. Quality control seems inconsistent, and their reliability is low. The DOA rate seems much higher than that of other motherboards. Bear in mind I've owned 12 680i SLI chipset based boards, and 11 of those are 680i SLI reference boards. To this day, only one of them still works. The 12th board was an ASUS Striker Extreme, which finally died on me recently. Many people have to replace their 680i SLI boards every two or three months. They burn up, fry RAM and have occasional stability issues. Their voltage regulation is poor and they've still got a number of BIOS issues. The factory cooling solution on 95% of these boards is pretty crappy. Guess what? The 780i SLI chipset based reference boards are more of the same. I've worked with 780i SLI boards a few times now: one here at the [H], and a few of them in friends' systems because they upgraded or built new systems around them. Again, more of the same.

The 780i SLI reference boards use the exact same silicon as the 680i SLI MCP and SPPs do. The only difference is an added chip for PCIe 2.0 compatibility and additional PCI-Express lanes. The reference boards have a very similar design to that of the previous generation 680i SLI reference boards. They still use traditional caps for the most part instead of the newer solid caps, along with a 6-phase power design, while virtually every other motherboard maker producing enthusiast solutions has switched to 8-phase. The 780i SLI reference boards have now been in the hands of end users long enough for the RAM frying problems, BIOS issues, stability problems, high DOA rates and other associated problems to become apparent. I do not trash the current NVIDIA chipset based boards out of some hatred, malice or bias against NVIDIA. In fact, I staunchly defend NVIDIA as a company in multiple threads on multiple occasions. I've purchased a staggering number of their products for businesses I've worked for, as well as for my home and gaming boxes. On the surface, the feature set is comparable to Intel's best offerings, and you get SLI support as well as great overclocking (most of the time), and most of the boards have a decent layout. I am also quite fond of the BIOS options found on all of these boards. I've also given a number of other 600-series reference boards high praise in reviews. The 680i LT also got praise on this site (although not by me), and I've recommended the 680i LT boards to a number of people.

I'll be looking at the 790i Ultra SLI chipset based boards real soon. I've heard good things about them from Kyle, and I very much look forward to working with it. I had high hopes that the 780i SLI chipset was going to be everything the 680i SLI should have been but unfortunately that didn't happen. Since the silicon is the same that's no surprise. Now the 790i Ultra SLI has to be a new north bridge design because the memory controller supports DDR3. So we'll see how that goes.

I sure hope that Nvidia can do more with the 790i and DDR3, and while they're at it, fix some of the issues that have plagued their other products, i.e. poor PWM circuitry, heat and of course add-on board goodness. Then it might be worth looking at seriously.

Not a bad showing overall. Better than I expected, in fact, but the very thought of SLi still makes me cringe. Maybe I'm just stuck in the past, but I feel like NVIDIA still has a great deal to prove to customers regarding SLi, especially given that we've seen some less-than-pleasant happenings in the past with the GX2 moniker.

I'd like to see some single card vs. GX2 evaluations at some point in the future, but this article actually tells me all I need to make my 'no' decision. Maybe another evaluation will be able to turn that 'no' into a 'maybe start considering thinking about it' (if that isn't too optimistic ;) )

Still, this isn't the card NVIDIA should have launched. It's severely disappointing in that respect.

I can agree with some of your thoughts here, but not all of them. SLI isn't as bad as many people (who probably don't even use it) make it out to be. Unfortunately, they still don't have the whole multi-monitor and SLI-mode problem taken care of yet. After four years, this particular issue pisses me off quite a bit. As for the original 7950GX2, I can agree. It was a piece of shit. Not hardware-wise but driver-wise. NVIDIA further screwed the pooch when it came to Vista drivers and the 7950GX2.

So far things seem better this time around.

As for this not being the card NVIDIA should have delivered, I could not disagree with you more. The card is fine but people's expectations are way out of whack.
 