BFGTech GeForce 9600 GT OC @ [H]

I'm really thinking of purchasing this card, maybe even SLI-ing two of them. Just got my tax refund, so I may pull the trigger. Any news on a 9800GTX, 9800GT, or 9800GTS? I know there is a 9800GX2 coming soon.
 
It looks like a better value than the 8800GT to me. It seems pretty common for users to get aftermarket cooling for those cards because they run so hot, which can add $20-$50 to the price depending on the cooler. I was most surprised by how cool the 9600GT ran...no need for an aftermarket cooler, even for some OC'ing. Seeing that you can probably get close to 8800GT performance with an OC, this card seems like easily the best bang for the buck out there, especially if e-tailers don't jack up the price.


Low power usage + low heat output + great performance in its price class = win in my book. :cool:
 
Dugg

And I'm confused: it has almost 50% of the shader power of the 8800GT, yet somehow it's not far behind it? Does that mean the cards are limited by something else? ROPs? Texture units? What is it ><? It's confusing!

Value-wise it's not worth it IMO! 8800GT cards are hovering in the low $200s, sometimes below $200 for the Galaxy models :p Unless these cards are priced at $150!
 
Finally some nice GPUs under $200 that can play virtually anything. Let's hope it stays that way for a few years! For too long we've had to choose between spending $400 and up or lackluster gaming.

I think if you never plan to xfire/SLI, this is "the card". However, due to (IMO) better Intel motherboards, if you are planning on multiple GPUs, or might possibly go that path, I'd choose the 3870.

Multiple GPU = 3870 (for the motherboard)
One GPU = 9600GT

My $.02, and nice review (can't find anything to nitpick) :(
 
I'm at work so I have only had time to skim the review, but is there any mention of HDTV support? I have an HTPC with an ATI 9600XT that fails at S-Video and at VGA at 1360x768 (my TV's max res), and has problems with RGB at 1080i or 720p.

If the NVIDIA does HDTV output well, I'd love to replace my ATI 9600 with an NVIDIA 9600.
 
Finally some nice GPUs under $200 that can play virtually anything. Let's hope it stays that way for a few years! For too long we've had to choose between spending $400 and up or lackluster gaming.

I think if you never plan to xfire/SLI, this is "the card". However, due to (IMO) better Intel motherboards, if you are planning on multiple GPUs, or might possibly go that path, I'd choose the 3870.

Multiple GPU = 3870 (for the motherboard)
One GPU = 9600GT

My $.02, and nice review (can't find anything to nitpick) :(
Where do you place the 8800GT in that decision?
 
I'm at work so I have only had time to skim the review, but is there any mention of HDTV support? I have an HTPC with an ATI 9600XT that fails at S-Video and at VGA at 1360x768 (my TV's max res), and has problems with RGB at 1080i or 720p.

If the NVIDIA does HDTV output well, I'd love to replace my ATI 9600 with an NVIDIA 9600.


Should be no problem plugging this card into an HD set. It has the "PureVideo HD" technology built into the GPU, so acceleration is there for HD DVD or Blu-ray as well.
 
Where do you place the 8800GT in that decision?

I don't. I'm guessing that the o/c will get you around 8800GT speeds, so IMO (read: my opinion) why bother? Less heat, smaller footprint, less power, and almost the same performance once o/c'd.
 
I don't. I'm guessing that the o/c will get you around 8800GT speeds, so IMO (read: my opinion) why bother? Less heat, smaller footprint, less power, and almost the same performance once o/c'd.
I don't think you can get the same performance, however, since you have so much "less silicon," as described above.
 
Probably right... but that was a kinda/sorta huge overclock, and stable at that. Still, it might be in the ballpark if you extrapolate the data. Of course Brent will let us know at a later date, so we'll see.
 
Probably right... but that was a kinda/sorta huge overclock, and stable at that. Still, it might be in the ballpark if you extrapolate the data. Of course Brent will let us know at a later date, so we'll see.
Definitely looking forward to that article Brent!! :D
 
What overclocking utility is used in [H] reviews? I don't recognize it :confused: That's not Coolbits, is it?
 
Kyle:

You said you will give it the gold award if you see it at $169 or lower next week. Do you think the quantities of these cards are enough that the price can drop that fast? I'm really interested in it, and if it's going to drop $10 within a week, I'd rather hold off and wait until next week.
 
Kyle:

You said you will give it the gold award if you see it at $169 or lower next week. Do you think the quantities of these cards are enough that the price can drop that fast? I'm really interested in it, and if it's going to drop $10 within a week, I'd rather hold off and wait until next week.


I don't know (we get continually lied to about how great supply is to the point I don't listen anymore), but I know if I had to buy a card today, I would very likely buy that $199 GT from MSI. If you wait, I think you will get a much cheaper 9600 in the coming weeks.
 
I can't find anything in nTune that allows me to overclock (hence the confusion). Where exactly is it? I've installed it, and when I go to "Change overclocking configuration" all I get is statistics.
 
Well, if they are selling on the Egg for $179 at release, I'd expect to see something at $169.
 
How loud is it? No review I've seen so far seems to mention it. As I am completely unsatisfied with the loudness of my 7600GT, will it be quieter than that?
 
Nice review!

If the price were lower on these, I would have purchased one this morning. Currently, it's too close to the 8800GT. At $150-160, this would be a great mid-range card.
 
Keep in mind that our numbers will be higher than what readers see in gameplay situations, as physics, AI, and other overhead will come into play.

Let's play guess the website this line is quoted from... Then again, let's not. I just found this one sentence very interesting in light of a recent article here on the [H].
 
I'm glad I waited before jumping on an 8 series card. Now I know where my tax return money is going. :D
 
XFX, if you can see this :D, sell us an XXX version at the $189 price point and I'll consider buying two.

Excellent review, guys; makes my day to see some new hardware spankin' some tush!
 
Hey, am I correct that the CPU you used is about $1000+ retail?

Why do you guys test with a $1000+ CPU only? I wonder what percentage of people on the forum have a CPU on par with the X6800, and what percentage have something more in the $200-300 range? Just curious what impact the CPU has on your tests, and if you provided tests with two different CPUs, that might be helpful.

Enjoyed the read though, thanks. :)
 
It didn't make any noise that was worth noting.

Either you guys have serious earwax problems or I've got "golden eardrums," because every card you label as "silent/quiet" I find annoyingly loud. The 8800GTS was an exception to this rule. Too much benching of "dust busters"? :D

Anyway, this 9600GT seems like a killer card; too bad PC gaming is going through its dark ages. If Peter Molyneux thinks so, it must be true :p
 
Hey, am I correct that the CPU you used is about $1000+ retail?

Why do you guys test with a $1000+ CPU only? I wonder what percentage of people on the forum have a CPU on par with the X6800, and what percentage have something more in the $200-300 range? Just curious what impact the CPU has on your tests, and if you provided tests with two different CPUs, that might be helpful.

Enjoyed the read though, thanks. :)

We are evaluating a video card, therefore we don't want the CPU to be a bottleneck. That is why we use high-end CPUs.
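For what it's worth, here's a toy way to picture that reasoning; the numbers below are made up purely for illustration and are not from the review:

```python
# Toy model of why reviewers pair a test GPU with a fast CPU: the frame rate
# you actually measure is capped by whichever component is slower.
def observed_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """Return the frame rate the benchmark would report."""
    return min(cpu_fps_limit, gpu_fps_limit)

# Midrange CPU: the CPU cap hides what the video card can really do.
print(observed_fps(cpu_fps_limit=60, gpu_fps_limit=90))   # 60

# High-end CPU: the GPU becomes the limiter, so the card itself is what gets measured.
print(observed_fps(cpu_fps_limit=200, gpu_fps_limit=90))  # 90
```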
 
Why do you guys test with a $1000+ CPU only? I wonder what percentage of people on the forum have a CPU on par with the X6800, and what percentage have something more in the $200-300 range? Just curious what impact the CPU has on your tests, and if you provided tests with two different CPUs, that might be helpful.

Both to remove the CPU bottleneck and because a very good percentage of the people on this board have this or greater CPU power. The $1000 X6800 is 2.93GHz...my $266 E6850 is 3.0GHz, and they're both Core 2 Duos so they're directly comparable. ;) Lots of other people have an E6600, E6750, E8xxx, or Q6600 that they've overclocked to 3.0GHz or above, again directly comparable (given that most games don't really take advantage of quad cores right now).
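Quick back-of-the-envelope on those two chips, just to illustrate the point (same architecture, so clock speed is the only variable here):

```python
# Same-architecture comparison: two Core 2 Duos, so clock ratio ~ performance ratio.
x6800_ghz = 2.93   # $1000 Core 2 Extreme X6800
e6850_ghz = 3.00   # $266 Core 2 Duo E6850

ratio = e6850_ghz / x6800_ghz
print(f"E6850 / X6800 clock ratio: {ratio:.3f} (~{(ratio - 1) * 100:.1f}% difference)")
# -> roughly 1.02, i.e. about a 2% gap -- negligible, especially at GPU-limited settings.
```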
 
We are evaluating a video card, therefore we don't want the CPU to be a bottleneck. That is why we use high-end CPUs.

That makes sense, but then the article still doesn't completely answer the question of how the video card will perform for me. ;)
 
Hey, am I correct that the CPU you used is about $1000+ retail?

Why do you guys test with a $1000+ CPU only? I wonder what percentage of people on the forum have a CPU on par with the X6800, and what percentage have something more in the $200-300 range? Just curious what impact the CPU has on your tests, and if you provided tests with two different CPUs, that might be helpful.

Enjoyed the read though, thanks. :)

Exactly what I was gonna post. A $200 card in a system already composed of over $1500 in other parts? Don't think so. Most will have, at minimum, an 8800 GTS 512MB. If this was tested fairly on an E6400, or even the E8400, with other midrange components, I think this card in general TANKS. Raptor drives? Dominator memory? No, sorry, this just doesn't fit the bill at all. :eek:
 
Both to remove the CPU bottleneck and because a very good percentage of the people on this board have this or greater CPU power. The $1000 X6800 is 2.93GHz...my $266 E6850 is 3.0GHz, and they're both Core 2 Duos so they're directly comparable. ;) Lots of other people have an E6600, E6750, E8xxx, or Q6600 that they've overclocked to 3.0GHz or above, again directly comparable (given that most games don't really take advantage of quad cores right now).

Interesting... so I guess my 6000+ at 3GHz is pretty comparable too? Man, I find the CPU naming/speed/marketing so confusing! Why is a $1000 CPU comparable to a $200 one?
 
We are evaluating a video card, therefore we don't want the CPU to be a bottleneck. That is why we use high-end CPUs.


That's fine and all. But the fact remains: most people are not going to run a midrange card in a HIGH-END system. This card in a midrange system would give people the REAL-WORLD performance. That's what is preached here, isn't it??

I am not starting an argument here; just test the card in what it's really going to be in. :)
 
That's fine and all. But the fact remains: most people are not going to run a midrange card in a HIGH-END system. This card in a midrange system would give people the REAL-WORLD performance. That's what is preached here, isn't it??

I am not starting an argument here; just test the card in what it's really going to be in. :)

I agree. Not trying to dump on the review, just wondering how accurately it really reflects typical usage of the card. He's right - who would buy a $1000 CPU and then a <$200 video card? But then, maybe this isn't a problem with the review, but more of a problem with how they price CPUs of comparable performance.

;)
 
I don't understand why you guys run Crysis in DX10 mode. Whenever a post in your gaming section deals with Crysis, Crysis gaming performance, or Crysis configs, DX9 is always recommended. I would go out on a limb and say a majority of your core readers don't find your Crysis numbers particularly useful.

I believe you guys stated that you run Crysis in DX10 mode because it defaults to that mode out of the box. Well, that doesn't seem to square with what your [H] reviews are about: trying to enable the most eye candy while keeping an acceptable frame rate (since DX9 offers more eye candy with less of a performance penalty). So what's the deal? Why do you guys test Crysis in DX10 when your readers most likely don't play the game with that API, and it's not very [H]ard of you to use inferior default settings?

Maybe because DX10 offers better eye candy than DX9, no matter how much you crank DX9. There is no comparison between the two. DX10 simply looks better.
 
Interesting... so I guess my 6000+ at 3GHz is pretty comparable too? Man, I find the CPU naming/speed/marketing so confusing!

No, you can't directly compare an AMD K8 CPU with an Intel Core 2 Duo CPU. I believe that your 6000+ at 3GHz is pretty comparable to a Core 2 Duo E6600 at its stock 2.4GHz.

Why is a $1000 CPU comparable to a $200 one?

X6800 = Original Core 2 Duo high-end; mid-2006; 65nm, 1066 FSB; $1000 MSRP
E6850 = Highest-end C2D at mid-2007 refresh; 65nm, 1333 FSB; $266 MSRP
E8400 = Second-to-highest C2D early-2008 refresh; 45nm, 1333 FSB; $183 MSRP

All three are roughly 3GHz.

Tech improves, prices drop, it's the way of the (tech) world. It's like asking why an AM2 X2 6000+ is cheaper and faster than the 939 X2 3800+ at launch (which I got for $350 in late 2005).

Not to mention that 3.0GHz has been a relatively easy overclock for most C2Ds since launch (when even the 2.4GHz E6600 cost more than my E6850 did when I got it).
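A rough sketch of why raw clock speed only works as a yardstick within the same architecture; the per-clock factor below is an assumed ballpark inferred from the "6000+ at 3GHz ~ E6600 at 2.4GHz" comparison above, not a measured number:

```python
# Illustrative only: comparing across architectures needs a per-clock (IPC) factor.
K8_PER_CLOCK  = 1.00   # AMD K8 as the baseline
C2D_PER_CLOCK = 1.25   # assumption: Core 2 does ~1.25x the work per clock of K8

amd_6000_plus = 3.0 * K8_PER_CLOCK    # X2 6000+ at 3.0GHz
e6600_stock   = 2.4 * C2D_PER_CLOCK   # E6600 at its stock 2.4GHz

print(f"Relative scores -- 6000+: {amd_6000_plus:.1f}, E6600: {e6600_stock:.1f}")
# Both land around 3.0, which is why they're called roughly comparable, while
# same-family chips (X6800 vs. E6850) can be compared on clock speed alone.
```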
 