BFGTech GeForce 9600 GT OC @ [H]

That's fine and all. But the fact remains, most people are not going to run a midrange card in a HIGH END system. Testing this card in a midrange system would give people the REAL WORLD performance. That's what is preached here, isn't it?

I am not starting an argument here, just suggesting they test the card in the kind of system it's really going to end up in. :)
That's a bad idea. Real-world gameplay testing accepts in-game variables as PART of the test, but a CPU limit would confound your results.
The CPU controls things like physics that also affect framerates. You want those to be non-issues; otherwise they can cause false framerate dips.

Say a tank explodes in Crysis. You have two things going on:
A: The CPU is controlling the physics of the pieces.
B: The GPU is controlling the display of the flames, lighting, shadows, etc.

If there is a CPU limit on the physics part, the framerates can be artificially lowered, making you think that the card cannot handle the lighting, HDR, shadows, etc. of the scene, thus lowering the perceived performance.
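To make the confound concrete, here's a toy frame-time model. This is purely illustrative; the numbers and the simple max() pipeline are my own assumptions (real engines overlap CPU and GPU work), but it shows how a CPU spike caps framerate no matter how fast the card is:

```python
# Toy model: each frame takes max(cpu_ms, gpu_ms) when the CPU and GPU
# run in lockstep. A physics spike on the CPU caps the framerate
# regardless of how fast the GPU renders.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_gpu_ms = 8.0   # hypothetical high-end card, ms per frame
slow_gpu_ms = 14.0  # hypothetical midrange card, ms per frame

# Normal scene: the CPU is cheap (5 ms), so the GPU gap shows up.
print(fps(5.0, fast_gpu_ms))   # 125.0 FPS
print(fps(5.0, slow_gpu_ms))   # ~71.4 FPS

# Tank explosion: physics pushes CPU time to 25 ms. Both cards dip
# to 40 FPS, a false dip that says nothing about the GPUs.
print(fps(25.0, fast_gpu_ms))  # 40.0 FPS
print(fps(25.0, slow_gpu_ms))  # 40.0 FPS
```

In other words, the dip measures the CPU spike, not the card's lighting or shadow performance.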



MAN, did people not read the whole real-world gameplay article at all?
 
Using a high end CPU in their reviews makes sense for several reasons.

1) It keeps the rest of the box consistent when comparing video cards. This way you can look at the reviews for an 8800GTX and a 9600GT and know that the performance difference you see is strictly the video card, not the CPU or anything else.

2) It strips the performance down to just the video card. If you had a lower-end CPU in there, it would complicate things greatly. You might end up seeing misleading information like "well, card X and card Y both performed the same in game Z! Looks like they are very comparable cards!" when really the game happened to run into CPU limitations, which made the video card a moot point. You could run an 8800 Ultra against a 3870 and say "look, they have the same performance in Crysis!" if you were limiting your performance with your CPU.

3) They like to play with expensive toys.
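Point 2 can be sketched with a toy benchmark model. All card names and timings here are made up for illustration, and it assumes each frame costs max(cpu_ms, gpu_ms), which is a simplification of a real pipeline:

```python
# Toy benchmark: average FPS over a short run, where each frame
# costs max(cpu_ms, gpu_ms). Per-frame GPU times are hypothetical.
def avg_fps(cpu_ms, gpu_frame_times_ms):
    total_ms = sum(max(cpu_ms, g) for g in gpu_frame_times_ms)
    return 1000.0 * len(gpu_frame_times_ms) / total_ms

card_x = [10, 12, 11]  # faster hypothetical card (ms per frame)
card_y = [16, 18, 17]  # slower hypothetical card (ms per frame)

# With a slow CPU (20 ms/frame) both cards "score" 50 FPS, which
# misleadingly suggests they are comparable.
print(avg_fps(20.0, card_x), avg_fps(20.0, card_y))  # 50.0 50.0

# With a fast CPU (5 ms/frame) the real gap between the cards appears.
print(avg_fps(5.0, card_x), avg_fps(5.0, card_y))    # ~90.9 vs ~58.8
```

Which is exactly why keeping the CPU out of the way keeps the comparison about the cards.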

On another note, this review makes me feel like an idiot for buying a 3870 a few days ago.

HOWEVER, I do think there is some merit to looking at the performance of more realistic systems, just not in a video card review. Maybe [H] could build a few boxes and benchmark entire setups. For example, a midrange Core 2 Duo system with a very common S775 motherboard (ASUS, abit, whatever has 500+ reviews on Newegg), very common RAM (Corsair, Mushkin, again whatever has 500+ reviews on Newegg), and so on down the line until you've built your "common gamer box."

That's about as real world as you can get, and you could also use it to help people make even more educated decisions. Maybe, for example, it would show that somebody running a midrange Core 2 Duo box actually doesn't get much extra performance out of an 8800 Ultra compared to an 8800GT, because the rest of the box limits it.

Just a thought.
 
Using a high end CPU in their reviews makes sense for several reasons.

1) It keeps the rest of the box consistent when comparing video cards. This way you can look at the reviews for an 8800GTX and a 9600GT and know that the performance difference you see is strictly the video card, not the CPU or anything else.

Thanks for that... this point kind of makes sense to me.
 
X6800 = Original Core 2 Duo high-end; mid-2006; 65nm, 1066 FSB; $1000 MSRP
E6850 = Highest-end C2D at mid-2007 refresh; 65nm, 1333 FSB; $266 MSRP
E8400 = Second-to-highest C2D early-2008 refresh; 45nm, 1333 FSB; $183 MSRP
All three are roughly 3GHz.
Not to mention that 3.0GHz has been a relatively easy overclock for most C2Ds since launch (when even the 2.4GHz E6600 cost more than my E6850 did when I got it).


So basically you're saying everyone here and out there is running 3GHz? Sorry, I haven't seen that mark on my own computer, usually around 2.5-2.6GHz. Valve's own survey doesn't support that argument at all. We're saying we have seen them bench different items on systems like an E6300 before, and that's where this card should be. Not with Raptor drives or Dominator memory; a regular midrange system that Joe Blow is gonna have. There are a lot more of those than any 3GHz machines, thank you.
 
So basically you're saying everyone here and out there is running 3GHz?

No, just here. This is [H]ard|OCP, a hardware enthusiasts' site, remember? ;) Look at all the sigs. I have no doubt that outside of these hallowed walls most people do not overclock. This site is geared towards those who do, though, or at least those who upgrade from time to time (it's pretty cheap to get a 3GHz stock C2D now with the E8400, or even the E6850 before it).

The main point of my comparison, though, was to answer xappie's question of how a $200 CPU could be comparable to a $1000 CPU. The answer is that technology evolves and gets cheaper over time, as my progression shows.
 
So basically you're saying everyone here and out there is running 3GHz? Sorry, I haven't seen that mark on my own computer, usually around 2.5-2.6GHz. Valve's own survey doesn't support that argument at all. We're saying we have seen them bench different items on systems like an E6300 before, and that's where this card should be. Not with Raptor drives or Dominator memory; a regular midrange system that Joe Blow is gonna have. There are a lot more of those than any 3GHz machines, thank you.
The E6300 is the CPU used for mobo reviews, not video card reviews.

You have to keep in mind WHY certain items are used. The E6300 is used in the mobo reviews because it is a chip [H] knows will OC well and consistently. Also, it is a very popular CPU.
Second, refer to my last post for why you want to keep everything but the video card a non-issue, i.e. to prevent confounding your numbers and performance.
 
1) It keeps the rest of the box consistent when comparing video cards. This way you can look at the reviews for an 8800GTX and a 9600GT and know that the performance difference you see is strictly the video card, not the CPU or anything else.

2) It strips the performance down to just the video card. If you had a lower-end CPU in there, it would complicate things greatly. You might end up seeing misleading information like "well, card X and card Y both performed the same in game Z! Looks like they are very comparable cards!" when really the game happened to run into CPU limitations, which made the video card a moot point. You could run an 8800 Ultra against a 3870 and say "look, they have the same performance in Crysis!" if you were limiting your performance with your CPU.

That's a bad idea. Real-world gameplay testing accepts in-game variables as PART of the test, but a CPU limit would confound your results.
The CPU controls things like physics that also affect framerates. You want those to be non-issues; otherwise they can cause false framerate dips.

The whole point at the [H] is real world gameplay; this contradicts your posts. In the real world, people don't have a $1000 CPU for physics and then purchase a $180 card. I too think a card should be reviewed more or less on the kind of system it's likely going to go into. THAT's a real world eval.

Edit: You edited your post to reflect the above

That said, this is still the best review system out there and a better review than I could ever do.
 
The whole point at the [H] is real world gameplay; this contradicts your posts. In the real world, people don't have a $1000 CPU for physics and then purchase a $180 card. I too think a card should be reviewed more or less on the kind of system it's likely going to go into. THAT's a real world eval.

No, but most are running chips near 2.6-3GHz. And don't confuse real-world with unscientific. You can still control as many variables as possible, and you HAVE to work hard not to confound your data or performance numbers with false negatives.
And you don't have to have a high end CPU for physics, just a multi-core one.
That said, this is still the best review system out there and a better review than I could ever do.
Amen, [H] ftw, dugg
 
Looks to be a nice card... assuming it's sold in its MSRP range of $169-$189.
At $170, that will be a nice bang-for-buck card.

The BFG 9600GT looks too expensive at anything over $200.
 
Hi, is this a big boost over what I currently have, or should I go for a GT? Thanks.
 
The whole point at the [H] is real world gameplay; this contradicts your posts. In the real world, people don't have a $1000 CPU for physics and then purchase a $180 card. I too think a card should be reviewed more or less on the kind of system it's likely going to go into. THAT's a real world eval.

Edit: You edited your post to reflect the above

That said, this is still the best review system out there and a better review than I could ever do.
I have to agree. People who buy budget minded cards have budget minded systems, so to say this represents real world usage of the card by the budget minded user is incorrect. I always wondered why these cards are tested on maxed-out systems. In the real world, anyone who comes close to owning the parts in this system is probably not buying this card. Just an observation. How about having three test systems? One budget / one midrange / one high-end, and run the card through ALL the systems. That would be a "real world" evaluation of the video card, not running a budget minded card in a high-end system.
 
This is a very nice evaluation; thanks. I wish you had also run an FPS test on all four cards using the same settings for all of the games, as "reviews" do. I realize you think it's more important to pick a playable FPS and try to max out the video quality as much as possible before gameplay suffers, hence the "evaluation" nature.

I have a 2.5 y/o 6800GT video card that needs to be upgraded soon. Even though I don't game much, I'm looking at a sub-$200 video card in the spring/summer time frame. It could very well come down to the 9600 GT or the 8800 GT (512 MB).
 
I have to agree. People who buy budget minded cards have budget minded systems, so to say this represents real world usage of the card by the budget minded user is incorrect. I always wondered why these cards are tested on maxed-out systems. In the real world, anyone who comes close to owning the parts in this system is probably not buying this card. Just an observation. How about having three test systems? One budget / one midrange / one high-end, and run the card through ALL the systems. That would be a "real world" evaluation of the video card, not running a budget minded card in a high-end system.

I know you would like to see that stuff... but could you show me an example where there is a serious difference in results between a lower end machine and a higher end machine once the graphics card is pushed to its limits? It might be worth the effort to do it yourself on your own machine and see why maybe it is not typically done. Before requesting a change, you may wish to do a little legwork and try to validate your idea before pushing it on others.
 
Hehehehe. You obviously failed to understand the point. But that's OK; to each their own.

How can I possibly show you the results of such a situation if I haven't seen one to share? The point is, if this is a "real world" review, then let's apply the card to the "real world" users who will be purchasing said card. If I had the money to do what you suggested, I wouldn't be here; I would be running my own hardware site. Didn't realize a "suggestion" could turn into "do it yourself, Jack"... Nice.
 
Wow. A $170-190 card that makes the similarly priced 3850 and the slightly to somewhat more expensive 3870 look like a waste of money? Yeah. Just, wow. Great review as usual, fellas. :)
 
I have to agree. People who buy budget minded cards have budget minded systems, so to say this represents real world usage of the card by the budget minded user is incorrect. I always wondered why these cards are tested on maxed-out systems. In the real world, anyone who comes close to owning the parts in this system is probably not buying this card. Just an observation. How about having three test systems? One budget / one midrange / one high-end, and run the card through ALL the systems. That would be a "real world" evaluation of the video card, not running a budget minded card in a high-end system.

All that would show is when the CPU becomes the limiting factor. This wasn't a system review, CPU review, etc... it was a VIDEO CARD REVIEW. All I care about is how the CARD performs, not at what point the CPU becomes a bottleneck. If you want to see how CPU scaling affects video cards, go read the CPU scaling article where this exact issue is dissected.

Now please, shut up about the CPU choice; WHO CARES? If someone buys a $200 video card to pair with an ancient CPU then they are an idiot. Any recent dual-core (X2 or C2D) at 2.5-2.8GHz+ won't be CPU limited, so this whole "discussion" is moot.

Back to the topic at hand...

I wonder how this would compare to the 8800GS, which is cheaper and has more SPs at the cost of RAM bandwidth and size... Anyone have any ideas?
 
All that would show is when the CPU becomes the limiting factor. This wasn't a system review, CPU review, etc... it was a VIDEO CARD REVIEW. All I care about is how the CARD performs, not at what point the CPU becomes a bottleneck. If you want to see how CPU scaling affects video cards, go read the CPU scaling article where this exact issue is dissected.

Actually, it would also serve another useful purpose. Namely, it would answer the question: "If I have xyz CPU, what's the highest-end video card purchase I could make that wouldn't get bottlenecked by the CPU?", or the related, "Is it worth it for me to upgrade to this graphics card given that I have xyz CPU?". So it would be a useful and valuable evaluation...just not a pure video card review anymore. And given how intensive the [H] testing process is, I doubt they could do a GPU/CPU pair evaluation in addition to a pure GPU evaluation.

I wonder how this would compare to the 8800GS, which is cheaper and has more SPs at the cost of RAM bandwidth and size... Anyone have any ideas?

From what I've heard, the 9600 GT beats the 8800 GS, albeit not by much. I forget where I heard that though.
 
So basically you're saying everyone here and out there is running 3GHz? Sorry, I haven't seen that mark on my own computer, usually around 2.5-2.6GHz. Valve's own survey doesn't support that argument at all. We're saying we have seen them bench different items on systems like an E6300 before, and that's where this card should be. Not with Raptor drives or Dominator memory; a regular midrange system that Joe Blow is gonna have. There are a lot more of those than any 3GHz machines, thank you.

Neither a Raptor nor Dominator memory is going to make a difference in the gameplay. The only time a Raptor might do something is if you are constantly paging and have to read/write to the hard drive. If that is the case, then you should probably look at getting more RAM, as that's going to be more of a bottleneck than any GPU. The Dominator memory won't matter because it's running at speeds much lower than what it's rated for. The only time it would come into play is in getting guaranteed RAM speeds for overclocking. That RAM is running 533 or 667, just like your RAM probably is.

The simple fact is that the 9600GT is a faster card than everything in the review except for the 8800GT. Getting rid of the bottlenecks shows this. If you upgrade or overclock your processor anytime soon, you might be able to take advantage of the processing power of the 9600GT if you purchased one. If you purchased a 3850 instead, you would see less of a difference, or none at all.

Most games are still GPU limited at anything other than low resolutions. You should be able to figure out if that's the case for you or not. Enough people have commented on the performance of games with different CPUs to give you a damn good idea of how your CPU will perform.

Also, it takes a long time and a lot of work to produce a review like this. If you wish to pay for the extra testing (the hardware, paying the person doing the work as well as the bandwidth costs associated with hosting the article) I'm sure Kyle would happily find a way to have it done.

Final point: not every article is going to satisfy everyone. You happen to be one of the people who can't be satisfied, and there's nothing Kyle can do about that. Kyle sure as hell isn't matching my system configuration, and you don't find me complaining about it. Last week I had an [email protected], 2 gigs of RAM and a 7600GT. This week I'm running a [email protected], 4 gigs of RAM and an 8800GT. There hasn't been a review in a while with a 7600GT in it, so according to you I should be bitching because there wasn't a 7600GT in the reviews; otherwise there would be no way to know how well my current setup compares to the setup they use. I just use some simple logic and realize that the newer cards run better than my old 7600GT. It's no surprise, and no matter what, I would get a performance boost on the old system with a new video card.

 
I think some of the responses are getting a little over the top considering some of us are just making suggestions... Isn't that what a forum is for, to discuss and suggest? At least SmokeRngs has the sense to summarize the +'s and -'s instead of slamming you as an idiot, etc.
 
WOW, the misinformed are now stating facts. Seriously, what do most of you really know about what people would buy? You know why a person who spends $1000 on a CPU spends $200 on a video card? Because all he had was $1200 to budget. Some of these arguments are showing posters' ages or comprehension levels. A person who would spend a lot on one component isn't always gonna spend a lot on the next. That's just pathetic logic.

Also, who cares about buying decisions when you need to remove variables from the testing procedure? Everyone else has already explained that aspect, so I won't get into it.

This video card is impressive. I kinda like NVIDIA's new process of releasing lower end cards first before their high end parts debut. I do, however, want to echo some of the issues others have already posted about using DX10 in Crysis. If all DX10 does is allow you to use the ultra-high settings, and none of the cards [H] has reviewed yet can even use those options, why the F are they still using the 64-bit DX10 version? It can't be due to reader interest, as I'm pretty sure more than half of us [H]'ers (>50%, i.e. 51, 52, 60%) are still using XP for gaming. So that is a little weird.

Another weird thing: what idiot would try to SLI two $180 video cards? Just buy an Ultra. Some people and their money are soon parted; I forget what they used to call them... But I mean, if you read this review and said, "Word! I'm gonna SLI two 9600GTs!", did you know that faster single-GPU cards exist? Even an HD 3870 X2 is a better move than a regular SLI setup with two 9600GTs.
 
No SLI numbers?

No. It was my decision not to put in SLI numbers due to buying patterns we have seen in the past. We just don't see people buying two sub-$200 cards at a time to SLI when they can put $400 into a single card solution. - Kyle

I disagree... preliminary tests done on the 9600GT showed that the cards in SLI were faster than an 8800 Ultra. For $360, that would be impressive.
 
I'm really amazed by the amount of bitching going on in this thread. The review tells you exactly what the card's limits are and its power in comparison with similar cards, all without allowing the card to be limited by the CPU, etc. I even get FPS-over-time graphs in this review! (Personally I think these are incredibly powerful in showing what's really going on!) Great work on the review, btw.
 
I disagree... preliminary tests done on the 9600GT showed that the cards in SLI were faster than an 8800 Ultra. For $360, that would be impressive.

I would love to see the proof in the form of a link to a reputable website (i.e. NOT T.H., the Inq, <anything>.blogspot, etc.). Also, I want to tone down my initial disdain for SLI setups until I see real SLI numbers.
 
Good looking card. I'm interested in actual prices in another couple of weeks, especially with the 3870 price drop.
 
Very nice review. The price isn't in the right bracket, in my opinion.
The 8800GT is still a better bang for the buck indeed.
 
LOL, I should've asked this in my previous post but I was rushing to class: which is quieter, this or the 8800GT? As you can tell, acoustics are very important to me.
 
Funny that right below the blurb about this review is an entry advertising BFG's GT OC card (which this review was based on) with an MSRP of $229.99!

That's an outrageous MSRP. A comparable MSRP from eVGA with similar clocks is $189.99. eVGA is still getting my money.
 
Funny that right below the blurb about this review is an entry advertising BFG's GT OC card (which this review was based on) with an MSRP of $229.99!

That's an outrageous MSRP. A comparable MSRP from eVGA with similar clocks is $189.99. eVGA is still getting my money.

Yeah, the BFG pricing is a little nuts. On the 8800GT, you can buy one in Best Buy at full retail for about what all the other cards were selling for online (although the online prices have finally started to come down). This massive gap on the 9600 just doesn't make sense.
 
I would love to see the proof in the form of a link to a reputable website (i.e. NOT T.H., the Inq, <anything>.blogspot, etc.). Also, I want to tone down my initial disdain for SLI setups until I see real SLI numbers.

http://www.nordichardware.com/news,7275.html
This one has only 3DMark scores; however, those tests show that two 9600GTs in SLI are faster than an 8800 Ultra.

http://en.hardspell.com/doc/showcont.asp?news_id=2561
This one has benchmarks; however, it is unclear what exactly they put it up against.
 
I'm really amazed by the amount of bitching going on in this thread. The review tells you exactly what the card's limits are and its power in comparison with similar cards, all without allowing the card to be limited by the CPU, etc. I even get FPS-over-time graphs in this review! (Personally I think these are incredibly powerful in showing what's really going on!) Great work on the review, btw.
QFT / Awesome evaluation

I really enjoy the subjective impression I form looking at the FPS graphs. Ultimately, the "highest playable settings" are exactly what I care about when contemplating a card.

I guess the only thing left to whinge about is introductory pricing being higher than MSRP, but consumers usually tolerate paying a premium for the privilege of getting the latest new thing right F'n now. I really can't blame the distribution channel for making a few bucks; that's just smart business.

For those building a new system this weekend, an 8800GTS is the smart buy because the gap is so small. In a month or so the 9600 will become attractive. I may wait a bit and pick one of these up to add a pair of rotated 19" 1680x1050 displays on either side of my 3007WFP :D
 
eVGA offers a lifetime warranty. They are local, so when I sent out an RMA it was done real quick and easy. BFG overcharges for their cards all the time.
 
Rather impressive for what it is; this brings high hopes for the 9800 series.
BTW, nice review too.


Agreed! Excellent card at $169; at $189 and up, not so much. Still, I picked up two of them today at NCIX for $179 CDN each, both with decent OCs on them, from eVGA. :)

One for the wife's system, an upgrade for her 7800GT, and the other as part of a complete system I am building for a friend from her work.
 
Yeah, I've seen 8800GTS 512MBs going for $260 on Newegg. Personally though, I'll likely be stepping up to a 9800GX2, as long as it's not more than half an inch larger than an 8800GTX/Ultra (even being the size of the 8800GTX/Ultra will make fitting it in the case an interesting process, but I happen to have an empty drive bay where the GPU is and 11" up until that point, and the power connector can fit in through the drive bay if it has to) and it hits its rumored launch date. Otherwise, I'll probably step up to an 8800GTS 512MB (unfortunately, eVGA doesn't seem to allow you to step up to OCed variants, otherwise the new 8800GTS 512MB SSC edition would be appealing), just because I have the step-up sitting around and it would cost me almost nothing to do it.
 
If NVIDIA keeps this up on the graphics side, along with Intel on the CPU side, AMD won't be around much longer. Does AMD even offer anything of value at this point compared to an Intel or NVIDIA product? Keep in mind this is coming from a guy still rocking an AMD64 3000+ and looking to make an upgrade soon.
 
Actually, it would also serve another useful purpose. Namely, it would answer the question: "If I have xyz CPU, what's the highest-end video card purchase I could make that wouldn't get bottlenecked by the CPU?", or the related, "Is it worth it for me to upgrade to this graphics card given that I have xyz CPU?". So it would be a useful and valuable evaluation...just not a pure video card review anymore. And given how intensive the [H] testing process is, I doubt they could do a GPU/CPU pair evaluation in addition to a pure GPU evaluation.

That was already done in the form of the CPU scaling article; go read it. Leave this thread for the 9600GT, not "where is the bottleneck in my system"; start a different thread for that.

I'm really surprised how close the 9600GT was to the 8800GT, though, given the drastic reduction in SPs... Anyone have any ideas where the bottleneck is on G92s, then? I mean, if you cut the SPs by ~40%, it would be logical to expect about that large of a performance loss, but it isn't there...
 
The whole point at the [H] is real world gameplay; this contradicts your posts. In the real world, people don't have a $1000 CPU for physics and then purchase a $180 card. I too think a card should be reviewed more or less on the kind of system it's likely going to go into. THAT's a real world eval.

Edit: You edited your post to reflect the above

That said, this is still the best review system out there and a better review than I could ever do.

We are not going to start evaluating video cards in games that are CPU limited. That would just be dumb.

That all said, if you have any modern CPU that is dual core running at 2.5GHz+, you are not going to be CPU limited in the games we have shown today at the resolutions we have shown today. So in fact, I would suggest if you meet that criteria, you could get results very similar to ours shown here today. If you feel as though your system is CPU limited, I would suggest going and reading a processor article or two and investing in that next.
 