Did you actually say "mad potential"? Why not throw a "yo" in there, as well?
this card should be labelled 9600gtx
Where do you place the 8800GT in that decision?
Finally some nice GPUs under $200 that can play virtually anything. Let's hope it stays that way for a few years! Too long we've had to decide between $400 and up or lackluster gaming.
I think if you never plan to xfire/SLI, this is "the card". However, due to (IMO) better Intel motherboards, if you are planning on multiple GPUs or possibly going that path, I'd choose the 3870.
Multiple GPU = 3870 (for the motherboard)
One GPU = 9600GT
My $.02, and nice review (can't find anything to nitpick).
Well, I have to say...
I made a bad move getting an 8800GTS 320MB SLI setup...
I'm at work so I have only had time to skim the review, but is there any mention of HDTV support? I have an HTPC with an ATI 9600XT that fails at S-Video, VGA at 1360x768 (my TV's max res), and has problems with RGB at 1080i or 720p.
If the NVIDIA card does HDTV output well, I'd love to replace my ATI 9600 with an NVIDIA 9600.
Where do you place the 8800gt in that decision?
I don't think you can get the same performance, however, since you have so much "less silicon" as described above.
I don't. I'm guessing that the o/c will get you around 8800GT speeds, so IMO (read: my opinion) why bother? Less heat, smaller footprint, less power, and almost the same performance once o/c'd.
Definitely looking forward to that article, Brent!!
Probably right... but that was kinda/sorta a huge overclock, and stable at that. Still, it might be in the ballpark if you extrapolate the data. Of course Brent will let us know at a later date, so we'll see.
What overclocking utility is used in [H] reviews? I don't recognize it. That's not Coolbits, is it?
Brent said: Let's first jump into overclocking testing, because this video card impressed us in this area quite a bit. We used nTune to overclock with.
Kyle:
You said you will give it the gold award if you see it at $169 or lower next week. Do you think the quantities of these cards are high enough that the price can drop that fast? I'm really interested in it, and if it's going to drop $10 within a week, I'd rather hold off and wait until next week.
How loud is it? No review I've seen so far seems to mention it. Since I am completely unsatisfied with the loudness of my 7600GT, will this card be quieter than it?
Keep in mind that our numbers will be higher than what readers see in gameplay situations, as physics, AI, and other overhead will come into play.
It didn't make any noise that was worth noting.
Hey, am I correct that the CPU you used is about $1000+ retail?
Why do you guys test with a $1000+ cpu only? I wonder what percentage of people on the forum have a CPU on par with the x6800, and what percentage have something more in the 200-300 dollar range? Just curious what impact the CPU has on your tests, and if you provided tests with 2 different CPUs that might be helpful.
Enjoyed the read though, thanks.
Why do you guys test with a $1000+ cpu only? I wonder what percentage of people on the forum have a CPU on par with the x6800, and what percentage have something more in the 200-300 dollar range? Just curious what impact the CPU has on your tests, and if you provided tests with 2 different CPUs that might be helpful.
We are evaluating a video card, therefore we don't want the CPU to be a bottleneck. That is why we use high-end CPUs.
Hey, am I correct that the CPU you used is about $1000+ retail?
Why do you guys test with a $1000+ cpu only? I wonder what percentage of people on the forum have a CPU on par with the x6800, and what percentage have something more in the 200-300 dollar range? Just curious what impact the CPU has on your tests, and if you provided tests with 2 different CPUs that might be helpful.
Enjoyed the read though, thanks.
Both to remove the CPU bottleneck and because a very good percentage of the people on this board have this or greater CPU power. The $1000 X6800 is 2.93GHz...my $266 E6850 is 3.0GHz, and they're both Core 2 Duos so they're directly comparable. Lots of other people have an E6600, E6750, E8xxx, or Q6600 that they've overclocked to 3.0GHz or above, again directly comparable (given that most games don't really take advantage of quad cores right now).
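To put some numbers on the bottleneck argument, here's a quick toy model (my own sketch, with made-up per-frame costs, not [H]'s actual methodology): each frame takes as long as the slower of the CPU and GPU, so a fast CPU makes the benchmark measure the card instead of the processor.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """FPS under a simple max() bottleneck model: the slower
    component each frame sets the frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs: a high-end CPU needing 4 ms, a budget
# CPU needing 12 ms, and a midrange GPU needing 10 ms at high settings.
fast_cpu, slow_cpu, gpu = 4.0, 12.0, 10.0

print(fps(fast_cpu, gpu))  # GPU-bound: 100.0 fps, the card sets the pace
print(fps(slow_cpu, gpu))  # CPU-bound: ~83.3 fps, you'd be measuring the CPU
```

With the slow CPU, two different video cards could post identical numbers, which is exactly what a video card review is trying to avoid.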
We are evaluating a video card, therefore we don't want the CPU to be a bottleneck. That is why we use high-end CPUs.
That's fine and all. But the fact remains, most people are not going to run a midrange card in a HIGH END system. This card in a midrange system would give people the REAL WORLD performance. That's what is preached here, isn't it?
I am not starting an argument here; just test the card in the kind of system it's really going to end up in.
I don't understand why you guys run Crysis in DX10 mode. Whenever a post in your gaming section deals with Crysis/Crysis gaming performance/Crysis configs, DX9 is always recommended. I would go out on a limb and say a majority of your core readers don't find your Crysis numbers particularly useful.
I believe you guys stated that you run Crysis in DX10 mode because it defaults to that mode out of the box. Well, that doesn't seem to square with what your [H] reviews are about: trying to enable the most eye candy while keeping an acceptable frame rate (since DX9 offers more eye candy with less of a performance penalty). So what's the deal? Why do you guys test Crysis in DX10 when your readers most likely don't play the game with that API, and it's not very [H]ard of you to use inferior default settings?
Interesting... so I guess my 6000+ at 3GHz is pretty comparable too? Man, I find the CPU naming/speed/marketing so confusing!
Why is a $1000 cpu comparable to a $200 one?