Redesigned 8800GT - Gigabyte GV-NX88T512HP

Son of a..... note the SPDIF connector for HDMI audio passthrough to your HDTV!!!!

And it clocks better with the better power circuitry, and it's smaller.

Guess I'll be going SLI sooner than I thought. Wish my XFX had the passthrough to begin with.

So two of these, on water, should be real sweet.
 
How does this card output HDMI with audio to your TV when it has two DVI outputs?
 
This looks really nice. Looks a lot like the Palit version.

My big question is: why can't Nvidia get their shit together with a better-designed cooler and card like this from the start, instead of making these fully-shrouded bake ovens?
 
I hate reference-design video cards that differ only in the cooler sticker between brands. When a new card hits, I always wait a month or two until companies like Gigabyte and others release "cool" and more refined versions of the same card. They usually negate the need for purchasing a 3rd-party cooler since they come stock with one, and I like that they ditch the green PCBs. So frickin' dull...

I hate buyer's remorse, and seeing new, updated revisions of video cards hit left and right right after you've already purchased the John Doe version is painful.
 
I hate reference-design video cards that differ only in the cooler sticker between brands. When a new card hits, I always wait a month or two until companies like Gigabyte and others release "cool" and more refined versions of the same card. They usually negate the need for purchasing a 3rd-party cooler since they come stock with one, and I like that they ditch the green PCBs. So frickin' dull...

I hate buyer's remorse, and seeing new, updated revisions of video cards hit left and right right after you've already purchased the John Doe version is painful.

Very well said, and it happens EVERY TIME!!!!! That's exactly why I wait when buying something that just came out.
 
What is wrong with resellers that stick a Zalman fan on their GPUs but don't cover the RAM chips and MOSFETs with a heatsink? I wouldn't touch that model with a ten-foot pole. It's not going to OC much, if at all, because the MOSFETs are going to be boiling hot.
 
The reason Nvidia's reference design uses the cover is so that the air is vented out of the case at the back of the GPU card. This helps OEM manufacturers meet their thermal budgets, etc. This card design you see is just going to blow hot air away from the GPU but into the case, which is what older GPU designs used to do. If you have good cooling in your case, this may not be a problem for you.
 
You see, the problem with 3rd-party PCB designs is that it's harder to find a how-to guide for doing volt-mods on them. Unless, of course, it turns out to be a better, more overclockable design, but I'll stay away from speculating on that.
 
The reason Nvidia's reference design uses the cover is so that the air is vented out of the case at the back of the GPU card. This helps OEM manufacturers meet their thermal budgets, etc. This card design you see is just going to blow hot air away from the GPU but into the case, which is what older GPU designs used to do. If you have good cooling in your case, this may not be a problem for you.

The 8800GT's reference cooler does not vent the hot air out of the back of the card. It exhausts the air right back into the case.

You may be thinking of the GTX and GTS dual-slot coolers, which DO exhaust hot air outside of the case.
 
EVGA, I think, only sticks to the reference nVidia design.

Yeah, they've proven that with the last dozen product cycles or so! :rolleyes:
It bothers me too, as I feel they have pretty much the best warranty service out there.
 
How does this card output HDMI with audio to your TV when it has two DVI outputs?

DVI and HDMI are essentially the same in their wiring and electrical designs, so the physical connection just needs a DVI-to-HDMI adaptor. The audio comes in over the card's SPDIF header and gets muxed into the output, which is how the HDMI end carries sound as well.

More detail here.
 
Looks more like a 6600GT than anything.
Won't the RAM on that get awfully toasty?
 
Looks more like a 6600GT than anything.
Won't the RAM on that get awfully toasty?

Nah, not at all, it's DDR3. The likes of the 7950GT were released with a reference design using DDR3 and no sinks.

On the 8800GT the RAM is only lukewarm at stock speeds, and that Zalman blows air across it anyway. I would say the RAM will run a LOT cooler than it does baking under the reference cooler.
 
What the F is wrong with resellers that stick a Zalman fan on their GPUs but don't cover the RAM chips and MOSFETs with a heatsink?? They're idiots!! I wouldn't touch that model with a ten-foot pole. It's not going to OC much, if at all, and the RAM and MOSFETs are going to be boiling hot. What crap.

You couldn't be any more uninformed if you actually tried.

1. The Qimonda/Samsung BGA DDR3 modules used on these cards run the coolest of any modules yet and require no heatsink to run at their rated speeds and beyond. This is actually true for most BGA memory.

2. The power regulation circuitry on this Gigabyte card has been redesigned with a cooler running and more stable triple-phase solution. That's one of the whole points of this card.
 
You couldn't be any more uninformed if you actually tried.

1. The Qimonda/Samsung BGA DDR3 modules used on these cards run the coolest of any modules yet and require no heatsink to run at their rated speeds and beyond. This is actually true for most BGA memory.

2. The power regulation circuitry on this Gigabyte card has been redesigned with a cooler running and more stable triple-phase solution. That's one of the whole points of this card.

I would love to slap the VF900 I presently have onto this card when it comes out. The only thing keeping me from buying it is how it performs against the OC'd versions of the 8800GT (SSC edition, super-duper-nice edition, whatever). Plus I can also put those RAM sinks on to keep the modules cool if needed; that way it could be the coolest air-cooled 8800GT around.
 
I hate reference-design video cards that differ only in the cooler sticker between brands. When a new card hits, I always wait a month or two until companies like Gigabyte and others release "cool" and more refined versions of the same card. They usually negate the need for purchasing a 3rd-party cooler since they come stock with one, and I like that they ditch the green PCBs. So frickin' dull...

I hate buyer's remorse, and seeing new, updated revisions of video cards hit left and right right after you've already purchased the John Doe version is painful.

This is not new news. If you had researched before you purchased your "John Doe" version, you would have found sites with leaked pics of this Gigabyte card before the regular 8800GT was actually launched.
 
This is not new news. If you had researched before you purchased your "John Doe" version, you would have found sites with leaked pics of this Gigabyte card before the regular 8800GT was actually launched.

My video card is over a year old, thanks; I didn't even buy an 8800GT :rolleyes:
Maybe you should re-read the post and pick up on its hypothetical context.
 
I would love to slap the VF900 I presently have onto this card when it comes out. The only thing keeping me from buying it is how it performs against the OC'd versions of the 8800GT (SSC edition, super-duper-nice edition, whatever). Plus I can also put those RAM sinks on to keep the modules cool if needed; that way it could be the coolest air-cooled 8800GT around.
I have a VF900 around here somewhere... will it fit the reference-design 8800GT?
 
The 8800GT's reference cooler does not vent the hot air out of the back of the card. It exhausts the air right back into the case.

You may be thinking of the GTX and GTS dual-slot coolers, which DO exhaust hot air outside of the case.

Mine sure as hell doesn't. All of that air just shoots into the case; can't feel a damn thing coming out the back on mine :confused:

The memory on the 8800GT is clocked high enough and is probably near its limit already, hence no extra cooling.
 
Also, concerning the memory chips, note this thing has a "real" fan on it; the blow-by is more than sufficient to cool them. RAM sinks applied with frag tape, or with some other "stick-on" newbie attachment as opposed to something like AS adhesive, would probably be worse than leaving the chips bare and exposed to the fan's blow-by.

I like the solid caps as I tend to keep video cards a long time.
 
Interesting read; I had always just assumed HDMI had a pair of audio pins. Thanks.

DVI and HDMI are essentially the same in their wiring and electrical designs, so the physical connection just needs a DVI-to-HDMI adaptor. The audio comes in over the card's SPDIF header and gets muxed into the output, which is how the HDMI end carries sound as well.

More detail here.
 
How is the Gigabyte warranty compared to EVGA/XFX? I know it will be worse... but by how much?

I'm watercooling, BTW, if it makes a difference.
 
I ordered a Gigabyte GV-NX88T512H-B from pcclub.com (they still have stock at $269). It looks like it uses the standard reference HSF. Maybe it will have the better voltage regulation like the GV-NX88T512H-P?
 