Promised review is up! NVIDIA GeForce 6600GT Review!!!

How does a higher clocked card perform worse than its more expensive counterpart (6800GT), and why does the 6600GT have a 500MHz core while the 6800GT has 350? Very confusing to me.
 
Morphes said:
How does a higher clocked card perform worse than its more expensive counterpart (6800GT), and why does the 6600GT have a 500MHz core while the 6800GT has 350? Very confusing to me.

The 6800GT has 16 pixel pipelines at 350MHz, which = a 5.6 Gpixels/s fill rate, and 6 vertex engines.

The 6600GT has 8 pixel pipelines at 500MHz, which = a 4 Gpixels/s fill rate, and 3 vertex engines.
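
For anyone who wants to check the math, the theoretical pixel fill rate is just pipelines × core clock. A minimal sketch using the figures quoted above:

```python
# Theoretical pixel fill rate = pixel pipelines * core clock.
# Pipeline counts and clocks are the ones quoted above for each card.

def fill_rate_gpixels(pipelines, core_clock_mhz):
    """Return peak fill rate in Gpixels/s."""
    return pipelines * core_clock_mhz / 1000.0

print(fill_rate_gpixels(16, 350))  # 6800GT -> 5.6 Gpixels/s
print(fill_rate_gpixels(8, 500))   # 6600GT -> 4.0 Gpixels/s
```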
 
Ah okay, thanks. I wonder if there is any way to flash the BIOS to enable the extra pipes, which would make it a killer deal.
 
Morphes said:
Ah okay, thanks. I wonder if there is any way to flash the BIOS to enable the extra pipes, which would make it a killer deal.

LOL, no, the 6600GT has 8 pipes on the core... and all are enabled already. You can't enable something that isn't there!

That's only possible on cards like the old 9500s, current X800s, and possibly the 6800 vanillas.
 
Ooooh, so that's what the NV43 is about, I get ya. lol, I feel stupid. Anyway, live and learn, right?
 
lol np there. Yeah, making the 6800 cores gets a bit expensive for the mid-range cards, so they spun a new, smaller chip (the NV43) for them :)
 
The 6600GT also only has 128MB of RAM on a 128-bit memory interface. That's holding it back at the high end and will probably keep it well below the 6800GT in future games.
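
To put a rough number on that: peak memory bandwidth is bus width times effective memory clock. A minimal sketch, assuming the commonly quoted 1000 MHz effective (500 MHz DDR) memory clocks rather than figures from the review:

```python
# Peak memory bandwidth = (bus width in bytes) * effective memory clock.
# The 1000 MHz effective clocks are the commonly quoted stock speeds,
# not numbers taken from the review itself.

def bandwidth_gb_per_s(bus_bits, effective_clock_mhz):
    return (bus_bits / 8) * effective_clock_mhz / 1000.0

print(bandwidth_gb_per_s(128, 1000))  # 6600GT, 128-bit bus -> 16.0 GB/s
print(bandwidth_gb_per_s(256, 1000))  # 6800GT, 256-bit bus -> 32.0 GB/s
```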
 
Finally, a card at a reasonable price that packs performance that is enough to make me feel *justified* in upgrading. Because frankly, I don't think any ~$200 card up until now has delivered significantly more than the GF4Ti. And it'll be a cold day before I spend $300 on a piece of hardware.
 
Hey, everyone here in the thread. I bought a BFG 6800 Ultra and I have a 480-watt Antec. Do I have to upgrade my PSU to 500 watts so I can have an exact 480 watts? You guys know if 480 watts is a 460 watts right?

It's a Socket 939 AMD FX
1 gig of value RAM
2 hard drives
2 CD-ROMs

I need help...
 
And I'm planning to hook up the 6800 Ultra with both Molex connectors, each on its own line, meaning I won't share them with any hard drive or CD.
 
wolf6162001 said:
Hey, everyone here in the thread. I bought a BFG 6800 Ultra and I have a 480-watt Antec. Do I have to upgrade my PSU to 500 watts so I can have an exact 480 watts? You guys know if 480 watts is a 460 watts right?

It's a Socket 939 AMD FX
1 gig of value RAM
2 hard drives
2 CD-ROMs

I need help...

you would probably get a better answer by posting a new thread, or at least finding a similar thread.

anyway, your 480w PSU will probably work just fine. i'm not sure what you mean when you say: "you guys know if 480 watts is a 460 watts right?"

nVidia recommends a 480w PSU simply because some people own really cheap PSUs. a quality PSU like your Antec will do just fine.
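
If you'd rather sanity-check it than take the rule of thumb, just add up rough worst-case draws for the parts listed. A ballpark sketch with assumed, era-typical wattages (not measured numbers):

```python
# Rough load estimate for the system described above.
# Every wattage here is an assumed, worst-case ballpark, not a measurement.
assumed_draw_watts = {
    "GeForce 6800 Ultra": 110,
    "Athlon 64 FX (Socket 939)": 90,
    "Motherboard + 1GB RAM": 40,
    "2 hard drives": 25,
    "2 optical drives": 25,
    "Fans, USB, misc": 20,
}

total = sum(assumed_draw_watts.values())
print(f"Estimated peak draw: ~{total} W")  # roughly 310 W
# Even with generous headroom, that sits well inside a quality 480 W unit.
```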
 
Hmmm, I wonder what this means for the lucky few of us that paid $223 for a Newegg 9800 Pro 256MB that flashed to XT. I guess I'll keep it; HL2 might run better on it. Maybe [H]ardocp can make an update using the Source beta benchmarks....
 
XamediX said:
Maybe [H]ardocp can make an update using the Source beta benchmarks....
Why? That's like going back to using 3DMark, lol. :rolleyes:
We won't know how the game plays until it actually ships.
 
CrimandEvil said:
Why? That's like going back to using 3DMark, lol. :rolleyes:
We won't know how the game plays until it actually ships.
I know, even still.... just a peek of what's to come. This would be important as this could be the last thing to think about for mainstream (~$200) buyers: 6600 vs 9800XT (Pro) on an engine supposedly favoring ATi.
 
You have to remember that the 9800 Pro is running on last-gen technology... and that the 6600GT has a huge advantage because of its 0.11-micron process, which gives the 6600GT ridiculously high clocks for a mainstream card.

The bottom line is that the 6600GT is true next gen (not FX-series junk). I think we can expect great things. Since it runs so cool, I'm wondering what Arctic Cooling can do with this chip...

I'd much rather have a 6600GT than even a 6800NU, much less a GT, heating up my case.

I was about to jump on a Leadtek 6800NU, but since these benches came out, I'm thinking this one through ($100 saved gives me money for a memory upgrade =D).
 
XamediX said:
I know, even still.... just a peek of what's to come. This would be important as this could be the last thing to think about for mainstream (~$200) buyers: 6600 vs 9800XT (Pro) on an engine supposedly favoring ATi.


Highly unlikely the engine favors ATi; Valve wouldn't alienate half the gaming community. This isn't like the Doom 3 engine. ATi's current tech just stinks with OpenGL. There are benchmarks that show you it's only when AA and AF are on that ATi gets a marginal lead. In the real game it's going to be CPU limited anyway, so it's not that big of a deal. The engine running on an FX-53 probably won't go over 40-50 fps anyway.

3Dc was already in the VST, so that could help bandwidth usage while using AA and AF for ATi, which is where the pronounced lead shows up. But then again, we also know about ATi's AA and AF algorithms.
 
This is going to be a card gamers will have for YEARS to come, just like with the Ti 4200 :) Gotta love it!
 
/guessing.......

Two 6600GTs in SLI = $400 = cool for right now, but your video card lifespan is drastically reduced, as there is no more upgrade path with these two cards.

Or....

Buy a single 6800GT or Ultra for $400 now and enjoy current games, even if the dual 6600 scores a wee bit higher.

Later, when next-gen games are more demanding and the 6800GT or Ultra drops in price to $200, do the SLI on those to 'double' your scores, probably in a couple of years...

Speculating -> Dual 6800 SLI > single [future] GeForce 7 in raw FPS, but maybe not > in feature set, which you won't use anyway because games don't support GF7 feature sets upon release.

Wildcard -> IF ATI makes a huge leap with the next generation? If so, then ditch your 6800 card, do not SLI, and get the [future] next-gen ATI.

/haha

Cat and mouse
 
thylantyr said:
Two 6600GTs in SLI = $400 = cool for right now, but your video card lifespan is drastically reduced, as there is no more upgrade path with these two cards.
No idea what you mean by that, since you could always just get a newer PCIe card and take those out.
 
He's still got a point about getting a GT now, waiting for GF7, then SLIing it so you can skip a generation or two.
 
rancor said:
Highly unlikely the engine favors ATi; Valve wouldn't alienate half the gaming community. This isn't like the Doom 3 engine. ATi's current tech just stinks with OpenGL. There are benchmarks that show you it's only when AA and AF are on that ATi gets a marginal lead. In the real game it's going to be CPU limited anyway, so it's not that big of a deal. The engine running on an FX-53 probably won't go over 40-50 fps anyway.
Why isn't it like the Doom 3 engine? It's a game engine, just... different, and Direct3D based. All the CS: Source engine tests and stress tests floating around the web point towards ATI having a slight lead in HL2, just like they had a slight lead in Far Cry.

Is it as big as the lead 6800-series cards get in Doom 3? Doesn't seem like it; in fact, the 6800 series remained over 50 fps on most of the Source stress tests I last read (which was not the case for X800s in Doom 3). Regardless, like others said... gotta wait for the full game to tell for sure; maybe there are optimizations left to be made.

Either way, it's hardly like Valve is "favoring" one of the two companies or purposely alienating customers that bought a certain brand. What card their engine runs better on is simply a matter of drivers and where the hardware stands today (the engine's been in development for a long-ass time, after all).

Finally, with ATI cards the stress test did yield over 40-50 fps often. Either way, it seemed to be a far less stressful engine than Doom 3, though (judging merely by the fps scores overall, which were higher than Doom 3's; maybe it's just a more efficient engine).
 
Why is Doom3 such a big deal?

Look at how many games (and mods) were based on the Quake engines (especially Quake3)!! Anyone have a number?

My fav game is still SOF2 - based on Q3.

The Doom3 engine is already licensed to other companies working on other games, and will probably be the basis of MANY games over the next few years, just like Quake3 was used by so many for so long.
 
Impulse said:
Why isn't it like the Doom 3 engine? It's a game engine, just... different, and Direct3D based. All the CS: Source engine tests and stress tests floating around the web point towards ATI having a slight lead in HL2, just like they had a slight lead in Far Cry.

Is it as big as the lead 6800-series cards get in Doom 3? Doesn't seem like it; in fact, the 6800 series remained over 50 fps on most of the Source stress tests I last read (which was not the case for X800s in Doom 3). Regardless, like others said... gotta wait for the full game to tell for sure; maybe there are optimizations left to be made.

Either way, it's hardly like Valve is "favoring" one of the two companies or purposely alienating customers that bought a certain brand. What card their engine runs better on is simply a matter of drivers and where the hardware stands today (the engine's been in development for a long-ass time, after all).

Finally, with ATI cards the stress test did yield over 40-50 fps often. Either way, it seemed to be a far less stressful engine than Doom 3, though (judging merely by the fps scores overall, which were higher than Doom 3's; maybe it's just a more efficient engine).

Well, yes it did, and it was well over 40-50 on nV cards too, but it doesn't utilize any physics or AI. Personally speaking, the Havok engine is a CPU killer, just as HL2's AI probably is too.

The VST is pretty much a synthetic benchmark.

There are special optimizations just for ATi's cards. Why is the lead only pronounced with AA and AF for ATi's cards, much more so than when AA and AF are off? Interesting, isn't it? It could be due to 3Dc, but I wouldn't expect that much of a difference. Maybe since the engine was built on ATi cards it is more optimized for ATi's driver sets. nV is probably already working on tweaking their drivers to improve performance on the Source engine. If you take a look at early Tomb Raider benchmarks and CryTek benchmarks, the GF6 line had close to a 100% increase on both engines.

It isn't like the Doom 3 engine because the GF6 line doesn't stink at D3D :D Unlike ATi, nV didn't forget about one of the APIs.

http://www.xbitlabs.com/articles/video/display/counterstrike-source_7.html

Here is a good example: as resolution goes up, nV is getting hit harder, the reason being increased bandwidth. But it should be fairly proportionate to ATi's loss too, and it isn't. Why is that? In many other game benchmarks nV has shown they do better at higher resolutions.
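
To put a number on the "increased bandwidth" point: the framebuffer work per frame scales with the pixel count, so higher resolutions lean harder on memory bandwidth. A rough sketch of just the raw pixel scaling (ignoring texture traffic, overdraw, and compression):

```python
# Relative per-frame pixel load at common resolutions.
# Ignores texture fetches, overdraw, and color compression entirely.
base_w, base_h = 1024, 768
base_pixels = base_w * base_h

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels / 1e6:.2f} Mpixels/frame "
          f"({pixels / base_pixels:.2f}x the 1024x768 load)")
# 1600x1200 pushes about 2.4x the pixels of 1024x768, which is why
# bandwidth-limited cards fall off harder as the resolution climbs.
```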

http://www.xbitlabs.com/articles/video/display/graphics-cards-2004.html

The only time ATi scores is when both AA and AF are on.

Interesting, isn't it?

Why didn't that repeat in the VST? And why shouldn't it repeat?

And was the VST that demanding on bandwidth that it would slow down the cards? Not really; the total assets in that test probably didn't even come to 150 MB.
 
Can the reviewers comment on how the system overclocks with the 6600GT?
I recall in previous reviews on the web that higher FSB speeds on the 925X chipset were attainable using ATI native PCI express cards. The bridged NVIDIA cards were said to be less tolerant of out of spec PCI express frequencies. Has this changed with the 6600GT now that it is a native PCI express card?

Thanks
 
So I guess the big question is:

When will we see an AMD 939 dual-PCIe board that can handle a pair of these badass mainstream cards?


Has anyone noticed the trend of AMD/Nvidia -vs- Intel/ATI? Or is it just me?

-R
 
lordroy said:
Has anyone noticed the trend of AMD/Nvidia -vs- Intel/ATI? Or is it just me?

-R

I think Intel is scared to let Nvidia make a motherboard for their CPUs because it would kill Intel's motherboard/chipset sales when Nvidia makes a better motherboard than theirs. So nV has to be content with an AMD-only stable.
 
Obviously, all that NVidia would provide Intel, if they were allowed to make P4 mobo chipsets, is competition... Hell, Intel can't even stand VIA and often indulges in patent disputes with them, despite VIA not having that large a share of the market for motherboard chipsets designed for Intel processors.

AMD would've done the same thing if their development of new motherboard chipsets for their processors had the years of experience and efficiency Intel's does, but they preferred to step away from that market a couple of years ago.

I don't think either company is trying to "get itself in bed," so to speak, with either of the video card GPU manufacturers; it's just the way the cookie's crumbled. NVidia grew as a company and expanded into the motherboard market; there was a void they could fill, and so they did. Intel has tried to enter the video card market as well, with lackluster success, instead relegating itself to budget on-board solutions.

The only trend is NVidia expanding its reach as a company, ATI and AMD sticking to what they do best, and Intel being Intel. :p
 
Haha, this could all be fate.

nVidia releases the underpowered/crappy FX5800 series... but at the same time puts out the BEST Socket A chipset, the nForce2. I'm wondering if that's such a coincidence.

But yeah, I guess it's a good thing AMD dropped out. But if nVidia had never come in, I'd still be getting a heart attack with every 4in1 driver install.

A7V anyone? 686 southbridge, PCI latency issues... jeeez. Again, so nice that nVidia is a legitimate alternative.
 
chrisf6969 said:
Sucks for all those 5700, 9600XT, 9800, 5900U, and 5950 owners out there. Your cards are now worthless. j/k! Not worthless, but worth much less when you can buy a brand new shiny 6600GT for $200 with SM3.0, etc.
lol. I feel particularly sorry for the people who paid $500 for a 9800XT this time last year, only to have it get beaten out by a card that costs 40% as much a year later. :D

I'm sure the 6600GT will get even cheaper after it's been out for a while (that would make it even more worth picking up).
 
GVX said:
lol. I feel particularly sorry for the people who paid $500 for a 9800XT this time last year, only to have it get beaten out by a card that costs 40% as much a year later. :D

I'm sure the 6600GT will get even cheaper after it's been out for a while (that would make it even more worth picking up).


Does no one feel sorry for us saps that paid $300 and $400, respectively, for the GF3 and GF4 Ti4600 a few years back? I won't mention the Voodoo 1, 2... and all the other Voodoo cards I bought way back when. FFS. :rolleyes:

If you enjoy games, no matter what kind of games you play, you are going to look for the best possible performance in the price range you can afford.

I shudder to think of all the hard-earned cash I pissed away on these damn video cards over the years.... It probably would have been cheaper to pick up a healthy cocaine habit. The fun games I played online and otherwise during my video card buying spree kinda make it all worth it, though.

It's a never-ending sad story for us game junkies, people. Can we quit looking at the past and look toward the (hopefully affordable) future???

....but I digress....


/end rant
 
Grinder66 said:
Does no one feel sorry for us saps that paid $300 and $400, respectively, for the GF3 and GF4 Ti4600 a few years back? I won't mention the Voodoo 1, 2... and all the other Voodoo cards I bought way back when. FFS. :rolleyes:
How about those that pre-ordered and got 5800 Ultras? LOL
 
So I am as excited as everyone about this card, just waiting for these slowpoke mobo manufacturers to catch up. I do have one question: does anyone else notice the awful image quality in the far-left picture in the review for Far Cry?? It's awful; the tiles are all pixelated and there are really grainy edges around some of the edges of the wall. All the other pictures looked OK, but this one scares me. Could it be shader code scaled down to lower quality for fps?? Just curious, but other than that the card is sweeeet, and I also can't wait for SLI performance.

Thanks
 
Noticed that too; it was that way with the FX cards and the 6800 series too. Good thing I don't play FC though. ;)
Other than FC and Madden, it looks just like ATi's cards. I'm thinking it's probably some rendering issue with those games and those cards. :confused:
 
Yeah, it's sad though, because FC is such a beautiful game. Maybe with Shader Model 3.0 those problems disappear, because for the test they were using version 1.1 rather than 1.2.
 
Man, Nvidia is just OWNING ATI this round! Really, NO debating it :) But the GF4s were doing this when the 9700 came out, so let's see what ATI does with the R500 (or whatever it's called), due out sometime next year and possibly DX10 - you know how rumors go :rolleyes: But it will be ATI's first really new architecture since the 9700 Pro. Nvidia, on the other hand, has said that the current set of video cards will only be getting refreshed for the next 18-24 months..... To me that says in 8-10 months ATI will have the lead again, until 8-16 months after that, when Nvidia comes out with a new architecture, and so on.... Basically, since both companies are releasing their new engines so far apart, the throne of "best vid card company" will be traded back and forth every 12 months or so.
 
Grinder66 said:
Does no one feel sorry for us saps that paid $300 and $400, respectively, for the GF3 and GF4 Ti4600

I paid $349.99+tax for a Hercules GeForce2 GTS 64MB video card.

Seems incredible? It was the newest card at the time... most people were still running TNT2s or Voodoo2 SLI back then.

I love technology. :)
 
stelleg151 said:
So I am as excited as everyone about this card, just waiting for these slowpoke mobo manufacturers to catch up. I do have one question: does anyone else notice the awful image quality in the far-left picture in the review for Far Cry?? It's awful; the tiles are all pixelated and there are really grainy edges around some of the edges of the wall. All the other pictures looked OK, but this one scares me. Could it be shader code scaled down to lower quality for fps?? Just curious, but other than that the card is sweeeet, and I also can't wait for SLI performance.

Thanks

It's a bug with patch 1.1 of Far Cry on GeForce FX and 6800 cards.

It will be fixed in patch 1.2.
 