Morphes
Supreme [H]ardness
- Joined: Jul 16, 2001
- Messages: 4,337
How does a higher-clocked card perform worse than its more expensive counterpart (6800GT)? And why does the 6600GT have a 500 MHz core while the 6800GT has 350? Very confusing to me.
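For what it's worth, the usual explanation is pipeline count: the 6600GT runs 8 pixel pipelines at 500 MHz, while the 6800GT runs 16 at 350 MHz (and on a wider 256-bit memory bus vs. the 6600GT's 128-bit). A quick back-of-the-envelope fill-rate comparison; the clock and pipeline figures are the commonly published specs, not measurements:

```python
# Peak pixel fill rate = core clock (MHz) * number of pixel pipelines.
# Figures are the commonly published specs for these cards, approximate.
cards = {
    "6600GT": {"core_mhz": 500, "pixel_pipes": 8},
    "6800GT": {"core_mhz": 350, "pixel_pipes": 16},
}

for name, spec in cards.items():
    fill_mpix = spec["core_mhz"] * spec["pixel_pipes"]
    print(f"{name}: {fill_mpix} Mpixels/s peak fill rate")
# 6600GT: 4000 Mpixels/s, 6800GT: 5600 Mpixels/s
```

So despite the lower clock, the 6800GT pushes roughly 40% more pixels per second before the memory-bandwidth advantage is even counted.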
Morphes said: Ah okay, thanks. I wonder if there is any way to flash the BIOS to enable the extra pipes, which would make it a killer deal.
wolf6162001 said: Hey, everyone in this thread. I bought a BFG 6800 Ultra and I have a 480-watt Antec. Do I have to upgrade my PSU to 500 watts so I get a true 480 watts? You guys know a 480-watt PSU is really more like 460 watts, right?
It's a 939 FX AMD
1 GB value RAM
2 hard drives
2 CD-ROMs
I need help.
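For the PSU question above: a rough load tally for that parts list suggests a quality 480 W unit has plenty of headroom. The per-component wattages below are ballpark assumptions for illustration, not measured figures:

```python
# Rough PSU load estimate for the system described above.
# All per-component wattages are ballpark assumptions, not measurements.
draw_watts = {
    "Athlon 64 FX (s939)": 90,   # assumed CPU peak draw
    "GeForce 6800 Ultra":  80,   # assumed GPU peak draw
    "1 GB RAM":            10,
    "2 hard drives":       2 * 15,
    "2 optical drives":    2 * 20,
    "mobo/fans/misc":      40,
}

total = sum(draw_watts.values())
print(f"Estimated peak load: ~{total} W")  # well under 480 W
```

Even if every estimate here is off by a fair margin, the sum lands far below 480 W; what matters more in practice is the amperage on the 12 V rail, which Antec prints on the PSU label.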
XamediX said: Maybe [H]ardocp can make an update using the Source beta benchmarks....

Why? That's like going back to using 3DMark, lol.
CrimandEvil said: Why? That's like going back to using 3DMark, lol.

I know, but even still... just a peek of what's to come. This would be important, as it could be the last thing for mainstream (~$200) buyers to think about: 6600 vs. 9800XT (Pro) on an engine supposedly favoring ATI.

XamediX said: I know, but even still... just a peek of what's to come. This would be important, as it could be the last thing for mainstream (~$200) buyers to think about: 6600 vs. 9800XT (Pro) on an engine supposedly favoring ATI.

We won't know how the game plays until it actually ships.
rancor said: ATI's current tech just stinks with OpenGL. There are benchmarks that show you it's only when AA and AF are on that ATI gets a marginal lead.

LOL, like in this OpenGL game.
thylantyr said: Two 6600s in SLI = $400 = cool for right now, but your video card's lifespan is drastically reduced, as there is no more upgrade path with these two cards.

No idea what you mean by that, since you could always just get a PCIe card and take those out.
rancor said: Highly unlikely the engine favors ATI; Valve wouldn't alienate half the gaming community. This isn't like the Doom 3 engine. ATI's current tech just stinks with OpenGL. There are benchmarks that show you it's only when AA and AF are on that ATI gets a marginal lead. In the real game it's going to be CPU-limited anyway, so it's not that big of a deal. The engine running on an FX-53 probably won't go over 40-50 fps anyway.

Why isn't it like the Doom 3 engine? It's a game engine, just... different, and Direct3D-based. All the CS: Source engine tests and stress tests floating around the web point toward ATI having a slight lead in HL2, just like they had a slight lead in Far Cry.
Impulse said: Why isn't it like the Doom 3 engine? It's a game engine, just... different, and Direct3D-based. All the CS: Source engine tests and stress tests floating around the web point toward ATI having a slight lead in HL2, just like they had a slight lead in Far Cry.
Is it as big as the lead the 6800 series gets in Doom 3? It doesn't seem like it; in fact, the 6800 series stayed over 50 fps in most of the Source stress tests I last read (which was not the case for the X800s in Doom 3). Regardless, like others said, we've got to wait for the full game to tell for sure; maybe there are optimizations left to be made.

Either way, it's hardly like Valve is "favoring" one of the two companies or purposely alienating customers who bought a certain brand. What card their engine runs better on is simply a matter of drivers and where the hardware stands today (the engine's been in development for a long time, after all).

Finally, with ATI cards the stress test did often yield over 40-50 fps. Either way, it seems to be a far less stressful engine than Doom 3's (judging merely by the overall fps scores, which were higher than Doom 3's; maybe it's just a more efficient engine).
lordroy said: Has anyone noticed the trend of AMD/NVIDIA vs. Intel/ATI? Or is it just me?
-R
chrisf6969 said: Sucks for all those 5700, 9600XT, 9800, 5900U, and 5950 owners out there. Your cards are now worthless. j/k! Not worthless, but worth much less when you can buy a brand-new shiny 6600GT for $200 with SM3.0, etc.

lol. I feel particularly sorry for the people who paid $500 for a 9800XT this time last year, only to have it beaten by a card that costs 40% as much a year later.
GVX said: lol. I feel particularly sorry for the people who paid $500 for a 9800XT this time last year, only to have it beaten by a card that costs 40% as much a year later.

I'm sure the 6600GT will get even cheaper after it's been out for a while (which would make it even more worth picking up).
Grinder66 said: Does no one feel sorry for us saps who paid $300 and $400 respectively for the GF3 and GF4 Ti 4600 a few years back? I won't mention the Voodoo 1, 2... and all the other Voodoo cards I bought way back when. FFS.

How about those who preordered and got 5800 Ultras? LOL
stelleg151 said: So I am as excited as everyone about this card, just waiting for these slowpoke mobo manufacturers to catch up. I do have one question: does anyone else notice the awful image quality in the far-left picture in the Far Cry section of the review? It's awful; the tiles are all pixelated and there are really grainy edges around some parts of the wall. All the other pictures looked OK, but this one scares me. Could it be shader code scaled down to lower quality for fps? Just curious. Other than that, the card is sweet, and I also can't wait for SLI performance.

Thanks