8800GTX 3DMark05 Result - 19114!

What's with the sudden attitude at [H]? Sorry, I thought this deserved a bit of visibility...
 
Well, for one, this is old news, and you could have posted it as a new topic instead of both a reply and a new topic. It's just a bit annoying to see this twice within 10 seconds.

time for in game reviews
 
Old news, look at the date on the link...

First confirmed 3D05/06 runs, so totally not worthy :rolleyes:
 
Kick ass numbers, I'll take that setup off his hands in a heartbeat.

I guess I didn't realize that Kyle died and left Larkin the biz in his will.
 
man if you're going to play 3dmark, at least play 3dmark2k6!

i mean most games will be played with AA / HDR / AF or some combo of those 3, so why use 3DMark 2005, which represents, what, 2-year-old tech now :p ?
 
Digital Viper-X- said:
man if you're going to play 3dmark, at least play 3dmark2k6!

i mean most games will be played with AA / HDR / AF or some combo of those 3, so why use 3DMark 2005, which represents, what, 2-year-old tech now :p ?
His 3Dm06 score: 11314 at stock, 12645 with mild OC

Good enough for ya? Good enough for me!
 
Commander Suzdal said:
His 3Dm06 score: 11314 at stock, 12645 with mild OC

Good enough for ya? Good enough for me!

i know, i saw that, but wondered why he was so excited about the 2005 score
and yes
very excited, i can't wait to play 2k6
 
Yes...interesting...

Now just imagine how much higher that score will be when NVIDIA releases actual/official drivers for the 8800GTX!

Hmm anyone willing to bet 13 or 14K in 06?

Can't wait till November gets here and we start to see some real in game benches.
 
Digital Viper-X- said:
i know, i saw that, but wondered why he was so excited about the 2005 score
and yes
very excited, i can't wait to play 2k6
Ah, gotcha, good point then...

I'm ready for my mid-life Crysis!
 
I'm going to play devil's advocate. The 3dmark05 score proves it's faster than a bat out of hell on all games including the ones we already own. If all we had were 3dmark06 and Crysis reviews, we'd have dozens of people screaming for performance benches on existing popular games.

So anyway, it's useful info for getting a complete picture of the card's performance.
 
Advil said:
I'm going to play devil's advocate. The 3dmark05 score proves it's faster than a bat out of hell on all games including the ones we already own. If all we had were 3dmark06 and Crysis reviews, we'd have dozens of people screaming for performance benches on existing popular games.

So anyway, it's useful info for getting a complete picture of the card's performance.

I agree with Advil...3Dmark is a great way of judging the overall performance of a card, and gives a good basis for what to expect in game.
 
Advil said:
I'm going to play devil's advocate. The 3dmark05 score proves it's faster than a bat out of hell on all games including the ones we already own. If all we had were 3dmark06 and Crysis reviews, we'd have dozens of people screaming for performance benches on existing popular games.

So anyway, it's useful info for getting a complete picture of the card's performance.

what popular titles are there that 3DMark 2005 still reflects :S ? Even 2k6 seems pretty useless for gauging a G80, at least for me. I imagine we will run most games with some sort of AA or HDR on, or both, so why would you want to know how well the card performs without those features enabled?

I think each architecture takes a different performance hit when different types of IQ enhancers are turned on (AF / AA / ADAA / SSAA), which is the real reason I'm saying: why would it matter to get a score for how well a card performs without those features enabled? Chances are you will be playing with them on, so you want to know how well it performs while using those features, rather than how well it plays a game that your previous 7900 card already handled at 1600x1200 4xAA / 16xAF perfectly fine.
 
Pretty hefty increase...but we won't need it for a while unless you will die without having 200fps lol. I mean hell, I plan on keeping my 7900GT at least till next summer...
 
That's a very overclocked CPU; reset it all to stock and see what happens. The X1950XTX pulls in 14k in 3DMark05...19k with an extremely overclocked CPU is nothing to cheer about.
 
Endurancevm said:
That's a very overclocked CPU; reset it all to stock and see what happens. The X1950XTX pulls in 14k in 3DMark05...19k with an extremely overclocked CPU is nothing to cheer about.

lolz , that's right

i got 13k with my overclocked x1900xt,

ati will pwn nvidia , again
 
Endurancevm said:
That's a very overclocked CPU; reset it all to stock and see what happens. The X1950XTX pulls in 14k in 3DMark05...19k with an extremely overclocked CPU is nothing to cheer about.
That's what I was thinking. If that guy does actually have the 8800, there shouldn't be any drivers out yet that are optimized for it, correct? That could hamper the performance level.
 
bobrownik said:
lolz , that's right

i got 13k with my overclocked x1900xt,

ati will pwn nvidia , again




*sniff sniff* what's that i smell oh yeah fan-b0y33 ch33se
:rolleyes:
 
Shoot me if I ever pose like a Sears model in a white tux, but great numbers.
 
TheRapture said:
*sniff sniff* what's that i smell oh yeah fan-b0y33 ch33se
:rolleyes:


yeah im a ati !fan-b0y33 , because they make better hardware...

and someone has to go against paid nvidia "fan-b0y33s"
 
what about sioux's 16500 with a maxed-out 1950XTX and a 4.45GHz CPU?
http://xtremesystems.org/forums/showthread.php?t=118488
Compare that to 20900 with a 3.9GHz CPU and only a small core overclock. Not even the RAM was overclocked. I could see 24k-25k with a maxed-out card and a 4.4GHz CPU. That sounds like a 50% improvement... with the current results there is a 26% difference.

An even better comparison is his 3.9GHz and maxed 1950XTX in 06: 8080 pts vs 12600 with a mild GPU overclock. 55% increase. FIFTY FIVE... (5*10) + 5.

This GPU has the balls, just admit it. I mean, it better do something considering all the power it draws. For God's sake, it needs 2 PCI-E connectors!! Aren't those like 110 watts each??? Plus the PCI-E slot power. It's powerful, and it will beat the crap out of a 1950XTX. Face it, folks.
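For what it's worth, the percentage claims above can be sanity-checked with a couple of lines (a rough sketch; the scores are just the ones quoted in this thread, not independently verified):

```python
# Quick sanity check of the score deltas quoted above.
# Scores are the ones claimed in the thread, not independently verified.

def pct_gain(old, new):
    """Percentage increase going from score `old` to score `new`."""
    return (new - old) / old * 100

# 3DMark05: maxed-out X1950XTX @ 4.45 GHz CPU vs. 8800GTX @ 3.9 GHz CPU
print(round(pct_gain(16500, 20900)))  # -> 27 (the "26%" in the post, truncated)

# 3DMark06: maxed X1950XTX vs. 8800GTX with a mild GPU overclock
print(round(pct_gain(8080, 12600)))   # -> 56 (the "55%" in the post, truncated)
```

So the numbers hold up to within rounding.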
 
It doesn't matter what happens with the CPU or any overclocking. The difference is almost negligible relative to how fast the 8800GTX theoretically should be.

We should wait for BOTH the 8800GTX and R600 to come out with optimized drivers; then we can judge by REAL WORLD gaming performance, not by a synthetic benchmark.

There is no reason not to wait, because the current batch of high-end video cards are not being stressed except by some stupid games like Oblivion and COD2.
 
I don't imagine there would be a huge performance gap with current hardware in current games because of that; they are meant to run on DX9. But I'm sure if 3DMark 2k7 does DX10, all the X1950XTXs in the world would score a big fat 0 :(
 
bobrownik said:
yeah im a ati !fan-b0y33 , because they make better hardware...

and someone has to go against paid nvidia "fan-b0y33s"

I like both, I have owned both....
 
VCAA - what does it stand for.. anybody know?

Very Correct Anti-Aliasing? LOL?!?
 
Endurancevm said:
There is no reason not to wait, because the current batch of high-end video cards are not being stressed except by some stupid games like Oblivion and COD2.

While I agree with your statement, I have to chime in and say that Call of Duty 2 is FAR from stupid.
 
Bo_Fox said:
VCAA - what does it stand for.. anybody know?

Very Correct Anti-Aliasing? LOL?!?

That's actually something I haven't seen speculated about. I mean, AA was NVIDIA's edge over ATI. ATI shined in AF with their angle-independent filtering, but NVIDIA shined with TSAA. If this rumored new AA mode is an improvement over TSAA, then we are in for a treat in IQ.

And I have no idea what it stands for :)
 
bobrownik said:
yeah im a ati !fan-b0y33 , because they make better hardware...

and someone has to go against paid nvidia "fan-b0y33s"

I really don't understand comments like these...
Both of them make excellent hardware, and the fact that they make it is good for us. It's called competition, which drives prices down.
 
Dunno why people fuss about the two companies. Everyone has their favorite. I myself always favored NVIDIA over ATI, but I have never let that get in the way of purchasing the best video card my money could buy. The last ATI card I did purchase was the 9800 Pro with the R360 core. I'm definitely going after the 8800GTX, and if the top R600 cards just happen to be better by a good margin, I would switch to that. Brand doesn't matter to me in the long run; performance does.
 
"Ahem...Ummm Victor? Jeeez I hate to bother you while you're trying on another tux...and I promise I wont touch the pleather on the camo SUV seats again...but could you post your full system specs... I sorta wanted to know what PSU you have in your systems..."

Yep... Good numbers... Hope the competition is fierce... and I really would like to know what PSU he's running on.

EDIT>>> PSU = http://www.zippy.com/P_product_deta...S2/PS2+ single&pcpw_rfnbr=5&pp_code=PSL-6701P
 
wow nice...will it be able to run Crysis at a good res and high details?...

and a 700W PSU with a super-OC'ed Conroe and an 8800GTX? didn't people speculate it would need its own PSU?

well, whatever, that score is still amazing
 
Endurancevm said:
Thats a very overclocked CPU, reset it all to stock and see what happens. The X1950XTX pulls in 14k in 3dmark05...19k at extremely overclocked CPU is nothing to cheer about.

I was kinda thinking the same thing...
For that kind of money you could probably even get a Crossfire X1950 system (pending price drops once the G80 comes out, to stay competitive).

But then again, we know nothing until the G80 gets in the hands of [H] and they do a legit review on the retail version.
 
Arcygenical said:
I agree with Advil...3Dmark is a great way of judging the overall performance of a card, and gives a good basis for what to expect in game.

Pretty much the only thing it proves nowadays is how much better either company has optimized their drivers for the specific benchmark.
 
menlatin said:
This GPU has the balls, just admit it. I mean, it better do something considering all the power it draws. For God's sake, it needs 2 PCI-E connectors!! Aren't those like 110 watts each??? Plus the PCI-E slot power. It's powerful, and it will beat the crap out of a 1950XTX. Face it, folks.

Up to 150 watts from the two connectors, theoretically anyway; the PCIe slot itself only puts out 75 watts, and each 6-pin connector is rated for 75 watts too, not 110.
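For reference, here's the spec math behind that, assuming the card uses two 6-pin auxiliary connectors (which is what the leaked board shots show):

```python
# Theoretical power budget for a card with two 6-pin PCIe connectors.
# Spec values: the x16 slot supplies up to 75 W, each 6-pin connector 75 W.
SLOT_W = 75
SIX_PIN_W = 75
NUM_CONNECTORS = 2

connectors_total = NUM_CONNECTORS * SIX_PIN_W  # power from the aux connectors
board_total = SLOT_W + connectors_total        # slot + connectors combined
print(connectors_total, board_total)           # -> 150 225
```

So the theoretical ceiling for the whole board is 225 W, even though nobody expects it to actually draw that much sustained.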
 