G8800 Series 3DMark06 Score

annaconda

[H]F Junkie


"The upcoming GF8800 line will hit just under 12,000 in 3DMark06, with a new set of drivers due before the non-denominational year-end festive season, pushing it to just over 12K.

Not quite enough to beat R600 from what I hear though. µ
"
 
Yeah, it's pretty good, but not quite up to some expectations. It seems it's not a whole lot faster than the GX2.

I get about 5500 with "high quality" settings and around 7700 (if I remember correctly) with "high performance" settings on my single 512MB X1900XTX.

3DMark scores mean almost nothing if the quality settings are not listed. It's probably safe to assume that most people use the "high performance" setting in order to inflate their scores.
 
As they mentioned, it isn't as good compared to R600 expectations, but I think it is very good compared to today's cards.

I don't see any SLI or CrossFire setup that is able to break the 10,000 mark in 3DMark06. So if a single G80 can do 12,000, that is more than good, or I should say excellent.

Correct me if I am wrong?
 
I think the ATI rumor mill is grossly exaggerating. In the end I'm willing to bet the R600 will be pretty damn close to the G80.

12K is pretty damn nice. If they would just release the damn things, I could build my new rig and enjoy one.
 
The top 20 3DMark06 scorers are all in the 12,000s. That's really not that bad, considering lots of those people have SLI'd 7900 GTXs under extreme volt and core mods (like 700+ core).

So yeah, one card can do as well as 2 heavily modded GTX's...

Those GTXs score, what, 5000 stock? 6500 overclocked?

Considering that these cards will probably overclock to 14000 3D06 marks... it's twice the performance of the GTX, or about the same performance in SLI (but for half the cost).

Either way, it certainly beats my score of about 4800 :p.

Looks nice to me!!
 
These G80 scores are excellent, from my point of view.

This is a big jump from the last-generation cards. We've never had such a big performance jump, almost more than twice the previous generation. :D

I don't think the R600 will be much faster than this either. How in the hell would NVIDIA let that happen? They have inside resources and they know what they are doing, so whoever can afford this puppy at launch time, buy it and enjoy. :cool:
 
annaconda said:
These G80 scores are excellent, from my point of view.

This is a big jump from the last-generation cards. We've never had such a big performance jump, almost more than twice the previous generation. :D

I don't think the R600 will be much faster than this either. How in the hell would NVIDIA let that happen? They have inside resources and they know what they are doing, so whoever can afford this puppy at launch time, buy it and enjoy. :cool:


Will Do! :p
 
A synthetic benchmark doesn't equate to real-world gaming...

It could just be that the G80 is better optimized for 3DMark 2006...

Just like the Radeon X1600 Pro gets decent scores in 3DMark 2006 but gets hammered by the X850 XT in real-world gaming benchmarks.
 
Scyles said:
3DMark scores mean almost nothing if the quality settings are not listed. It's probably safe to assume that most people use the "high performance" setting in order to inflate their scores.
Why would this be a more correct assumption? It seems vastly more likely that the standard Quality settings were used to achieve 12K performance, as that is the default quality level and is almost without question the most widely used mode by gamers, reviewers (because that's what nVidia recommends that reviewers use) and hardcore benchmarkers.

Marvelous said:
A synthetic benchmark doesn't equate to real-world gaming...
The two aren't equatable, but there's a significant correlation, especially with 3DMark06.
 
So who is going to play 3DMark anytime soon?

I for one don't see how a synthetic benchmark accounts for real-world gaming.

I am waiting for real-world gaming tests; Crysis, for example, would be a good benchmark.
 
Better than current gen, yes, but weren't people saying it should be even more of a performance boost compared to current gen? Aren't we back to that whole limitation of DX9 and how much data it can buffer, etc... blah blah blah...??
 
Fuck 3DMark; ATi has historically had higher scores.
Let's see what [H] has to say about all this.
 
3DMark06 is basically a formalized and standardized game timedemo. Unlike most games, it runs through ALL the aspects of a card's performance. Game timedemos are not a reliable test, either. I think the real problem is that AA is not enabled by default; they should really require 4xAA as a standard.
 
I'll agree that the 3DMark series of benchmarks is generally useless...

But this card is beating the next best (single) card by almost 2x!

Who cares if it's synthetic, it's still a HUGE margin.
 
Arcygenical said:
I'll agree that the 3DMark series of benchmarks is generally useless...

But this card is beating the next best (single) card by almost 2x!

Who cares if it's synthetic, it's still a HUGE margin.

I think everyone's blowing it out of proportion. 3DMark hasn't even been run on these cards yet; it's simply speculation and marketing mumbo jumbo.

As the quote said, the "GeForce 8800 line to hit 12K in 3DMark"; they didn't say it did, but speculated that it will.

Let's just be patient and see what happens; it may hit that mark and it may not.
 
I don't care about 3DMark06, but I would love to see the SM2.0 and SM3.0 scores instead of a score that includes the CPU.
 
annaconda said:
As they mentioned, it isn't as good compared to R600 expectations, but I think it is very good compared to today's cards.

I don't see any SLI or CrossFire setup that is able to break the 10,000 mark in 3DMark06. So if a single G80 can do 12,000, that is more than good, or I should say excellent.

Correct me if I am wrong?

Not wrong, because this is what was expected. The G80 is basically as fast as or faster than two of the current fastest cards in SLI/CrossFire.
 
12,000 is a very very good score in my opinion, considering that's right out of the box.
However, we should all wait to pass judgement until we get a [H] review titled "Real World G80 vs. R600"
 
One problem with 3DMark06's overall score is that it takes CPU performance into consideration for the overall result, so you are always CPU-limited and bound by that in the overall score.

To find gaming performance, though, you have to use games, period.
 
Brent_Justice said:
One problem with 3DMark06's overall score is that it takes CPU performance into consideration for the overall result, so you are always CPU-limited and bound by that in the overall score.

To find gaming performance, though, you have to use games, period.
Which I'm guessing you are doing right now? :D I mean, like you would farm this one out to a member of your review team... yeah right!
 
3DMark06 is more relevant today than ever. Since the recent advent of powerful video cards, the GPU is no longer the bottleneck. It is the CPU. So, if you want to measure performance, the CPU needs to be considered. But the CPU numbers generated only play into the score by an arbitrary amount (3DMark Score = 2.5 x 1.0 / ((1.7/GS + 0.3/CPU Score) / 2)).
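For anyone curious how that weighting plays out, here is a rough sketch of the formula above in Python (not Futuremark code; the GS and CPU values are made-up examples, chosen only to show how little the 0.3 CPU term moves the overall result):

```python
# Sketch of the 3DMark06 overall-score weighting quoted above.
# gs is the graphics score, cpu is the CPU test score; both example
# values below are hypothetical.

def overall_3dmark06(gs: float, cpu: float) -> float:
    # Weighted harmonic mean: 2.5 * 1.0 / ((1.7/GS + 0.3/CPU) / 2)
    return 2.5 * 1.0 / ((1.7 / gs + 0.3 / cpu) / 2.0)

print(round(overall_3dmark06(gs=2500, cpu=1500)))  # ~5680
print(round(overall_3dmark06(gs=2500, cpu=3000)))  # ~6410; doubling the CPU score adds only ~13%
```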

And, appropriately, a Conroe-based system barely outruns an AMD-based system in this benchmark. If you look at Aquamark3 (the worst of the synth benches), 3DMark05 or 3DMark06, the Conroe walks the AMD scores by a country mile. In games, however, that margin is a whole lot smaller, which, by no coincidence, is similar to the ratios in 3DMark06. So, getting 12K in 3DMark06 with a single card is no small feat and does show the actual in-game performance that can be had with this new card.

Personally, I am not waiting three-plus months for an R600 versus G80 comparison if this card is available in the next few weeks. I sold my 7950 GX2 cards for a decent amount and have a couple of GTO cards in hand as an interim solution (which run just under 10K with HQ settings in 3DMark06, the same as a single 7950 GX2, BTW).
 
HeavyH20 said:
3DMark06 is more relevant today than ever. Since the recent advent of powerful video cards, the GPU is no longer the bottleneck. It is the CPU. So, if you want to measure performance, the CPU needs to be considered.
I think you've got that backwards. The CPU hasn't been much of a factor lately (as long as you've got a recent vintage CPU). Games have been GPU limited for some time now. Only with the G80 and R600 are we possibly looking at an era where CPU upgrades will become meaningful again.

CPUs & Real-World Gameplay Scaling
 
3DMark is an e-penis measurement :) In theory it allows you to compare different video cards; however, I prefer game tests to synthetic ones. The one use for 3DMark06 that I know of is finding bad 7900 series cards with the Deep Freeze test...
 
Brent_Justice said:
Games are more relevant.


If you test it the way the end user plays, yes. Otherwise, one review versus the next varies wildly in bench results on the same game. So it is all relative, and 3DMark06 is relative as well. A non-standard timedemo on site A says 45 fps and another says 58 fps, but they both get the same score in 3DMark06. Timedemos are synth tests as well, and are even less relevant since there is ZERO standard.

And e-penis scores are determined by game demos as well. I think it is a mistake to leave out 3DMark06 video scores altogether; they help me decide on a relevant path. Probably why I skip the video reviews here.

CPUs & Real-World Gameplay Scaling


Read that. It looks to me like the CPU does help.
 
Depends on the game, really. In WoW, yes, but probably not in FEAR or Oblivion. Besides, by the time you're looking at a CPU bottleneck, everything's maxed out and the fps is so high that you don't care anymore.
 
I definitely disagree with you here, Brent. You're right -- you can't determine gaming performance with 3DMark, but can you not determine rendering performance with it? 3DMark06 has a far more "raw numbers" appeal to me, even though I don't directly equate it with gaming performance. As I said before, the correlation between gaming performance and 3DMark performance is significant. We've seen a few odd things happening here and there, where some architectures seem to favor 3DMark even though real-world gaming performance is not as impressive, but, on the whole, the correlation is always present, and we can use 3DMark as a sort of preliminary gauging tool for judging the improvement from one graphics card to the next. We just tend to hope that we have more to look at than 3DMark scores.

In all fairness, I see real-world FRAPS runs being far less representative of a card's theoretical performance than anything else. Not only are you factoring in the CPU to a far more significant degree, but you're also adding other subsystems, such as the sound card and the hard disk. One could say "who cares about theoretical performance", but theoretical performance tells us more about how a card will fare in other/future titles, while FRAPS runs tell us how a card fares in one game, at these settings, on this platform, at this very particular point in time. The [H] testing methodology does not determine gaming performance: it determines These Games We've Tested gaming performance.

If 12,000 is the number for the 8800 GTX, that's a good number, and we should expect gaming performance to follow suit, should we not?
 
phide said:
If 12,000 is the number for the 8800 GTX, that's a good number, and we should expect gaming performance to follow suit, should we not?

No, we should not. Games behave differently than 3DMark.

If you use the 3DMark results to decide how well a video card performs I feel sorry for you. 3DMark performance does not equal game performance.

3DMark is good for watching the pretty demos, testing stability, and using the multitexture test to make sure your core overclock is working, and that is all.

It cannot tell you how Crysis will perform or look with a certain video card, only Crysis can tell you that.
 
I never attempted to debate the fact that 3DMark is not a game and its results do not always directly correlate with what is witnessed in gaming performance, but do you not feel there is any correlation at all between 3DMark scores and video card gaming performance? If there's one thing that I've learned from fooling around with 3DMark over the years, it's that it responds predictably to changes in hardware. A GPU overclock nets gains, a CPU overclock nets gains, and hardware upgrades net gains. This is the exact same scenario with games. Overclocks and upgrades yield better game performance. The gains may be minimal, especially in the "real-world gaming" type of scenario, where bottlenecks are a dime a dozen (and these bottlenecks are a substantial consequence of the [H] methodology), but performance increases are still reflected with both testing methods (3DMark/timedemos and FRAPS runs).

You would most certainly use Crysis to determine how Crysis performs on a particular card, but what does that tell you about how one card's performance translates to another game? If the mindset is that 3DMark06 tells you only how 3DMark06 performs, the same mindset would apply to Crysis. Benchmarking with Crysis tells you only how Crysis performs, so the data observed does not correlate to any other game, even those that are also included in your testing gamut.

My mindset is that 3DMark and Crysis performance can be useful guides in determining overall performance of a card. I make assumptions based on a variety of data.

Certainly don't feel sorry for me, Brent -- 3DMark is not, and has never been, the sole determining factor in my video card purchases, nor do I weight it significantly in my decision making process.
 
Could we not argue yet again about 3DMark vs "real world" testing? :eek: :eek:

They both have their uses...

12000 in 3DMark 2006 means as much to me as some Crysis results.
 
http://www.theinquirer.net/default.aspx?article=35258

WANT HARD NUMBERS on G80? Want specs? Well, you can find the specs here at PC Welt, but they are in German. As for numbers, we told you that they would be at about 12,000 give or take a driver revision.

That brings up the question of 'on what'? Those numbers are on a Kentsfield, how will it do on your CPU?

Well, a Conroe 2.93 will score about 10,500, and a FX-62 will hit almost 10,000 on the nose. It is interesting that the quad will hit notably higher with a lower clock but a higher bus. I wonder if it is the threading or the FSB giving the kick? µ
 
 
razor1 said:
That's BS. When changing CPUs, the 3DMark06 scores don't change by the CPU score amounts; they use some formula that gives more importance to the graphics scores. It's pretty obvious the Inq is lying about the numbers, not to mention that now Charlie is involved in spreading FUD along with Fuad.

Changing CPUs does change the score, sometimes by quite a bit.

Run 3DMark with your CPU at stock, then OC it and run again to see the difference.
 