3dmark05 scores from Inquirer

btf

http://www.theinquirer.net/?article=18734

3Dmark 05 scores revealed

Before launch


By Fuad Abazovic in Wien: Tuesday 28 September 2004, 19:22

WE HAVE SOME numbers that we would like to share with you before Futuremark finally introduces this long-awaited test.
With current drivers, Nvidia will score 5000 points with the 6800 Ultra, while the X800XT PE will score only 4500 with the 4.9 drivers. With the new ATI driver codenamed 8.07, ATI's score will jump to an incredible 5800. All these scores are on a 3.4GHz Pentium 4 machine. The ATI driver will be introduced as Catalyst 4.10, and we learned it should get its WHQL very soon.

The Geforce FX 5950 Ultra will score a modest 1270, while the Radeon 9600XT will provide 1570 marks. The mainstream card kicks the former high end in its performance nuts.

The Geforce 6800 GT will score 4500.

You will be able to see many more scores on the 29th of this month at 3PM CET, when Futuremark lifts the non-disclosure agreement (NDA) veil. µ
 
while the X800XT PE will score only 4500 with the 4.9 drivers. With the new ATI driver codenamed 8.07, ATI's score will jump to an incredible 5800.

Uhhhh....I find that a *little* difficult to believe. In fact, quite impossible. Worm (one of the Futuremark staff) has certainly implied that the 6800s are faster in 3DMark05 due to PS3.0 being used in every test as a method of accelerating the rendering. The X800s don't have that. (Granted, his comments were along the lines of "it doesn't give as much of a lead as you'd think", but I'd hardly take that to mean "it's way the hell behind ATI's score".)

Especially given that 3DMark05 isn't using 3Dc.

Either TheInq is just spouting BS, or ATI is doing something VERY fishy in their new drivers.
 
dderidex said:
Uhhhh....I find that a *little* difficult to believe. In fact, quite impossible. Worm (one of the Futuremark staff) has certainly implied that the 6800s are faster in 3DMark05 due to PS3.0 being used in every test as a method of accelerating the rendering. The X800s don't have that. (Granted, his comments were along the lines of "it doesn't give as much of a lead as you'd think", but I'd hardly take that to mean "it's way the hell behind ATI's score".)

Especially given that 3DMark05 isn't using 3Dc.

Either TheInq is just spouting BS, or ATI is doing something VERY fishy in their new drivers.


Weren't their new drivers supposed not to affect synthetic benchmarks? :eek:

Isn't Fuad the one who always seems to get the wacky news anyway?
 
Well, now that those numbers have been revealed, it saves us all the time of benchmarking and finding out we get the same scores.
 
dderidex said:
Uhhhh....I find that a *little* difficult to believe. In fact, quite impossible. Worm (one of the Futuremark staff) has certainly implied that the 6800s are faster in 3DMark05 due to PS3.0 being used in every test as a method of accelerating the rendering. The X800s don't have that. (Granted, his comments were along the lines of "it doesn't give as much of a lead as you'd think", but I'd hardly take that to mean "it's way the hell behind ATI's score".)

Especially given that 3DMark05 isn't using 3Dc.

Either TheInq is just spouting BS, or ATI is doing something VERY fishy in their new drivers.

Here's an interesting thread over at Rage3D about the new beta drivers...
That kind of increase does seem a little suspicious. ;)

http://www.rage3d.com/board/showthread.php?t=33783538
 
The countdown continues...drum roll...drums drumming... We'll just have to wait and see. Is 3DMark05 going to be any better at gauging real-time gameplay than any other synthetic benchmark? Is Nvidia going to beat ATI? Who cares!
 
Agreed. 3DMarkXX is not good for actual benchmarking; rather, it's great eye candy and a good idea of what your card can handle.

I don't care who wins, but I do find it strange that an old chipset can get a 1000+ point boost from a 'magical' beta driver released just prior to 3DMark05's launch. Common sense points me in one direction ... :/
 
Hmm.. Hopefully I can pull 3500+ on my X800 Pro. That would be cool, just to at least see it in motion as opposed to a slide show. I HATE slideshows; then again, I don't think anyone likes them. So.. yeah. heheh :(
 
Just for the sake of it:

What if the new ATI drivers are finally those magical drivers they've been searching for? Hey, just saying.
 
Well, if they are doing it in a synthetic benchmark, that's unethical, and there is always a difference in IQ; over at Rage3D they were saying the gloss level changes :). If they truly found a way to replace shaders and get away with it, that 5800 score, if it's true, will be highly scrutinized.

ATi has to do something; nV's refreshes are around the corner ;)
 
rancor said:
Well, if they are doing it in a synthetic benchmark, that's unethical, and there is always a difference in IQ; over at Rage3D they were saying the gloss level changes :). If they truly found a way to replace shaders and get away with it, that 5800 score, if it's true, will be highly scrutinized.

This could get interesting. Weren't these drivers supposed to be a "hotfix" for a specific game (KOTOR, I believe)? I saw another thread there (Rage) where a couple of people benchmarked these drivers with several games. The improvements in FPS were modest at best.

I think we are overdue for a little scandal. Pretty boring lately. ;)
 
LOL, yes they were. Scandal? God no, too many flamewars, lol. But anyways, who knows; we'll just have to see tomorrow :)

That ~30% increase in performance ((5800 − 4500) / 4500 ≈ 29%) is just crazy for a mature driver set. I could understand it if there was a major bug, but really, you'd think Futuremark would have notified ATi about a bug long before these drivers were even in the works. And then why wouldn't that bug be noticeable in other games?
 
The hotfix also added Catalyst AI to the drivers, which would be the reason for the performance increase. Several games have seen 30%+ increases, if you are to believe the benchmarks people throw around.
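For anyone wondering what "app-specific" actually means mechanically: the usual technique is for the driver to recognize which executable is running and swap in tuned settings (or replacement shaders) for it. Here's a purely hypothetical C++ sketch of the detection half; ATI has never published how Catalyst AI identifies applications, so the Win32 calls are real but the profile table and names are invented:

```cpp
// Hypothetical illustration of app detection in a driver: key tuned
// settings off the host executable's name. GetModuleFileNameA is real
// Win32 API; the profile table and entries are made up for the example.
#include <windows.h>
#include <string.h>

struct AppProfile { const char* exeName; bool aggressiveFiltering; };

static const AppProfile kProfiles[] = {
    { "game_a.exe", true  },   // invented entries, not real profiles
    { "game_b.exe", false },
};

const AppProfile* DetectApp()
{
    char path[MAX_PATH];
    GetModuleFileNameA(NULL, path, MAX_PATH);  // full path of host .exe
    for (const AppProfile& p : kProfiles)
        if (strstr(path, p.exeName))           // crude substring match
            return &p;
    return NULL;                                // no profile: defaults
}
```

(This is exactly why renaming a benchmark's executable became a popular way of testing for these optimizations.)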
 
I think 3DMark05 will convince me to overclock my BFG 6800 GT, since I was happy getting over 10,000 in 3DMark03.
 
If that's the case, then they are doing shader replacement, and this could get ugly. Hopefully those optimizations aren't on by default for 3DMark05.
 
rancor said:
LOL, yes they were. Scandal? God no, too many flamewars, lol. But anyways, who knows; we'll just have to see tomorrow :)

That ~30% increase in performance ((5800 − 4500) / 4500 ≈ 29%) is just crazy for a mature driver set. I could understand it if there was a major bug, but really, you'd think Futuremark would have notified ATi about a bug long before these drivers were even in the works. And then why wouldn't that bug be noticeable in other games?

Besides KOTOR, these drivers are supposed to fix some kind of memory allocation bug.
http://www.driverheaven.net/#news57116

How fortunate that ATI was able to release this revision just in time for 3DMark05.
 
I will give ATi the benefit of the doubt, although I don't trust the Inq much. If the drivers get Futuremark approved, that would clear them in my eyes. It's always possible that the old set of ATi drivers had a bug. Anyways, I don't see why it would surprise anyone if ATi outperformed nVidia in 3DMark; before Doom III the X800 XT was pretty much declared faster by most sites, and even after the SM3.0 patch for Far Cry, although nVidia closed the gap a bit, ATi still generally did slightly better in Direct3D.
 
Particleman said:
I will give ATi the benefit of the doubt, although I don't trust the Inq much. If the drivers get Futuremark approved, that would clear them in my eyes. It's always possible that the old set of ATi drivers had a bug. Anyways, I don't see why it would surprise anyone if ATi outperformed nVidia in 3DMark; before Doom III the X800 XT was pretty much declared faster by most sites, and even after the SM3.0 patch for Far Cry, although nVidia closed the gap a bit, ATi still generally did slightly better in Direct3D.


For how many years have they had this bug? They have been using 256MB cards for three years now.
 
I was suggesting more of a bug specific to 3DMark05. Who knows where the increase might have come from. But ATi has stated that their app-specific optimizations are for games only, so until it's proven otherwise, I'll give them the benefit of the doubt.

I own a 6800 GT, btw, but I don't see why people suddenly have to view everything ATi does with greater criticism.
 
I'd bet it's because ATi has not coded the 2.0b part of the X800 into the current Catalysts yet (still rumored to be running only the original 2.0). I.e., an X800 with 8 pipes is most definitely faster than a 9800 with 8 pipes at the same MHz, if you have the drivers set to use the maximum shader model for the respective cards, that is.

ATi probably felt no need to code in full 2.0b driver support yet. Surprisingly enough, no one seems to notice that it's still running 1.4/2.0 unless the application itself specifically calls for it, or uses its own set of code. I'd almost think ATi was *waiting* for a benchmark or native game to use PVS 2.0b or 3.0 before releasing a driver set that turns it on, just to say *nyah*, look, it's just as fast if not faster than before. Sort of like Scotty: he's only gonna give you extra power when you "need" it; if you don't ask for it, ya don't get it.

The Radeon 9700 runs circles around a 5950 in DX9-class games; I don't think many people would argue that. But it is strange to see a 9600XT (128-bit memory) outperforming a 5950.

Also: one thing I do find odd is how, when we're talking pixel and vertex shaders, NVidia always abbreviates it as PS and not PVS. ATi doesn't really shorten it to anything, but ATi also seems to be spending much more transistor space on vertex shaders than Nvidia, even though it's mostly dormant. Maybe ATi is thinking there will be more vertex-complex apps/games in the future?
 
Particleman said:
I was suggesting more of a bug specific to 3DMark05. Who knows where the increase might have come from. But ATi has stated that their app-specific optimizations are for games only, so until it's proven otherwise, I'll give them the benefit of the doubt.

I own a 6800 GT, btw, but I don't see why people suddenly have to view everything ATi does with greater criticism.

It's like that with any card that gets a 30% boost out of nowhere.
 
rancor said:
For how many years have they had this bug? They have been using 256MB cards for three years now.

True, but the X800 has a much more versatile memory controller than the 9800 series did.
 
Particleman said:
I was suggesting more of a bug specific to 3DMark05. Who knows where the increase might have come from. But ATi has stated that their app-specific optimizations are for games only, so until it's proven otherwise, I'll give them the benefit of the doubt.

I own a 6800 GT, btw, but I don't see why people suddenly have to view everything ATi does with greater criticism.

If the numbers that the Inquirer quoted for the new drivers in 3DMark05 vs the previous drivers are accurate, then it's somewhat suspicious. From 4500 to 5800?? That's a huge jump, don't you think?

Like rancor said, we'll know much more in a couple of days. I'm sure there will be no shortage of articles on this new benchmark.
 
poppachocks said:
If the numbers that the Inquirer quoted for the new drivers in 3DMark05 vs the previous drivers are accurate, then it's somewhat suspicious. From 4500 to 5800?? That's a huge jump, don't you think?

If you look at some of the benchies with Catalyst AI on, the increase is just as large. I would assume the same default filtering is going on in all apps.
 
rancor said:
It's like that with any card that gets a 30% boost out of nowhere.

First of all, we're all assuming that the Inq is accurate; to do that, you'd have to assume the Inq's original numbers were correct. It would surprise me more if the X800XT PE performed slower than the 6800 GT, not the other way around; I mean, how many games do you see that in besides Doom III? Anyways, like I said, this debate is a bit iffy, since the whole thing requires you to assume the Inq is correct.
 
ZenOps said:
I'd bet it's because ATi has not coded the 2.0b part of the X800 into the current Catalysts yet (still rumored to be running only the original 2.0). I.e., an X800 with 8 pipes is most definitely faster than a 9800 with 8 pipes at the same MHz, if you have the drivers set to use the maximum shader model for the respective cards, that is.

ATi probably felt no need to code in full 2.0b driver support yet. Surprisingly enough, no one seems to notice that it's still running 1.4/2.0 unless the application itself specifically calls for it, or uses its own set of code. I'd almost think ATi was *waiting* for a benchmark or native game to use PVS 2.0b or 3.0 before releasing a driver set that turns it on, just to say *nyah*, look, it's just as fast if not faster than before. Sort of like Scotty: he's only gonna give you extra power when you "need" it; if you don't ask for it, ya don't get it.

The Radeon 9700 runs circles around a 5950 in DX9-class games; I don't think many people would argue that. But it is strange to see a 9600XT (128-bit memory) outperforming a 5950.

The problem is the shaders have to be written for the PS 2.0b path for it to work, just like SM 3.0.
 
Particleman said:
First of all, we're all assuming that the Inq is accurate; to do that, you'd have to assume the Inq's original numbers were correct. It would surprise me more if the X800XT PE performed slower than the 6800 GT, not the other way around; I mean, how many games do you see that in besides Doom III? Anyways, like I said, this debate is a bit iffy, since the whole thing requires you to assume the Inq is correct.


True
 
Particleman said:
First of all, we're all assuming that the Inq is accurate; to do that, you'd have to assume the Inq's original numbers were correct. It would surprise me more if the X800XT PE performed slower than the 6800 GT, not the other way around; I mean, how many games do you see that in besides Doom III? Anyways, like I said, this debate is a bit iffy, since the whole thing requires you to assume the Inq is correct.

How many SM3 games do you see? None. So who knows what happens when SM3 is used aggressively.
 
rancor said:
The problem is the shaders have to be written for the PS 2.0b path for it to work, just like SM 3.0.

3DMark05 has 2.0, 2.0b and 3.0 paths.
 
101998 said:
3DMark05 has 2.0, 2.0b and 3.0 paths.

Yes, and the path had to be there before, since Far Cry was able to use their PS 2.0b path in patch 1.2.
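For what it's worth, here's roughly how a D3D9 app chooses between those paths at startup: it queries the device caps and picks the highest profile the card reports. A rough C++ sketch; this is not 3DMark's actual code, and the 2.0b check in particular is simplified, but the D3D9 calls themselves are real:

```cpp
// Illustrative only: pick a pixel shader profile from the D3D9 device
// caps. GetDeviceCaps, D3DCAPS9, and D3DPS_VERSION are real D3D9 API;
// the thresholds and card examples are simplified for the sketch.
#include <d3d9.h>

const char* PickShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return "ps_3_0";  // e.g. GeForce 6800 series
    // 2.0b is an extended 2.x profile; apps detect it through the 2.x
    // sub-caps (512 instruction slots being the 2.0b hallmark).
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) &&
        caps.PS20Caps.NumInstructionSlots >= 512)
        return "ps_2_b";  // e.g. Radeon X800 series
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "ps_2_0";  // e.g. Radeon 9800 / GeForce FX
    return "ps_1_x";      // older DX8-class parts
}
```

In other words, a 6800 and an X800 can run genuinely different shader code in the same test, which is part of why comparing their scores gets murky.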
 
The Batman said:
How many SM3 games do you see? None. So who knows what happens when SM3 is used aggressively.

The only reference we have at the moment is the Far Cry patch that was recalled; using that as a reference, SM3.0 certainly didn't allow the GT to get anywhere close to the XT PE. Anyways, like I said, this is the Inq we are talking about: these are the same guys who failed to mention that their 6800 3DMark numbers were run at 800x600 in the run-up to the 6800 launch.

I for one don't care much about 3DMark scores anyways; I wouldn't care who stomped who in 3DMark. I just want nVidia to fix the texture aliasing bug, which makes a lot of games look horrible.
 
dderidex said:
Uhhhh....I find that a *little* difficult to believe. In fact, quite impossible. Worm (one of the Futuremark staff) has certainly implied that the 6800s are faster in 3DMark05 due to PS3.0 being used in every test as a method of accelerating the rendering. The X800s don't have that. (Granted, his comments were along the lines of "it doesn't give as much of a lead as you'd think", but I'd hardly take that to mean "it's way the hell behind ATI's score".)

Especially given that 3DMark05 isn't using 3Dc.

Either TheInq is just spouting BS, or ATI is doing something VERY fishy in their new drivers.

Friggin drivers are not even out yet and already ATi is cheating...
:rolleyes:
 
This suggests 3D05 will be every bit as useless an indication of actual gaming performance as its predecessors.

As 3D03 scores show up in nearly every review on the net, it is no real stretch to predict that 3D05 scores will as well.

The only winner is the Futuremark corporation.
 
The only thing 3DMark is good for is optimizing your own system. Useless for comparing against other hardware.

I only use it to optimize OC settings. Once I hit a max on a synthetic, I never touch it again until I get new hardware.
 
Particleman said:
The only reference we have at the moment is the Far Cry patch that was recalled; using that as a reference, SM3.0 certainly didn't allow the GT to get anywhere close to the XT PE. Anyways, like I said, this is the Inq we are talking about: these are the same guys who failed to mention that their 6800 3DMark numbers were run at 800x600 in the run-up to the 6800 launch.

I for one don't care much about 3DMark scores anyways; I wouldn't care who stomped who in 3DMark. I just want nVidia to fix the texture aliasing bug, which makes a lot of games look horrible.

From what I hear, Far Cry wasn't really SM3 heavy; it just threw in a few features here and there. As for the aliasing bug [I don't have it myself, thankfully], lower your LOD. It's a stopgap until Nvidia does something about it.
 
Be that as it may, it is the only point of reference to go on at the moment, and it's certainly more shader intensive than 99% of the games out there. As for the LOD workaround, it isn't really a fix, as it blurs all the textures; it's like trading one problem for another.

Anyways, back to 3DMark05: I certainly don't think ATi is dumb enough to put in app-specific optimizations for 3DMark05 after they explicitly said they would not use benchmark-specific optimizations, and I think it's wrong to accuse them of doing so with no evidence to back it up.
 