GeForce 6800 Ultra / ForceWare 60.72 cheatin' rulez again!

Hmm.. something is not right???:confused:

It looks like ATI is also doing something because their
pictures are not 100% either. My best guess is that
the software used for this test was not working right.

What I'd call cheating is if the drivers detect 3DMark
running and change how they render just for it. Other
than that, this could just be normal optimizations.

We should wait for Nvidia to comment on these drivers.

And we should also ask why Futuremark said these Nvidia
drivers are okay if they are cheating.
 
Originally posted by Tivon
Other
than that, this could just be normal optimizations.

We should wait for Nvidia to comment on these drivers.

?


This is some awesome pump m8 , last revision :

[attached image: 4679.jpg]
 
Originally posted by Tivon
Hmm.. something is not right???:confused:

It looks like ATI is also doing something because their
pictures are not 100% either. My best guess is that
the software used for this test was not working right.

What I'd call cheating is if the drivers detect 3DMark
running and change how they render just for it. Other
than that, this could just be normal optimizations.

We should wait for Nvidia to comment on these drivers.

And we should also ask why Futuremark said these Nvidia
drivers are okay if they are cheating.

You REALLY need to read that article again. :rolleyes:
 
That article is total toss. If this were April 1 I'd say it was a joke, because if you read it and look at the pictures, they do NOT support the article's conclusion at all.
 
What I'd like to know is what kind of fucking moron gives a crap that Nvidia is cheating in synthetic benchmarks? Was I the only one who skipped over 3DMark in all the 4/14 previews [and I'll do the same come the 26th]? The NV40 performs amazingly well with, you know... real games, so who gives a flying fuck about Futuremark anyway?
 
You'd think that after all the NV3x has been through, NV would get rid of the optimizations. If FW 60.72 becomes the standard, I'll be very disappointed.

Also, you can't deny that the Max Payne screenie is a little incriminating. I mean, a big block? What for?
 
The reference software rendering device that the DirectX SDK provides developers will never exactly match any hardware. It's primarily used to enable interactive shader debugging, and can also be used as a simple test to make sure you don't have a graphics driver bug (i.e. does your hardware vary wildly from the ref device). It is also painfully slow, as it was designed to be a visual example of the D3D specifications. The D3D specs are also not exactly or completely defined in all areas. Some areas are still left up to the hardware manufacturers to implement and can vary wildly (AA filters, anisotropic methods, etc.).

There are also around 250 features that D3D9 allows to be optional, of which a good chunk of interesting stuff (~20) still varies quite a bit between ATI and NVIDIA. If you want to use any of these optional features, you get the pleasure of drawing something else when the feature is missing. The REF device has all the optional features, so you have to go out of your way to cripple the REF in your code to match some specific card, which is quite a bit of work and not practical. Some of these optional features come and go with newer drivers, so it's also an interesting problem when making a real product that tries to play nice with what D3D says the hardware can do.
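
For anyone curious, this is roughly what that caps-checking dance looks like in practice. Just a minimal sketch against the D3D9 SDK on my end (the anisotropic-filter bit is only an example of an optional cap, not anything the article tested): ask the HAL device what it actually supports and draw something else when a bit is missing, instead of assuming the everything-supported REF behaviour.

[code]
// Minimal sketch, assuming the D3D9 SDK headers/libs (d3d9.h, d3d9.lib):
// query the HAL device's caps and fall back when an optional bit is missing,
// rather than assuming the REF device's "everything supported" behaviour.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        d3d->Release();
        return 1;
    }

    // Anisotropic filtering is one of the caps that varies from vendor to vendor.
    bool hasAniso = (caps.TextureFilterCaps & D3DPTFILTERCAPS_MINFANISOTROPIC) != 0;
    std::printf("Max anisotropy: %lu (%s)\n",
                caps.MaxAnisotropy,
                hasAniso ? "anisotropic min filter supported"
                         : "missing, so draw with trilinear instead");

    d3d->Release();
    return 0;
}
[/code]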
 
:p WHO CARES its 3DWANK :rolleyes: You buy a damned card to play games, not to toss off at 3DMark scores ;)
 
^^^
Are they talking about the screenshots from the NV40 launch? I thought it was known that NVidia was comparing PS 1.x to PS 3.0. Does someone have a link to that article?? I could've sworn that NVidia said it was PS1.x and PS 3.0.
 
I remember from the vid that the Crytek guy said 2.0/3.0 together really fast, and the Nvidia guy made it seem more as if it was only 3.0 that could look like that, so maybe ATI is just trying to make it clear? Although I have to say it is obviously not 2.0 vs 3.0, as the 'before' shot looks nothing like the game I played :p.

Sorry for spamming articles, but this one seems of interest in light of the 2.0 vs 3.0 debate:

http://www.gamers-depot.com/interviews/dx9b/001.htm

sorry if posted before

nana
 
Originally posted by bananaman
i remember form the vid that the Crytek guy said 2.0/3.0 together really fast, and the Nvidia guy made it seem more as if it was only 3.0 that could look like that, maybe ATI just trying to make it clear?

Far Cry is not an American-made game.
That guy was from Germany; he could barely speak English :rolleyes:
 
Well, he made a pretty good speech and was pretty comfortable speaking English; one doesn't have to be American to speak English (the irony). He knew perfectly well what he was saying, and I am sure he knows the difference between the numbers 2 and 3. He said the 2.0/3.0 path, just quickly.

[edit] typos
 
Um, my interpretation of the pictures:

It seems Nvidia is using the low-detail (distance) textures closer to the camera than they should be used, so more textures are substituted with less detailed ones.

But I can't see the actual pictures, and no one has expressed any significant concern about the IQ of the 6800U. So either this issue doesn't matter, or maybe the software can't recognize where the interpolation is being done. Maybe this card does a pre-interpolation.

If this really is true, I consider it a legitimate optimization, and a good one that yields great results in both speed and image quality. Kudos to Nvidia.
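
If it helps to picture what "low detail textures closer than they should be" means, here is a toy sketch (my own illustration, nothing from the article and certainly not Nvidia's actual driver code; the formula is just the textbook log2 of the texel-to-pixel ratio). A positive LOD bias picks a blurrier, lower-detail mip closer to the camera, which saves texture bandwidth and is roughly the kind of shift the colored-mipmap shots would expose:

[code]
// Toy sketch (my own illustration, not from the article or any driver):
// how a LOD bias shifts which mip level gets sampled for a given
// texel-to-pixel ratio.  A positive bias selects a lower-detail mip sooner.
#include <cmath>
#include <cstdio>

int mipLevel(float texelsPerPixel, float lodBias, int numLevels)
{
    // Base LOD from the texel:pixel ratio, then apply the bias and clamp.
    float lod = std::log2(std::fmax(texelsPerPixel, 1.0f)) + lodBias;
    int level = static_cast<int>(lod + 0.5f);
    if (level < 0) level = 0;
    if (level > numLevels - 1) level = numLevels - 1;
    return level;
}

int main()
{
    const float ratios[] = { 1.0f, 2.0f, 4.0f, 8.0f };
    for (float ratio : ratios)
        std::printf("ratio %4.0f:1 -> mip %d (no bias), mip %d (+0.5 bias)\n",
                    ratio, mipLevel(ratio, 0.0f, 10), mipLevel(ratio, 0.5f, 10));
    return 0;
}
[/code]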
 
ATI also claims that its hardware will run faster than Nvidia's anyway, and he added: "It's pretty much impossible to make a sm3.0 game look noticeably different to a sm2.0 game, which is why Nvidia was comparing the 2.0/3.0 path with a 1.1 path."


Oh, this is going to be a good year for graphics cards! Can ATI pull off the unthinkable? I had my doubts. The 6800U looked like it was invincible. But it looks like ATI is out to keep its reign.
 
And of course they don't have any non-mipmap colored screenshots.
 
Originally posted by obs
And of course they don't have any non-mipmap colored screenshots.

And why would they?

To be clear, I think it is way too early to cry "cheater!" because retail cards with retail drivers are not out. Give them a chance to get it right; by all accounts the NV40 is a big change and the drivers are unlikely to be perfect yet. After the crap that nVidia pulled with the NV3x line, I understand the compulsion to look for cheating, but again, it's too early for me to consider it fair...

But back to mip-map levels. Whether differences in screenshots can be easily seen by eye doesn't matter in a benchmark, because the assumed benchmark baseline is that all tested cards are doing the same amount of work. Overriding the developer's settings in any way is cheating.

The situation is different in a game, and I applaud any IHV that can make a game run better without noticeable IQ degradation.
 
It's a darn shame that the PeterPump ad above is based on fact but this article is not.

I remember when Valve & ATI fanboys were shouting about the requirements of PS2.0 & that nVidia was impotent in that dept. But now that nVidia is the PS3.0 king, all of a sudden PS3.0 is not important. What a coincidence.

DH is just trying to increase its number of site visits and the fanboys are running with it.

This BS is the very reason that [H] does its previews/reviews the way it currently does. This is a case in point.



THE MORAL of the story...
You can put more faith in PeterPump ads than this speculation.
 
Originally posted by davidj
It's a darn shame that the PeterPump ad above is based on fact ......

This doesn't help your argument. You really believe they work? :eek:
 
Originally posted by davidj

I remember when Valve & ATI fanboys were shouting about the requirements of PS2.0 & that nVidia was impotent in that dept. But now that nVidia is the PS3.0 king, all of a sudden PS3.0 is not important. What a coincidence.

Not being critical or anything, but if you do have a detailed look at PS3.0, it isn't much of a leap from PS2.0. Check out the API details for yourself, so you don't have to take anyone else's word for it :)
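
For example, here's a quick sketch against the D3D9 SDK (my own snippet, take it with a grain of salt): the runtime reports the pixel shader model as a single version DWORD, so seeing whether a card advertises ps_2_0 or ps_3_0 is a couple of lines, and the headline 3.0 additions on top of 2.0 are things like dynamic flow control and longer programs.

[code]
// Quick sketch, assuming the D3D9 SDK: read the pixel shader version the
// HAL device advertises and pick a shader path from it.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        std::printf("Pixel shader version: %lu.%lu\n",
                    D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

        if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
            std::printf("ps_3_0 path available\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::printf("falling back to the ps_2_0 path\n");
    }

    d3d->Release();
    return 0;
}
[/code]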
 
[H]ardOCP's review of the 6800U running the latest Far Cry patch says that even THEY don't know what PS3.0 is doing for the game... so yeah, I don't really see how it is all that huge of a deal, at least in Far Cry.
 
Originally posted by LabRat
And why would they?

To be clear, I think it is way too early to cry "cheater!" because retail cards with retail drivers are not out. Give them a chance to get it right; by all accounts the NV40 is a big change and the drivers are unlikely to be perfect yet. After the crap that nVidia pulled with the NV3x line, I understand the compulsion to look for cheating, but again, it's too early for me to consider it fair...

But back to mip-map levels. Whether differences in screenshots can be easily seen by eye doesn't matter in a benchmark, because the assumed benchmark baseline is that all tested cards are doing the same amount of work. Overriding the developer's settings in any way is cheating.

The situation is different in a game, and I applaud any IHV that can make a game run better without noticeable IQ degradation.
Well then maybe the author shouldn't call it an image quality investigation if he isn't going to include actual images. I also think that gaming benchmarks should be treated the same as games, since gaming performance is what they are trying to gauge. I guess at the end of the day when I am playing a game it isn't with mipmap levels on. I just want to see how it would affect what I am seeing.
 
Originally posted by obs
Well then maybe the author shouldn't call it an image quality investigation if he isn't going to include actual images.

Those aren't images? Amazing, and they showed up on my screen and everything! :D
I know what you actually mean, but my point (again) is that in a synthetic benchmark all cards have to do the same work or it is useless. Thus the ability to see the mipmap levels.

I also think that gaming benchmarks should be treated the same as games, since gaming performance is what they are trying to gauge.


Well, first, I really consider 3dmark2003 to be a synthetic benchmark, NOT a game benchmark. And frankly, looking at mipmap levels in a game is the same. It is just easier with 3dm2003 because of the ability to define the exact frame.

I guess at the end of the day when I am playing a game it isn't with mipmap levels on. I just want to see how it would affect what I am seeing.


You certainly are playing with mip levels on. Coloring them simply makes the levels more apparent. If this were a game, I would agree with you and would want to see the original images. As it is a benchmark, however, even slight deviations from the developer's requested levels are worth investigating. But doing the investigation with a preview engineering sample crosses the line. They should have waited for the retail card/driver.
 
Originally posted by LabRat
Those aren't images? Amazing, and they showed up on my screen and everything! :D
I know what you actually mean, but my point (again) is that in a synthetic benchmark all cards have to do the same work or it is useless. Thus the ability to see the mipmap levels.



Well, first, I really consider 3dmark2003 to be a synthetic benchmark, NOT a game benchmark. And frankly, looking at mipmap levels in a game is the same. It is just easier with 3dm2003 because of the ability to define the exact frame.



You certainly are playing with mip levels on. Coloring them simply makes the levels more apparent. If this were a game, I would agree with you and would want to see the original images. As it is a benchmark, however, even slight deviations from the developer's requested levels are worth investigating. But doing the investigation with a preview engineering sample crosses the line. They should have waited for the retail card/driver.

What I find ironic is that ATi's image isn't perfect either, so they must be doing some sort of optimization too. So why is it that Nvidia is cheating and not ATi? If it's a synthetic benchmark, both should be the same as the reference image, right? Neither is, so displaying mipmaps is absolutely pointless. All it tells us is that both cards show different mipmap levels than the reference image. Sure, ATi's is closer, but that type of comparison can only be made with actual screenshots. Since both are doing optimizations, let's compare screenshots.

It's funny how the author continued on with Max Payne and basically shot himself in the foot. Again, the mipmaps were different, but he even mentioned that there's no way in hell you can tell with the regular screenshots. So what exactly is it that the author is trying to tell us?

I believe this is what he's trying to say:

"Nvidia cheats all the time. I'm gonna search and search and search until I find some sort of proof, and then exploit it to the world!"

It's obvious that the tests were done to try and prove Nvidia wrong. You can take any company at all, and with enough testing you can dig up problems with it. That's not hard to do.

My question is this: how many real games did they have to test before they came across one (Max Payne) that showed any difference at all? I have a feeling they wasted a shitload of time.
 
#1. I don't think penis pumps work, they just make your dick swollen temporarily.

#2. They need to actually show some non-mipmap-colored IQ comparisons to show the difference. If the end product (the 3D image) is the same, it really doesn't matter how they get there.

Are they going to start complaining that occlusion culling is cheating too, if something that is not visible isn't rendered? Maybe tile-based rendering is cheating too? These are different methods of rendering which can save bandwidth, increase efficiency, etc.

In that past driver where Nvidia didn't render parts of the sky that weren't normally visible, but then in free camera mode you could see it wasn't rendered, that was cheating.

However, if in free camera mode it would render the sky when you looked at it, then it would be fine. (If it was a true form of not rendering unseen objects - like a tile-based renderer - sorta.)

#3. ATI's image is slightly different too. Yes, it's closer to the software-generated image, but all that means is that ATI is probably just using Microsoft's DirectX/driver code and may have tweaked it slightly less. Nonetheless it's still not a perfect match. Nvidia writes more of their own code from scratch and optimizes it more, so it probably does look slightly different, but I think the IQ is the same in the end product.

Anyway, I need to go buy a new penis pump, I outgrew mine.
 
Between this, the PS3.0-vs-PS1.4 bullshit, and the fact that PS3.0 has been pretty much proven useless to us right now, it's shaping up to be one hell of a release day for the X800! WooT!

I love competition. It means I get more for my money!
 
Originally posted by LabRat
Good point. I had forgotten about "brilinear". Is nVidia still calling that a bug? Or is it now a feature? ;)

Apparently it's a feature, since there's an option to adjust the settings.

Anyway, back to the cheating debate: if the differences are so slight that you need colorized mipmaps to detect them, are they really worth fussing about?
 
Originally posted by intercollector
What I find ironic is that ATi's image isn't perfect either, so they must be doing some sort of optimization too. So why is it that Nvidia is cheating and not ATi? If it's a synthetic benchmark, both should be the same as the reference image, right? Neither is, so displaying mipmaps is absolutely pointless. All it tells us is that both cards show different mipmap levels than the reference image. Sure, ATi's is closer, but that type of comparison can only be made with actual screenshots. Since both are doing optimizations, let's compare screenshots.

It's funny how the author continued on with Max Payne and basically shot himself in the foot. Again, the mipmaps were different, but he even mentioned that there's no way in hell you can tell with the regular screenshots. So what exactly is it that the author is trying to tell us?

I believe this is what he's trying to say:

"Nvidia cheats all the time. I'm gonna search and search and search until I find some sort of proof, and then exploit it to the world!"

It's obvious that the tests were done to try and prove Nvidia wrong. You can take any company at all, and with enough testing you can dig up problems with it. That's not hard to do.

My question is this: how many real games did they have to test before they came across one (Max Payne) that showed any difference at all? I have a feeling they wasted a shitload of time.

If you go to B3D, the author is actually discussing the matter in a little more detail. I warn you it's a big thread, but from what he mentioned, it looks like there's more to it and he is waiting for some replies from Nvidia, ATI and FM.
 
Nvidia Engineering vs. DriverHeaven interpretation

:p

I'll wait for mature drivers to fix any issues people may think
are bad.

I'll also wait for the 600mhz NV40 :cool:

Mature product = good.

Word on the forums is that ATI is still having driver issues
with current-generation product :p .. who knows what gremlins
will come out with the R420 ... :p
 
Waaaaa, more crying I see.

Their article reminds me of a user that buys a PC and then cries because the builder is only using 200FSB instead of 205FSB.

Then they admit they cannot tell a difference in actual gameplay and that it takes some colored testing tools to find what THEY call a problem and label it as cheating. LOL, more anti-Nvidia whining.

It's the same as the person that whined above because his benchmark program shows a tiny, minute difference in performance between 200FSB and 205FSB. However, company B uses 205FSB, so company A is cheating cause they only use 200FSB. Waaaaaaa

Nvidia hating people are out looking for anything they can cry foul about.

DriverHeaven interpreted this as a cheat automatically. If you cannot tell a difference in the gameplay and quality (which you CANNOT), then it's not cheating.

DriverHeaven seems to be out to cut down anything Nvidia does and has an agenda.

Like someone else said, it's DriverHeaven's interpretation. That doesn't make it fact or mean Nvidia is cheating. Sheesh, I wish these people would go find something else to cry about. It is quite frankly getting old.

For crying out loud, it's the FIRST revision of the drivers for the card.

Like others have pointed out, ATI's pics are not exactly right either, but you don't hear them whining about ATI cheating.

People have no life and all they got to do is sit around and try to find every single tiny imperfection in something so they can label Nvidia as cheating. Get a life already.

Last time I checked, you don't play games in colored mipmap mode. If it takes a colored mipmap mode to bring out any differences, but it looks perfectly fine in game and you cannot tell any difference between the 9800 and 6800 in-game, then everything else is irrelevant.

The image quality between actual gameplay screenshots (other than the Far Cry bugs) looked exactly alike, so who cares what it looks like in colored mipmap mode. I don't play games that way.

So what if they are doing things a little differently. It's not affecting actual gameplay image quality, so who cares.

The only ones that care are the ones that try everything in their power to try and label Nvidia as cheating and Nvidia bashers.
 
1.) Enough talk about PS3.0. No game or card can use PS3.0 till 2006 when DX9.0c is released, so good job to Nvidia for designing something we can't use yet; by the time we can, newer hardware will be available that can do it even faster.

2.) ATI cheats too. Simply look at the pictures: they don't match what they should be either, although you have to admit that the ATI cards look a lot closer to what it should be.

3.) Nvidia's cheats are so blatantly obvious there should be no one arguing that they aren't. Like, wtf is a big green block doing in Max Payne 2? That's a real-world game, not a joke of a benchmark.

4.) ATI is probably crying because they know they will get smoked this time around. The only place ATI will probably beat the GeForce 6800U is in the PCI Express variants, not the AGP video cards.

5.) DH should have gone further than just 3DMark2003; they should have used a bunch of real-world games like UT2004, Far Cry, and C&C Generals: Zero Hour. Now if something was found there like in Max Payne 2, Nvidia would have a lot of explaining to do.

6.) 60.72 IS BETA, not what will be released with the video card.

7.) And for all you nvidiots and ATI fanboys out there who think "wow, Unreal 3, Far Cry, Stalker say they support Pixel Shader 3.0" and are buying a card based on that fact: remember, DX9.0C IS NOT RELEASED TILL 2006, SO YOU WILL ONLY BE ABLE TO DO PS2.0 AT THE MAX!
 