timisoarakill
Originally posted by Tivon
Other than that this could just be normal optimizations.
We should wait for Nvidia to comment on these drivers.
?
I have one JUST like that one
Originally posted by timisoarakill
This is some awesome pump m8 , last revision :
Originally posted by Tivon
Hmm.. something is not right???
It looks like ATI is also doing something because their pictures are not 100% either. My best guess is that the software used for this test was not working right. What I'd call cheating is if the drivers detect 3dMark running and change things around in another way. Other than that this could just be normal optimizations. We should wait for Nvidia to comment on these drivers.
And also we should ask why FutureMark said these drivers from Nvidia are okay, if they are cheating?
Originally posted by EQDuffy
You REALLY need to read that article again.
Originally posted by bananaman
I remember from the vid that the Crytek guy said 2.0/3.0 together really fast, and the Nvidia guy made it seem more as if it was only 3.0 that could look like that. Maybe ATI is just trying to make that clear?
Originally posted by obs
And of course they don't have any non-mipmap colored screenshots.
Originally posted by davidj
It's a darn shame that the PeterPump ad above is based on fact...
Originally posted by davidj
I remember when Valve & ATI fanboys were shouting the requirements of PS2.0 & that nVidia was impotent in that dept. But now that nVidia is the PS3.0 king, all of a sudden PS3.0 is not important. What a coincidence.
Well then maybe the author shouldn't call it an image quality investigation if he isn't going to include actual images. I also think that gaming benchmarks should be treated the same as games, since gaming performance is what they are trying to gauge. I guess at the end of the day, when I am playing a game it isn't with colored mipmap levels on. I just want to see how it would affect what I am seeing.
Originally posted by LabRat
And why would they?
To be clear, I think it is way too early to cry "cheater!" because retail cards with retail drivers are not out. Give them a chance to get it right, by all accounts the nv40 is a big change and the drivers are unlikely to be perfect yet. After the crap that nVidia pulled with the nv3x line, I understand the compulsion to look for cheating, but again, too early for me to consider it fair.....
But back to mip-map levels. Whether differences in screenshots can be easily seen by eye doesn't matter in a benchmark, because the assumed benchmark baseline is that all tested cards are doing the same amount of work. Overriding the developer's settings in any way is cheating.
The situation is different in a game, and I applaud any IHV that can make a game run better without noticeable IQ degradation.
Originally posted by obs
Well then maybe the author shouldn't call it an image quality investigation then if he isn't going to include actual images.
I also think that gaming benchmarks should be treated the same as games since gaming performance is what they are trying to gauge.
I guess at the end of the day when I am playing a game it isn't with colored mipmap levels on. I just want to see how it would affect what I am seeing.
Originally posted by fugu
maybe they weren't running with trilinear optimizations off
http://hardocp.com/image.html?image=MTA4MTc0NzQ0ODZxTE1PbWV1dFNfM182X2wuanBn
Originally posted by LabRat
Those aren't images? Amazing, and they showed up on my screen and everything!
I know what you actually mean, but my point (again) is that in a synthetic benchmark all cards have to do the same work or it is useless. Thus the ability to see the mipmap levels.
Well, first, I really consider 3dmark2003 to be a synthetic benchmark, NOT a game benchmark. And frankly, looking at mipmap levels in a game is the same. It is just easier with 3dm2003 because of the ability to define the exact frame.
You certainly are playing with mip levels on. Coloring them simply makes the levels more apparent. If this were a game, I would agree with you and would want to see the original images. As it is a benchmark, however, even slight deviations from the developer's requested levels are worth investigating. But doing the investigation with a preview engineering sample crosses the line. They should have waited for the retail card/driver.
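For anyone unclear on what "coloring the mip levels" means here, this is roughly how those benchmark debug modes work: each level of a texture's mip chain is replaced with a flat, distinct color, so whichever level the hardware samples shows up directly in the rendered frame, and any driver that shifts LOD away from what the developer requested is immediately visible. A minimal sketch (my own illustration, not 3DMark03's actual implementation; the palette is made up):

```python
# Sketch of a "colored mipmap" debug mode: replace every level of the
# mip chain with a flat, distinct color so the driver's level selection
# becomes visible in the rendered image. Illustrative only -- not the
# actual code or colors any real benchmark uses.

def mip_level_count(width, height):
    """Number of mip levels for a texture, halving down to 1x1."""
    levels = 1
    while width > 1 or height > 1:
        width = max(1, width // 2)
        height = max(1, height // 2)
        levels += 1
    return levels

# Arbitrary fixed palette, one color per level (assumed for illustration).
PALETTE = [
    (255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0),
    (255, 0, 255), (0, 255, 255), (255, 128, 0), (128, 0, 255),
    (255, 255, 255),
]

def colored_mip_chain(width, height):
    """Return (w, h, color) per level: the flat-colored replacement chain."""
    chain = []
    for level in range(mip_level_count(width, height)):
        chain.append((width, height, PALETTE[level % len(PALETTE)]))
        width = max(1, width // 2)
        height = max(1, height // 2)
    return chain
```

The point of the technique is exactly what's argued above: the colors don't change which levels get sampled, they only make the selection visible, so two cards rendering the same frame with different color bands are provably doing different amounts of filtering work.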
Originally posted by LabRat
Good point. I had forgotten about "brilinear". Is nVidia still calling that a bug? Or is it now a feature?
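For context on the "brilinear" jab: full trilinear filtering blends the two nearest mip levels across the entire fractional LOD range, while the so-called brilinear shortcut samples only one level near each integer LOD and blends only inside a narrow band around the transition, saving texture bandwidth at some cost to the smoothness of the transition. A toy model of the difference (my own sketch with an assumed band width, not NVIDIA's actual driver logic):

```python
# Toy model of trilinear vs. "brilinear" filtering. Illustrative only:
# the 0.25 band width is an assumption, not a measured driver value.

def trilinear_weight(lod):
    """Full trilinear: blend weight toward the next mip level (0..1),
    linear across the whole fractional LOD range."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.25):
    """"Brilinear": sample a single level (pure bilinear) outside a
    narrow band around the transition; blend only inside it."""
    f = lod - int(lod)
    lo, hi = 0.5 - band, 0.5 + band
    if f <= lo:
        return 0.0                  # stick to the lower mip level
    if f >= hi:
        return 1.0                  # stick to the higher mip level
    return (f - lo) / (hi - lo)     # compressed blend band
```

This is why it shows up in colored-mipmap shots: the blend bands between colors get visibly narrower than a full trilinear reference, even when ordinary screenshots look identical.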
Originally posted by intercollector
What I find ironic is that ATi's image isn't perfect either, so they must be doing some sort of optimization too. So why is it that nvidia is cheating and not ATi? If it's a synthetic benchmark, both should be the same as the reference image, right? Neither are, so displaying mipmaps is absolutely pointless. All it tells us is that both cards show different mipmap levels than the reference image. Sure, ATi's is closer, but that type of comparison can only be made with actual screen shots. Since both are doing optimizations, let's compare screen shots.
It's funny how the author continued on with Max Payne and basically shot himself in the foot. Again, the mipmaps were different, but he even mentioned that there's no way in hell you can tell from the regular screen shots. So what exactly is the author trying to tell us?
I believe this is what he's trying to say:
"Nvidia cheats all the time. I'm gonna search and search and search until I find some sort of proof, and then exploit it to the world!"
It's obvious that the tests were taken to try and proove nvidia wrong. You can take any company at all, and with enough testing you can exploit problems with the company. That's not hard to do.
My question is this: how many real games did they have to test before they came across one (Max Payne) that showed any difference at all? I have a feeling they wasted a shitload of time.