Nvidia cheats on 3DMark with 177.39 drivers

Haha, so Charlie thinks installing a new PhysX driver equates to "doing a different workload"? It was funny at first, but a grown man being this much of an imbecile is starting to come off as a little pathetic.

If anyone is at fault, it's Futuremark. The physics test in Vantage isn't a CPU test; it's a PhysX test. All hardware capable of running the API is fair game: CPUs, PPUs, GPUs... the whole lot.
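To put the same point in code (a minimal hypothetical sketch of a hardware-agnostic physics API, not the actual PhysX SDK interface): the caller programs against an abstract backend, and whatever hardware implements it does the work.

```cpp
#include <cstddef>
#include <cstdio>
#include <memory>
#include <vector>

// Hypothetical sketch of a hardware-agnostic physics API -- NOT the real
// PhysX SDK interface. The benchmark is written against an abstract
// backend; CPU, PPU, or GPU implementations are all equally "legal".
struct PhysicsBackend {
    virtual ~PhysicsBackend() = default;
    virtual const char* name() const = 0;
    virtual void simulate(std::vector<float>& positions,
                          const std::vector<float>& velocities,
                          float dt) = 0;
};

struct CpuBackend : PhysicsBackend {
    const char* name() const override { return "CPU"; }
    void simulate(std::vector<float>& p, const std::vector<float>& v,
                  float dt) override {
        for (std::size_t i = 0; i < p.size(); ++i) p[i] += v[i] * dt;
    }
};

// A GpuBackend would implement the same interface. The test only sees
// "simulate() finished in N milliseconds", not which chip did the work.
int main() {
    std::unique_ptr<PhysicsBackend> dev = std::make_unique<CpuBackend>();
    std::vector<float> pos(1000, 0.0f), vel(1000, 1.0f);
    dev->simulate(pos, vel, 0.016f);
    std::printf("step ran on the %s backend\n", dev->name());
}
```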

I'm with him.

If a piece of hardware runs physics, then that's that. I don't care if it's my CPU, GPU, or LAN card; if it can do it, I don't consider it cheating. If my video card can do physics better than my CPU, I would want it doing the job instead.

But I agree with everyone else in this thread: 3DMark is shit.
 
Who cares? 3dMark is a fake benchmark anyway. The mere fact that you can 'cheat' at it is proof enough that its score holds no water versus actual performance.
Yet it's being run in almost every review... :rolleyes:
For some reason, there's a large group of people who simply refuse to accept the truth. Sorta like the people who support Al Gore.

3d mark = global warming!
 
I guess you've never visited xtremesystems. Or read the million threads here over the years comparing 3dmark scores. I guess they were all imaginary people then.

3dmark itself is flawed: they created individual tests that get summed into a final score. If they wanted to simulate real-world performance, they would have one combined CPU/GPU/physics/AI test using a real game engine.

It's in no way "unfair" to have the GPU run the PhysX test because they designed it that way in the first place.

What happens if the end user never plays the game used? How is that "real world"? You can't use a retail game engine for that reason alone, really. Also, how is reviewing at the highest possible playable resolution real world? Benchmarking at the highest resolution LCD monitors can render is not real world by any stretch! lol So bashing 3DMark by saying it isn't real world is just not real bright, if you get my drift =)
 
I think different game benchies used in conjunction with 3DMark scores can help a reader come to a solid conclusion about the general performance of a specific video card. 3DMark can be helpful if you don't play any of the games being used in reviews, as it gives some semblance of how a card should perform overall, and it can reveal whether there's something wrong with your card or a bottleneck in your system. For instance, if you get an 8800 GT and it scores only 2,000 3DMarks instead of somewhere around 10,000 (depending on your system), then you know there might be a problem. Overall though, I definitely wouldn't use 3DMark as the sole program for judging video cards.
 
What happends if the end user never playes the game used? How is that "real world"? You can't use a retail game engine for that reason alone really. Also how is reviewing in the highest possible playable resolution real world? Benchmarking in the highest resolution LCD monitors can render is not real word by any stretch! lol So bashing 3dmark by saying it isn't real world is just not real bright if you get my drift =)

Games are real; you play them. 3DMark you can't play. You can't do jack with it but run it and watch the purdy colors.

I think 3DMark is useful for testing your own system, to see how it performs across your own upgrades or overclocks.

Put it this way: if 3DMark were the be-all of what card to buy, EVERYONE would have owned a 2900 series card, since it creamed NVIDIA in 3DMark. But in real games it got its ass handed to it...
 
What happens if the end user never plays the game used? How is that "real world"? You can't use a retail game engine for that reason alone, really. Also, how is reviewing at the highest possible playable resolution real world? Benchmarking at the highest resolution LCD monitors can render is not real world by any stretch! lol So bashing 3DMark by saying it isn't real world is just not real bright, if you get my drift =)

Oh god, you're not still on about that tripe, are you, mal?

Are you drinking again?

I'm going to answer these in turn (no, I don't expect you to actually listen):

What happens if the end user never plays the game used? How is that "real world"?

The games chosen by [H] are likely the most graphically intensive games available at the time they run their benchmarks. While this doesn't cover all games and game engines, it does give you a pretty good idea of how other, less graphically intensive games will do at similar or possibly even higher settings than those used.

You can't use a retail game engine for that reason alone, really.

To base an actual hardware purchase on? Yes, you can. See the paragraph above; it applies here too.

Also, how is reviewing at the highest possible playable resolution real world?

Because it allows you to enjoy the game at the highest possible settings without being distracted by massive framerate dips. That is subjective in and of itself, but given the experience of the people behind the reviews, they are usually spot on with their suggested settings. You may notice that sometimes they will even drop resolution in order to enable other, higher graphical settings, just to keep the game looking good and enjoyable to play.

Benchmarking at the highest resolution LCD monitors can render is not real world by any stretch!

They don't always benchmark at the highest resolution; if you actually read one of the reviews, you might have noticed that.

So bashing 3DMark by saying it isn't real world is just not real bright, if you get my drift =)

OK, I tell you what: go out and buy a Radeon 2900 XT and watch your 3D Mark numbers climb higher than an 8800 GTX's. By your line of thought, 3D Mark results should be the be-all, end-all of how you purchase a video card. After you have the card, go play some games on it, maybe even the same games [H] uses in their reviews, and try running at the same resolution and settings Brent and Kyle use for the 8800 GTX. You may notice a little something: IT SUCKS, and it's not very enjoyable to play at those settings. Why? Because 3D Mark IS NOT AN ACTUAL GAME.

And nobody on the planet gets your drift; if they did, the world would shortly come to an unceremonious end, with the words "Doer of all things stupid" being the last thing anybody would hear.

:D

Going to add an edit here to stay on topic:

I think Futuremark needs to rethink their benchmark in the area of physics. Since Nvidia bought PhysX and Intel bought up Havok, a combined benchmark covering GPU and physics at the same time, as well as CPU and physics at the same time, should equalize the field a bit. With the current separation between the individual tests, it's easy to see how Nvidia can put all the processing power of their video card into just physics, with no actual graphics load on the GPU, while a CPU is stuck holding the bag for physics plus everything else that's running on it.
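Here's a toy model of that, with made-up frame times (illustrative C++, not anything from Futuremark): once rendering and physics share the same GPU within the same frame, physics offload stops being free points and becomes a budgeting decision.

```cpp
#include <cstdio>

// Hypothetical model of the combined test proposed above -- not Futuremark
// code, and the frame times are invented. In a combined test, GPU physics
// and GPU rendering draw from the same per-frame budget, so offloading
// physics to the GPU costs rendering time instead of being free points.
struct FrameCost {
    double render_ms;   // GPU time to draw the frame
    double physics_ms;  // GPU time to simulate physics, if run there
};

double combined_fps(FrameCost gpu, double cpu_physics_ms, bool physics_on_gpu) {
    // Physics on the GPU serializes with rendering on the same chip;
    // physics on the CPU can overlap with the GPU's rendering work.
    double gpu_ms = gpu.render_ms + (physics_on_gpu ? gpu.physics_ms : 0.0);
    double cpu_ms = physics_on_gpu ? 0.0 : cpu_physics_ms;
    double frame_ms = gpu_ms > cpu_ms ? gpu_ms : cpu_ms;  // bottleneck wins
    return 1000.0 / frame_ms;
}

int main() {
    FrameCost work{14.0, 4.0};
    std::printf("physics on CPU: %.1f fps\n", combined_fps(work, 9.0, false));
    std::printf("physics on GPU: %.1f fps\n", combined_fps(work, 9.0, true));
}
```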

As for 3D Mark itself, it is currently worthless as a basis for your video card purchase.
 
People really need to stop posting Inquirer links...
This Charlie character has proven that he knows very little about what he's talking about, and his smear campaign of late is already over the top.
 
What a load of bullshit.

Nvidia specifically bought the PhysX API to finally move physics calculations to the GPU, where we knew they would run faster thanks to the parallel processing available.
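For context on the "parallel processing" point, a minimal sketch (a made-up integrator, not PhysX code): each particle's update is independent of every other particle's, which is exactly the shape of work a GPU's thousands of threads are built for.

```cpp
#include <vector>

// Made-up minimal integrator, not PhysX code: each particle's update is
// independent of every other particle's, so the iterations of this loop
// can all run concurrently. On a GPU that means thousands of threads
// doing one particle each instead of a few CPU cores grinding serially.
struct Particle { float x, y, z, vx, vy, vz; };

void integrate(std::vector<Particle>& particles, float dt) {
    const float g = -9.81f;  // gravity
    for (Particle& p : particles) {  // every iteration is data-parallel
        p.vz += g * dt;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}
```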

The developers behind 3DMark are shittin' it because they've designed a benchmark where one team has developed a much better way of running these particular features, and all of a sudden it's not fair that Nvidia is getting better scores... Yet Nvidia spent all that money acquiring Ageia and further developing the API to run on their video cards for the benefit of the gamer, and they get ripped on because they're doing better in the benchmark.

This is just another example of why 3DMark is completely and utterly rubbish: they set up completely artificial benchmarks, and no one can display any finesse in how they accomplish the task. Benchmark A is GPU, benchmark B is CPU, and if you try to use the "wrong" bit of hardware for one of the tests, then it's disqualified, even if that's just a much more elegant way of accomplishing the task.

Nvidia didn't spend all that money for nothing. This sort of physics acceleration is going to show up in games eventually, and when it does, the 3DMark benchmarks are just going to be that much less accurate.

3DMark and The Inq can both go to hell.
 
What a load of bullshit.

Nvidia specifically bought the PhysX API to finally move physics calculations to the GPU, where we knew they would run faster thanks to the parallel processing available.

The developers behind 3DMark are shittin' it because they've designed a benchmark where one team has developed a much better way of running these particular features, and all of a sudden it's not fair that Nvidia is getting better scores... Yet Nvidia spent all that money acquiring Ageia and further developing the API to run on their video cards for the benefit of the gamer, and they get ripped on because they're doing better in the benchmark.

This is just another example of why 3DMark is completely and utterly rubbish: they set up completely artificial benchmarks, and no one can display any finesse in how they accomplish the task. Benchmark A is GPU, benchmark B is CPU, and if you try to use the "wrong" bit of hardware for one of the tests, then it's disqualified, even if that's just a much more elegant way of accomplishing the task.

Nvidia didn't spend all that money for nothing. This sort of physics acceleration is going to show up in games eventually, and when it does, the 3DMark benchmarks are just going to be that much less accurate.

3DMark and The Inq can both go to hell.

I agree with you there.

What if:
I run a dual PCI-Express motherboard and (say) buy an ATI 4-series card to play my games? I think it would be great if Nvidia allowed users to install the PhysX drivers just to recognize a PhysX-enabled card (and that includes GeForces) installed in the second graphics slot. I think this would be great for consumers with current 8- and 9-series graphics cards (though probably not in the minds of Nvidia as far as money goes... :rolleyes:)
 
3DMark is a POS program that basically reflects little to none of what we all experience in games. ATI has been cheating at 3DMark for years now, so I couldn't care less if the Inq is pointing fingers at either side. Futuremark is a totally sold-out company that will basically tailor their software to suit the needs of the biggest donor; if you can't see that, then you're a fool. It's been proven time and time again (on this site, no less) that 3DMark stands for absolute shit in real-world performance.


People who delude themselves into believing that program means anything other than eye candy for acid freaks need to be slapped around for a few hours.
 
3Dmark has always seemed biased. What I hated about it: it wasn't about how good your computer was, it was about the little tweaks you made, which in reality made no real-world difference. Stuff like increasing your FSB or adding a PhysX card so you could get the physics points, etc... that made it lame.

Now it's all about who pays 3Dmark more.
 
This isn't the first time Nvidia has cheated on 3DMark. Remember waaay back with the FX series, the last time they put out a flunk product? They just realized they lost big time again to ATI, and they're desperate again. Maybe this will contribute to another big jump, like the GeForce FX to GeForce 6 transition?
 
Haha, so Charlie thinks installing a new PhysX driver equates to "doing a different workload"? It was funny at first, but a grown man being this much of an imbecile is starting to come off as a little pathetic.

If anyone is at fault, it's Futuremark. The physics test in Vantage isn't a CPU test; it's a PhysX test. All hardware capable of running the API is fair game: CPUs, PPUs, GPUs... the whole lot.

Missed this the first time, about as well put as it gets.
 
Piss on Nvidia and their development of a proprietary API. Anything proprietary screws us for their own profit.
 
Piss on Nvidia and their development of a proprietary API. Anything proprietary screws us for their own profit.
Huh? Almost all gaming middleware used in commercial games is "proprietary." Even Havok. :rolleyes:

AMD is the one that rejected Nvidia's (free) offer of GPU PhysX support, so if you want to get mad at anyone for the lack of GPU PhysX, blame AMD.
 
Piss on Nvidia and their development of a proprietary API. Anything proprietary screws us for their own profit.

This is beyond the scope of my initial argument.

First off, they didn't make it; Ageia did, and Nvidia simply bought it. It was built from the ground up specifically to support physics processing on any piece of hardware, be it the CPU or PPU, and now the GPU.

Someone had to make a move before physics could be calculated off the CPU. Ageia tried with the PPU, and that was a failure because you needed a separate piece of hardware, which not all gamers had. So you had a chicken-and-egg scenario where you needed games to justify the hardware and hardware (or uses of it) to justify the support in games; that's a tough one to get off the ground.

Everyone into games has a GPU. If AMD has any sense, they will make their cards capable of the same physics processing with the PhysX API; then gamers can play with any modern video card and reap the benefits.
 
Yet Nvidia spent all that money acquiring Ageia and further developing the API to run on their video cards for the benefit of the gamer, and they get ripped on because they're doing better in the benchmark.

For the benefit of the gamer? LOL
 
This is beyond the scope of my initial argument.

First off, they didn't make it; Ageia did, and Nvidia simply bought it. It was built from the ground up specifically to support physics processing on any piece of hardware, be it the CPU or PPU, and now the GPU.

Someone had to make a move before physics could be calculated off the CPU. Ageia tried with the PPU, and that was a failure because you needed a separate piece of hardware, which not all gamers had. So you had a chicken-and-egg scenario where you needed games to justify the hardware and hardware (or uses of it) to justify the support in games; that's a tough one to get off the ground.

Everyone into games has a GPU. If AMD has any sense, they will make their cards capable of the same physics processing with the PhysX API; then gamers can play with any modern video card and reap the benefits.

To add a little to this... When Ageia released the PhysX PPU, there were almost no games that supported it, and the ones that did only used it for effects physics, such as GRAW. The lack of support from game developers is what led to Ageia's demise. Great idea, poorly executed. I'm actually glad nVidia purchased them, because I would like to see more support for PhysX in games... no matter what camp it comes from.
 
Yeah, it benefits us: we get a much better way of processing physics that's faster and frees up CPU time for things like better AI, etc.

Has it cost us anything extra? No; the price of Nvidia hardware is approximately the same as before.
 
Like I said before, judging cards with 3DMark is like a woman using a ruler measurement to decide if a man is good in bed.

The only way to determine that is real world gameplay.

PS: Rulers are good at measuring length, no argument there. But that says nothing about how "something" or someone will perform in real gameplay.
 
I guess I've been under a rock when it comes to 3DMARK. I didn't realize it was considered so useless.

So, what then is considered a GOOD benchmark utility for evaluating PC performance?
 
I'm not defending NVIDIA's actions here (if NVIDIA did in fact try to cheat the software), but 3D Mark doesn't really generate results that correlate to game performance, or anything else for that matter. ATI's cards score really well against NVIDIA's even though (as of now) most ATI cards are slower than their NVIDIA counterparts. It doesn't surprise me that NVIDIA would try to cheat the program to get better results. NVIDIA knows that many people put a lot of stock in 3D Mark scores, and they want to show their cards in a better light.

Personally I don't care. This isn't the first time ATI or NVIDIA has been caught with their hands in the cookie jar. It isn't likely to be the last time either.
 
So, what then is considered a GOOD benchmark utility for evaluating PC performance?
It depends on what you want to benchmark. I don't think there is any single benchmarking utility that gives you a good overall impression. That's why reviews usually run many benchmarks, especially when comparing different CPU or GPU manufacturers. One may dominate the other in everything, but that's not usually the case. You can find strengths and weaknesses in each product. You need to weigh which strengths apply to your usage to find the product that's best for you.
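A toy example of that weighing (all numbers invented for illustration): normalize each product's results, weight the benchmarks by how much they match your usage, and compare.

```cpp
#include <cstdio>

// Toy version of "weigh which strengths apply to your usage". The three
// benchmark categories (say: games, video encode, compile times), the
// normalized results, and the weights are all invented for illustration.
int main() {
    double weight[]   = {0.7, 0.2, 0.1};  // how much you care about each
    double productA[] = {0.9, 0.6, 0.8};  // normalized benchmark results
    double productB[] = {0.7, 0.9, 0.9};
    double a = 0.0, b = 0.0;
    for (int i = 0; i < 3; ++i) {
        a += weight[i] * productA[i];
        b += weight[i] * productB[i];
    }
    std::printf("A = %.2f, B = %.2f -> buy %s\n", a, b, a > b ? "A" : "B");
}
```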
 
Don't tell me you'd actually back Charlie up, Dan. Nvidia hasn't been caught doing anything.

3DMark Vantage supports PhysX.
So do a lot of other things.
Nvidia released a driver that supports GPU PhysX. ATI could also do so if they wished. Nvidia has made it an open platform.

Is using a PhysX card cheating as well?
 
I guess I've been under a rock when it comes to 3DMARK. I didn't realize it was considered so useless.

So, what then is considered a GOOD benchmark utility for evaluating PC performance?

3D Mark is pretty useless these days. Its scores don't correspond or translate to anything relating to PC gaming. You can't say "I got an extra 400 points in 3D Mark 2006, which means my gaming experience will improve by <Insert numbers here>." The slower ATI cards actually score better than the faster NVIDIA cards do. The program is also influenced too much by CPU clock speeds and CPU types: a 600MHz overclock can make a massive change in your 3D Mark scores and do almost nothing for actual game performance. I used to consider it a good tool for tuning your own system, until I realized the results you get don't mean anything. They are arbitrary. 22,000 points in 3D Mark doesn't mean 60 FPS in Crysis at 2560x1600. It doesn't mean anything.

As far as good benchmark tools for evaluating PC gaming performance go, there isn't one. Like all synthetic tests, PCMark, Sandra, and the others don't have anything to do with application performance. Sandra shows AMD processors having a ton more memory bandwidth than Core 2 Duo processors; while this is true, that alone doesn't translate into better performance. Real-world application testing is the only way to produce meaningful results.
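If you wanted to test the "arbitrary" claim rather than argue about it, this is the shape of the check (the card scores and framerates below are invented, not real review data): a score that genuinely tracked game performance would show a correlation near 1 across a set of cards.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// How you'd test whether a synthetic score tracks game performance:
// Pearson correlation between 3DMark scores and measured FPS across a
// set of cards. The numbers below are invented, not real review data.
double pearson(const std::vector<double>& x, const std::vector<double>& y) {
    double n = static_cast<double>(x.size());
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        sx += x[i]; sy += y[i];
        sxx += x[i] * x[i]; syy += y[i] * y[i]; sxy += x[i] * y[i];
    }
    return (n * sxy - sx * sy) /
           std::sqrt((n * sxx - sx * sx) * (n * syy - sy * sy));
}

int main() {
    std::vector<double> marks = {10200, 11800, 13500, 12900};  // hypothetical
    std::vector<double> fps   = {42.0, 61.0, 55.0, 74.0};      // hypothetical
    std::printf("r = %.2f\n", pearson(marks, fps));
}
```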
 
Don't tell me you'd actually back Charlie up, Dan. Nvidia hasn't been caught doing anything.

3DMark Vantage supports PhysX.
So do a lot of other things.
Nvidia released a driver that supports GPU PhysX. ATI could also do so if they wished. Nvidia has made it an open platform.

Is using a PhysX card cheating as well?

I don't know if they did or not. I simply don't care if NVIDIA did cheat in 3D Mark. 3D Mark is fairly useless, so it wouldn't make any difference as far as I'm concerned. I can understand why they would attempt to cheat in the program, and they've done it in the past.

EDIT: I see, this is about PhysX support and a corresponding increase in 3D Mark scores because it detects PhysX support. Well, I'm all for it. If that's the way 3D Mark works, then that's how it works. I guess it would have helped if I had read the whole thread before posting. :D
 
3D Mark is pretty useless these days. Its scores don't correspond or translate to anything relating to PC gaming. You can't say "I got an extra 400 points in 3D Mark 2006, which means my gaming experience will improve by <Insert numbers here>." The slower ATI cards actually score better than the faster NVIDIA cards do. The program is also influenced too much by CPU clock speeds and CPU types: a 600MHz overclock can make a massive change in your 3D Mark scores and do almost nothing for actual game performance. I used to consider it a good tool for tuning your own system, until I realized the results you get don't mean anything. They are arbitrary. 22,000 points in 3D Mark doesn't mean 60 FPS in Crysis at 2560x1600. It doesn't mean anything.

As far as good benchmark tools for evaluating PC gaming performance go, there isn't one. Like all synthetic tests, PCMark, Sandra, and the others don't have anything to do with application performance. Sandra shows AMD processors having a ton more memory bandwidth than Core 2 Duo processors; while this is true, that alone doesn't translate into better performance. Real-world application testing is the only way to produce meaningful results.

I don't know if they did or not. I simply don't care if NVIDIA did cheat in 3D Mark. 3D Mark is fairly useless, so it wouldn't make any difference as far as I'm concerned. I can understand why they would attempt to cheat in the program, and they've done it in the past.

EDIT: I see, this is about PhysX support and a corresponding increase in 3D Mark scores because it detects PhysX support. Well, I'm all for it. If that's the way 3D Mark works, then that's how it works. I guess it would have helped if I had read the whole thread before posting. :D

Yeah, people should realize ATI users could boost their scores in 3DMark with a PPU as well if they wanted to...
 
Yeah, it benefits us: we get a much better way of processing physics that's faster and frees up CPU time for things like better AI, etc.

Has it cost us anything extra? No; the price of Nvidia hardware is approximately the same as before.

Yeah, don't buy just one of our cards, don't buy two of our cards (SLI), buy THREE of our cards (SLI + PhysX). We don't want those extra CPU cores to break a sweat or anything, do we? Oh, and your older card is not supported, sorry. Get the latest model. And while you're at it, get the latest mainboard with our chipset, of course. And we do it all just for you, the gamer.

It doesn't cost us anything extra? That's even funnier than "for the benefit of the gamer". Give me a break.
 
3D Mark is pretty useless these days. Its scores don't correspond or translate to anything relating to PC gaming.

QFT, except without the "these days". I stand by my decision in 2001 that 3Dmark wasn't worth the download for a hardware comparison. I still installed the various versions, but just for the demos... that's about all they were good for.

Heh, my new CPU score in Vantage is 30,963, with my GPU only scoring 4,829 (P6120 overall). I think the gist of all this is that Futuremark WAAAAY overvalued the effect of physics calculations on gameplay... In games, CPUs are mostly used to feed data to the video cards, not for intensive calculations like physics.
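For what it's worth, those sub-scores are consistent with the overall P score being a weighted harmonic mean with the GPU at 0.75 and the CPU at 0.25 (my reading of Futuremark's scoring whitepaper, so treat the weights as an assumption). That weighting is also why a PhysX-inflated CPU score can only drag the total up so far:

```cpp
#include <cstdio>

// The Vantage overall P score as a weighted harmonic mean of the GPU and
// CPU sub-scores, with weights 0.75 / 0.25 -- my reading of Futuremark's
// scoring docs, checked here against the poster's reported sub-scores.
int main() {
    double gpu = 4829.0, cpu = 30963.0;
    double overall = 1.0 / (0.75 / gpu + 0.25 / cpu);
    std::printf("P%.0f\n", overall);  // prints P6120, matching the post
}
```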
 
It works with a single card, and those cards (G80 series and newer) have been available for over 18 months.

It theoretically works with one card. I would wait for some games that actually use these physics features to a useful degree (and I don't mean GRAW's level of usefulness) before making such bold statements. But I doubt we'll see a major shift in this area before there's some kind of unified physics API.
 
It theoretically works with one card.
It's already been tested with one card in UT3, with massive speedups (~3x) on the PhysX levels. Geez, just read the articles already.
 