Modred189
Can't Read the OP
- Joined
- May 24, 2006
- Messages
- 16,324
That's a bad idea, because while real-world gameplay testing cannot deal with all variables (it accepts those variables as PART of the test), having a CPU limit would confound your results.

That's fine and all. But the fact remains: most people are not going to run a midrange card in a HIGH END system. This card in a midrange system would give people the REAL WORLD performance. That's what is preached here, isn't it??
I am not starting an argument here; just test the card with the kind of system it's really going to end up in.
The CPU can control things like physics that also affect framerates. You want these things to be non-issues; otherwise they can cause false framerate dips.
Say, when a tank explodes in Crysis, you have two things going on:
A: The CPU is controlling the physics of the pieces
B: The GPU is controlling the display of the flames, lighting, shadows, etc.
If there is a CPU limit on the physics part, the framerates can be artificially lowered, making you think that the card cannot handle the lighting, HDR, shadows, etc. of the scene, thus lowering the perceived performance.
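To sketch why that happens (this is a made-up illustration, not real Crysis numbers; the `effective_fps` helper is hypothetical): in a pipelined game engine a frame can't finish faster than its slowest stage, so a CPU-bound physics step caps FPS no matter how fast the GPU renders the scene.

```python
def effective_fps(cpu_ms, gpu_ms):
    """Frame rate is limited by the slower of the CPU and GPU stages."""
    frame_ms = max(cpu_ms, gpu_ms)  # the pipeline stalls on the slowest stage
    return 1000.0 / frame_ms

# GPU alone could render the explosion in 10 ms (100 fps)...
print(effective_fps(cpu_ms=5.0, gpu_ms=10.0))   # 100.0 fps: GPU-bound, a fair test of the card
# ...but a slow CPU doing the physics drags the whole frame down:
print(effective_fps(cpu_ms=25.0, gpu_ms=10.0))  # 40.0 fps: looks like the GPU is struggling
```

In the second case a benchmark would report 40 fps and the GPU would get the blame, even though it was sitting idle waiting on the CPU.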
MAN, did people not read the whole real-world gameplay article at all?