[H]ard|OCP and Video Card Testing Methods

Major_A

I do like how [H]ard tests video cards: they take the card and run it at the highest playable settings. I assume this is so the people who read the article can get an idea of how it will perform in their systems. I do have one complaint, though: not very many people have setups like the test systems in the articles. In the X800 XL roundup [H]ard used a P4 3.4EE and an Athlon FX-53. I have not seen anyone here at the [H]ard|Forums with a P4 3.4EE in their sig.
Half Life 2 review -> FX-53
6600 GT review -> FX-53
FarCry Patch 1.3 & SM 3.0 -> P4 3.4EE
6800/Ultra Roundup -> FX-53
Okay, you get my point. While these processors would no doubt show the best gaming experience, I would venture to guess that the average [H]'er is probably using something in the neighborhood of 3GHz or a 3000+.
 
I completely agree... I'm only using a 3.06GHz P4, and have nowhere near the amount of money to buy a P4 EE or FX-53.
I think [H] should bring their test setups down a little bit so they can compare to those of us who are poor. :(
 
The main problem with their articles is that not everyone likes to play with the highest possible IQ. Sometimes people don't even have monitors that support those resolutions. In certain games, especially competitive ones where I have custom configurations, I turn the detail and resolution down. When I want to see a review of a video card, I like to see raw performance numbers, none of this "different image qualities, same performance" nonsense. I won't read them until they give equal apples-to-apples coverage, or switch it around and make the IQ tests a small part of the review.
 
I'm going to have to agree. According to Steam, the average gamer is sporting a 2GHz CPU with 512 megs of RAM. I have an Athlon XP @ 2.5GHz and a gig of RAM. Still much less than what's used in the tests... pretty average IMHO.
 
You'd think they might be using the higher-end processor so as not to create a bottleneck and limit the capabilities of the card(s). I would think that would be the reason they use the higher-end procs. Maybe they could throw in some benches for typical machines.

Just my 2 cents.
 
KevinO said:
You'd think they might be using the higher-end processor so as not to create a bottleneck and limit the capabilities of the card(s). I would think that would be the reason they use the higher-end procs. Maybe they could throw in some benches for typical machines.

Just my 2 cents.

I was going to say that... post benchmarks using pimp rigs, and then benchmark using 'poor man's' rigs.
 
Major_A said:
I do like how [H]ard tests video cards: they take the card and run it at the highest playable settings. I assume this is so the people who read the article can get an idea of how it will perform in their systems. I do have one complaint, though: not very many people have setups like the test systems in the articles. In the X800 XL roundup [H]ard used a P4 3.4EE and an Athlon FX-53. I have not seen anyone here at the [H]ard|Forums with a P4 3.4EE in their sig.
Half Life 2 review -> FX-53
6600 GT review -> FX-53
FarCry Patch 1.3 & SM 3.0 -> P4 3.4EE
6800/Ultra Roundup -> FX-53
Okay, you get my point. While these processors would no doubt show the best gaming experience, I would venture to guess that the average [H]'er is probably using something in the neighborhood of 3GHz or a 3000+.
Wow, I was actually thinking about that today. It's funny that someone posted it the exact day I was reading the same review and thinking the same thing. Anyway, yeah, I would like to see how it performs with an AMD 3000+.
 
KevinO said:
You'd think they might be using the higher-end processor so as not to create a bottleneck and limit the capabilities of the card(s). I would think that would be the reason they use the higher-end procs. Maybe they could throw in some benches for typical machines.

Just my 2 cents.

The important thing to recognize when benching current-gen video cards is that the CPU can become the bottleneck in a 'poor man's rig'. In order to really compare what the cards are capable of, you need to use the best processor available to eliminate that possibility. At least then you can compare the actual performance of the card.

If you were to see a video card benchmark that's CPU-limited, the FPS would be exactly the same among different cards. That isn't informative as a GPU performance measure. But at the same time, seeing benchies on regular rigs would be informative for the majority of gamers. I still prefer the benchies the way they are.
 
It would be cool if they could throw in a few benches on, let's say, a Barton 2600+ or P4 2.4C.
 
Dark_Orison said:
It would be cool if they could throw in a few benches on, let's say, a Barton 2600+ or P4 2.4C.
Yeah. Most people have a Barton or a 3000+; not everyone can afford an FX-55. I understand they want to cancel out any potential bottlenecks, but I would like to see how a card would perform in my system. Having said that, I think the difference between an AMD 3000+ and an FX-55 is no more than about 10FPS. Correct me if I am horribly wrong. :D
 
It isn't just [H] that uses a top-of-the-line CPU for their video reviews. Most hardware sites do this to eliminate or greatly reduce whatever system bottleneck there may be. Also remember that the reviews are showing you what is possible; it is still up to you as a user to extrapolate the performance you may get with any card over another.

While I agree it would be interesting to see how some video cards would perform on an average system, I think it would be cost-prohibitive for many sites to test in this manner in addition to using top-end rigs to show true video performance.
 
I understand that they are trying to take the CPU out of the equation but when I read this...
Game Evaluation Setup

Please be aware we test our video cards a bit differently from what is the norm. We concentrate on examining the real-world gameplay that each video card provides. Gameplay includes performance and image quality evaluation.
I don't feel like it really represents the "typical" gamer. Aren't they trying to show what a gamer should expect with these cards? Seems to me that you would want to use an above-average processor to give readers a realistic idea.
 
I don't really think the CPU matters, because it's not as if [H] is trying to create a database of video cards and corresponding resolution/IQ settings. If, in a pimp rig, the newest ATI card is faster than the newest NVIDIA card in most games, it will be the same in your box. Granted, the performance will be worse, but the difference between the two cards will not change. Matter of fact, if the CPU were scaled down to more average levels, chances are it would become the bottleneck and new high-end cards would all have the same or very similar frame rates. That would create a misleading article, if not a useless one.

Also, if you look at systems that contain the latest and greatest, they do often have at least a 3500+, and many have an FX chip.
 
Well, at high resolutions an Athlon 64 3000+ is only behind an FX-55 by a few frames.
 
I would think that by using the fastest CPU you are truly testing the video card. When using a slower CPU you run the chance of the CPU being a limiting factor in the benchmarks.
For example, say you ran a series of benches using a 1GHz CPU with a 9200, 9600, 9800 Pro, X800 XT, 6800 GT, and 6800 Ultra. At some point the CPU won't be able to push enough polys to the video card and the frames per second will top out. This will show up as the top-performing video cards all running at about the same speed. However, if you use a 3.4EE, the CPU will be able to push as many polygons as the video card can render, eliminating the CPU from the equation.
I do agree that video card reviews should include benchmarks from computers of various speeds so you will better know how a card will affect your system, but for all-out performance, you need to compare a video card's true power on the fastest CPU possible.
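To make that concrete, here is a rough back-of-the-envelope sketch in Python (not [H]'s methodology, just an illustration with made-up numbers): treat the frame rate you actually see as the lower of what the CPU can feed the card and what the card itself can render. With the slow CPU, everything from the 9600 up reads the same number, which is exactly the "topping out" described above; with the fast CPU the cards separate.

Code:
# Hypothetical per-component limits: FPS each part could sustain on its own.
cpu_limit = {"1GHz CPU": 45, "P4 3.4EE": 160}

gpu_limit = {
    "Radeon 9200":     30,
    "Radeon 9600":     55,
    "Radeon 9800 Pro": 90,
    "X800 XT":        140,
    "6800 GT":        125,
    "6800 Ultra":     150,
}

for cpu, cpu_fps in cpu_limit.items():
    print(cpu)
    for card, gpu_fps in gpu_limit.items():
        # The FPS you observe is capped by whichever part is slower.
        print(f"  {card:15s} -> {min(cpu_fps, gpu_fps)} FPS")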
 
Major_A said:
I do like how [H]ard tests video cards: they take the card and run it at the highest playable settings. I assume this is so the people who read the article can get an idea of how it will perform in their systems. I do have one complaint, though: not very many people have setups like the test systems in the articles. In the X800 XL roundup [H]ard used a P4 3.4EE and an Athlon FX-53. I have not seen anyone here at the [H]ard|Forums with a P4 3.4EE in their sig.
Half Life 2 review -> FX-53
6600 GT review -> FX-53
FarCry Patch 1.3 & SM 3.0 -> P4 3.4EE
6800/Ultra Roundup -> FX-53
Okay, you get my point. While these processors would no doubt show the best gaming experience, I would venture to guess that the average [H]'er is probably using something in the neighborhood of 3GHz or a 3000+.

We use fast CPUs to get rid of CPU dependency. Still, your cries have been heard: for mainstream cards we are now using an Athlon 64 3500+, and for very high-end cards an FX-55.

This is the first published review where we used the A64 3500+: http://www.hardocp.com/article.html?art=NzA2 There are more reviews yet to be published where we use that CPU as well. Basically, we will use that CPU for all mainstream cards.

When we review a video card we really don't want it to be CPU-limited. Let's take an extreme example and say a game is severely limited by the CPU; then all the cards we are comparing would achieve the same highest playable settings, because it would be the CPU holding them back. We really don't want that to happen; only with a fast CPU can we see which card offers a better gaming experience. Especially as you move into the really fast cards like the 6800 Ultra and X800/X850 XT PE, you need a fast CPU so we can concentrate on video card gaming performance and not worry about the CPU so much.

Our gameplay evaluation is twofold:

A.) Yes, it allows people with a similar system to see what levels the cards are playable at; that is one goal.

and

B.) This is the one most people overlook. We include cards for comparison with the card we are reviewing. This lets people look and see which cards provided the highest levels of resolution, AA and AF, and the framerates they delivered. So even if you don't have the same system setup as us, you can STILL see which card allows a higher level of gaming experience, image quality and consistent framerates, and make an informed buying decision.

So you don't HAVE to have the same system setup as us to look at the results and figure out which card would give you a better gaming experience.
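For what it's worth, the comparison described in B.) boils down to something like this little Python sketch (hypothetical numbers and an arbitrary "playable" cutoff, not [H]'s actual data): for each card, find the most demanding settings it can hold above the cutoff, then compare those settings rather than raw FPS at one fixed setting.

Code:
PLAYABLE_MIN_FPS = 30  # arbitrary cutoff for this illustration

# (width, height, AA, AF) -> minimum FPS observed; numbers are made up.
card_a = {
    (1280, 1024, 0, 0): 70, (1280, 1024, 4, 8): 48,
    (1600, 1200, 0, 0): 45, (1600, 1200, 4, 8): 26,
}
card_b = {
    (1280, 1024, 0, 0): 75, (1280, 1024, 4, 8): 55,
    (1600, 1200, 0, 0): 52, (1600, 1200, 4, 8): 34,
}

def highest_playable(results):
    """Return the most demanding settings the card keeps above the cutoff."""
    playable = [s for s, fps in results.items() if fps >= PLAYABLE_MIN_FPS]
    # Rank by pixel count first, then AA, then AF.
    return max(playable, key=lambda s: (s[0] * s[1], s[2], s[3]))

for name, results in (("Card A", card_a), ("Card B", card_b)):
    w, h, aa, af = highest_playable(results)
    print(f"{name}: {w}x{h} {aa}xAA/{af}xAF, {results[(w, h, aa, af)]} FPS minimum")

With these invented numbers Card B comes out ahead because it holds 1600x1200 with 4xAA/8xAF above the cutoff while Card A has to drop the AA/AF, and that kind of conclusion doesn't depend on your own rig matching the test system.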
 
V0ltage said:
The main problem with their articles is that not everyone likes to play with the highest possible IQ. Sometimes people don't even have monitors that support those resolutions. In certain games, especially competitive ones where I have custom configurations, I turn the detail and resolution down. When I want to see a review of a video card, I like to see raw performance numbers, none of this "different image qualities, same performance" nonsense. I won't read them until they give equal apples-to-apples coverage, or switch it around and make the IQ tests a small part of the review.

You can still see which card is allowing a better gaming experience, just by seeing which one allowed us to play at higher levels.

And you can look at the framerates and see which one was more consistent.

For example, in a game you may have one card that plays with a consistent, smooth framerate, and another card that, while it does allow a high quality setting, spikes all over the place.

Information like that is very important for finding out which cards allow a better gaming experience.

You can look at our results and easily tell which cards are "better".
 
I agree. The point of a video card review is to review the video card and not the system. If you have the CPU limiting the GPU, then you don't know the true performance of the video card. Like Brent said, you can see which one holds constant, etc., and which one struggles with AA or AF or both. If you use a low-end CPU, it could be the CPU that is causing the card to lag with AA or AF and not the GPU itself.

Don't get me wrong, I would like a couple of charts thrown in there for the real-life factor, but I think the review should stay focused on the video card itself.
 
I don't think it would take that long (in fact, after building many PCs, I know it wouldn't) to throw in a few benches with a mid-range machine. A Barton of any speed would be perfect.

A 3500+ isn't low-end by any means; hell, it's not even mid-range.

Edit: They could also use more settings and resolutions.
 
For a high-end card you need a CPU even faster than what they were using (i.e., an OC'd FX-55 or an OC'd 3.46EE) to show the card's full potential.

This generation of high-end cards puts out the same FPS from resolution to resolution until you get to 16x12 with high levels of AA/AF... then the FPS finally starts dropping below the CPU bottleneck (for most games, not all).
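Here is a rough Python illustration of that flat-then-dropping curve (invented numbers): assume the card's own limit scales inversely with pixel count and shrinks further under heavy AA/AF, while the CPU cap stays fixed. The observed FPS sits at the CPU cap until 16x12 with AA/AF, where the GPU limit finally dips below it.

Code:
CPU_CAP = 120       # frames the CPU can set up per second, independent of resolution
GPU_BUDGET = 260.0  # arbitrary throughput constant for a hypothetical high-end card

# (label, megapixels per frame, AA/AF cost factor) -- invented numbers
settings = [
    ("1024x768, no AA",        0.79, 1.0),
    ("1280x1024, no AA",       1.31, 1.0),
    ("1600x1200, no AA",       1.92, 1.0),
    ("1600x1200, 4xAA/16xAF",  1.92, 1.6),
]

for label, mpix, cost in settings:
    gpu_fps = GPU_BUDGET / (mpix * cost)  # GPU limit falls as the workload grows
    print(f"{label:24s} {min(CPU_CAP, gpu_fps):6.1f} FPS")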
 
It has been said and I agree.

They use high-end CPUs to avoid CPU bottlenecking. There is no other way to go.
 
One thing worth mentioning here... take mid-range or low-end cards and run them on high-end systems, then run them on low-end ones.
Between the GeForce4 MX 440 and the Radeon 9000, the victor is reversed.
The 9000 has better fillrate, but the MX 440 handles polygons better; on a low-end system the MX 440 would perform better, as the CPU wouldn't have enough power to push the 9000 hard enough for it to take advantage of its fillrate.

One thing I have noticed over the past few years: a value card in a value system performs like ass, a value card in a decent system is OK, a good card in a value system is better than both, and a good card in a good system is the obvious winner, but sometimes that is not an option.
 
Then again, I have an Opteron 150 (which, AFAIK, is basically the same thing).

I'd like to see them remove any possibility of main memory bandwidth (or the lack of it) getting in the way, or disk I/O.

Slap a dual Opteron 150 setup in there with a board that supports NUMA, and/or 15K RPM SCSI disks.

I think the NUMA is more important, though.

Rob
 