Keep hearing blah blah CPU will bottleneck blah blah Video Card

dragontales

For instance, of course a P4 1.6 GHz CPU would choke a 7800 GTX. I'm guessing a top-of-the-line AMD FX CPU would go great with a 7800 GT or 7800 GTX. But does anyone know which CPU would be ideal with which video card?

Let's make a nice list:

6600gt is ideal with (enter cpu here)
6800gt is ideal with (enter cpu here)
7800gt is ideal with (enter cpu here)
7800gtx is ideal with (enter cpu here)

x800 is ideal with (enter cpu here)
x800xl is ideal with (enter cpu here)
x800xt is ideal with (enter cpu here)
x850xt is ideal with (enter cpu here)
x1800xl is ideal with (enter cpu here)

To avoid "well it depends on the rest of the system" comments, let's just pretend that the rest of the system is up to par and have enough juice to power the latest and greatest games.
 
CPUs are cheap enough now that you have no excuse not to have a nice one.

Even a 3700+ San Diego is $200 now. With a stock cooler you could easily OC that to 2.5 GHz+. Any A64 at 2.2 GHz or better will be good enough to make you GPU-limited at 1600x1200 with AA, even at 1280x1024 with AA in most cases.

To clarify, that nice table that you whipped up is useless. The situation we have today is much simpler than the one we had in 1999.

A64 CPU (any S939, preferably 2.2 GHz+) + the fastest GPU you can afford.
 
For the 7800 GTX, I did some testing to find where CPU increases stop improving the score. In essence, where the CPU (an AMD, anyway) gets out of the way.

7800 GTX at (core/memory clocks, CPU speed, 3DMark05 score):
490/1300 - 2.50 GHz (3DMark05: 8948)
500/1350 - 2.60 GHz (3DMark05: 9147)
520/1420 - 2.75 GHz (3DMark05: 9430)
540/1420 - 3.00 GHz (3DMark05: 9743)

So, if you run the GTX stock, you will need less CPU. But if you want the best match (no OC on either), an FX-55 or X2 4400+ will do. If you OC the video card, an FX-57 or X2 4800+ is a good match. Otherwise, you will need a single-core CPU that OCs to at least 2.5 GHz for the stock GTX and around 3 GHz for an OC'd card. Take 500 MHz off those requirements for a dual-core AMD.
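If it helps to see that rule of thumb in one place, here's a tiny Python sketch of it. It just restates the numbers above (2.5 GHz for a stock GTX, 3.0 GHz for an OC'd one, minus about 500 MHz for a dual-core AMD); it is not a measured formula.

Code:
# Rough restatement of the rule of thumb above, nothing more.
def min_cpu_ghz(gpu_overclocked: bool, dual_core: bool) -> float:
    base = 3.0 if gpu_overclocked else 2.5    # OC'd GTX vs. stock GTX
    return base - 0.5 if dual_core else base  # dual-core AMD gets a ~500 MHz discount

print(min_cpu_ghz(gpu_overclocked=False, dual_core=False))  # 2.5 (single core, stock GTX)
print(min_cpu_ghz(gpu_overclocked=True, dual_core=True))    # 2.5 (dual core, OC'd GTX)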
 
I wonder about this, because most games see my system as the equivalent of a 3 GHz Northwood P4. I've kept the system just for the other advantages that dual processors give me. Another thing to consider is that even if you are CPU-limited in certain games, you might still benefit from a better video card. If you were stuck at, say, 40 fps because you were CPU-limited, you might be able to crank up the resolution and things like AA and AF and still maintain that 40 fps.
 
HeavyH20 said:
7800 GTX at (core/memory clocks, CPU speed, 3DMark05 score):
490/1300 - 2.50 GHz (3DMark05: 8948)
500/1350 - 2.60 GHz (3DMark05: 9147)
520/1420 - 2.75 GHz (3DMark05: 9430)
540/1420 - 3.00 GHz (3DMark05: 9743)

Wouldn't it be better to leave the GTX stock while you up the CPU speed to see if it makes a difference? Because at the moment, we don't know whether it's the GPU OC or the CPU OC making the bigger difference.
 
HeavyH2O,

What kind of cooling are you using on your GTX to get that kind of OC out of it?!?!

Edit: Never mind, you have a Leadtek Extreme. Still a good OC, though. Hopefully my eVGA will get a good boost from the NV Silencer I ordered.
 
pakotlar said:
CPUs are cheap enough now that you have no excuse not to have a nice one.

Even a 3700+ San Diego is $200 now. With a stock cooler you could easily OC that to 2.5 GHz+. Any A64 at 2.2 GHz or better will be good enough to make you GPU-limited at 1600x1200 with AA, even at 1280x1024 with AA in most cases.

To clarify, that nice table that you whipped up is useless. The situation we have today is much simpler than the one we had in 1999.

A64 CPU (any S939, preferably 2.2 GHz+) + the fastest GPU you can afford.

QFT. There was a benchmark on here a while ago that covered the Venice 3000+ through the FX-55 (or so), and the 3000+ was 10 fps below the FX-55 (in Doom 3, I think it was), while everything else (including the 3200+) was within 3 fps or so.

No reason to spend more than $150 or so on a CPU.
 
I've been reading a lot of people saying it doesn't make a difference. In my opinion it makes a lot of difference. Perhaps not as much of a difference with one GTX as with SLI.

In either case, I'm gonna settle this once and for all! This weekend I'll run benchmarks from 2000 MHz to 3400 MHz (gotta love unlocked multipliers). I plan to run 1600x1200 benchmarks, most likely with 4x SS AA / 8x AF.

I'll do benchmarks with 3DMark05 / Far Cry / Quake 4. I think that should be enough. Since 3DMark05 runs at 1024x768, I'm not sure how relevant the results will be, but hey, I'll give it a try.
 
btf said:
I've been reading a lot of people saying it doesn't make a difference. In my opinion it makes a lot of difference. Perhaps not as much of a difference with one GTX as with SLI.

In either case, I'm gonna settle this once and for all! This weekend I'll run benchmarks from 2000 MHz to 3400 MHz (gotta love unlocked multipliers). I plan to run 1600x1200 benchmarks, most likely with 4x SS AA / 8x AF.

I'll do benchmarks with 3DMark05 / Far Cry / Quake 4. I think that should be enough. Since 3DMark05 runs at 1024x768, I'm not sure how relevant the results will be, but hey, I'll give it a try.

What clocks are you planning to use for your GTXs? Are you only going to do SLI benchies, or a single card too?
 
Send me a PM when your results come in and I will sticky them. Many people will be interested in them, I'm sure.
 
I'll run the tests with one GTX and with SLI for comparison. I don't plan on running different resolutions, though, because that would take me forever. As for the GTX clocks, I'll have to stay at stock. One of my cards won't do anything above stock for some reason.

Edit: Does anyone have links to demos I could use for Far Cry and Quake 4? Are there any standard benchmarking demos people use? I can always record one and run FRAPS...
 
It looks like that. Everything with a 7800 GT and 2 GB of RAM (they brought it down to 1 GB for some tests).
 
btf said:
Edit: Well, they only tested with a 7800 GT and at 1024x768. I really think you will see a significant difference with GTXs in SLI.

The reason I posted it was to get opinions on whether it's a good way to test CPU bottlenecking.
 
needmorecarnitine said:
The reason I posted it was to get opinions on whether it's a good way to test CPU bottlenecking.

I'm not sure. Technically, running the GT at 1024x768 should eliminate the GPU bottleneck, just leaving you with the CPU, right? But what happens when you go to 1600x1200 and start using AA/AF? Does that affect the CPU at all? I'm not sure; I don't know enough about the inner workings of GPUs to say.
 
btf said:
Does that affect the CPU at all? I'm not sure; I don't know enough about the inner workings of GPUs to say.

Everything will affect the CPU.

Part of their conclusions:

In these testing conditions all CPUs provided more or less acceptable fps rate. Some processors were faster, some were slower, however, in real gameplay with real graphics quality settings any gamer would use all this advantage will disappear. This is because the graphics quality and other gaming settings are usually determined by the graphics card potential. By increasing the quality settings, the fps rate will drop down to 40-60 fps, which is ok for normal gaming experience. And you know, any Pentium 4 CPU with the actual working frequency of 3.0GHz and up and any Athlon 64 with the performance rating of 3000+ and up can process that number of frames per second, as we have already shown in our tests. In other words, in real gaming conditions the performance will still be limited by the graphics processor, and not by the CPU.

I think it is important to accurately define what you are testing.

I assume 1600x1200 would look different with those games, but maybe not with some others. There are many variables.
 
pakotlar said:
A64 CPU (any S939, preferably 2.2 GHz+) + the fastest GPU you can afford.

+1

Any Athlon 64 is all you need, unless you think having 7 FPS less than someone else is 'getting owned'.
 
I don't see ANY point in spending loads of money on a CPU unless you already have, say, 2x 7800 GTX. Say you had one 7800 GTX and an A64 3000+: it would be cheaper to buy a second 7800 GTX than an AMD FX CPU, and the performance gain would be MASSIVE compared to the better CPU.
 
btf said:
I'll run the tests with one GTX and with SLI for comparison. I don't plan on running different resolutions, though, because that would take me forever. As for the GTX clocks, I'll have to stay at stock. One of my cards won't do anything above stock for some reason.

Edit: Does anyone have links to demos I could use for Far Cry and Quake 4? Are there any standard benchmarking demos people use? I can always record one and run FRAPS...
How about something similar to what they do here at [H]ardOCP? Play through an SP level maybe three times for 5 minutes at each CPU speed and record the results with FRAPS. Average them for each CPU setting and then post them.
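In case it helps, here's a minimal Python sketch of that averaging step. It assumes each FRAPS pass has been exported to a plain text file with one FPS sample per line; the file names are placeholders, not anything FRAPS produces on its own.

Code:
def average_fps(path):
    """Mean FPS of a single recorded pass."""
    with open(path) as f:
        samples = [float(line) for line in f if line.strip()]
    return sum(samples) / len(samples)

def average_runs(paths):
    """Average the per-pass means: several 5-minute passes per CPU setting."""
    means = [average_fps(p) for p in paths]
    return sum(means) / len(means)

# Example: three passes recorded at a 2.6 GHz CPU setting (hypothetical file names).
runs_2600 = ["farcry_2600_pass1.txt", "farcry_2600_pass2.txt", "farcry_2600_pass3.txt"]
print("2.6 GHz average FPS: %.1f" % average_runs(runs_2600))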
 
It's always best to have the best video card you can afford, period, regardless of what processor you have. For example, I had a 2 GHz Athlon XP paired with a 6800NU. Yes, the video card was bottlenecked (even my Ti4200 was bottlenecked), but having that 6800NU was much better than using my ATI Radeon 9000 Pro (which wasn't bottlenecked). MUCH better.

So, if you can afford it and don't mind paying for it, get the GTX, even if you're pairing it with an Athlon 64 3200+. It's going to perform better and smoother than a lesser card picked for a "better" match-up. And when you do get a better processor (or just overclock the thing, because a 3200+ Venice core will do 2.4 GHz+ quite easily), you'll get free performance from that GTX instead of thinking, "Man, now that I have a new processor, I need to replace this 6800 GS" (or whatever card you would have chosen for the "better" match-up). Also, another reason to buy the best you can afford: graphics cards are not good values and only last a year or two, so get the best that you can comfortably get.
 
pakotlar said:
CPUs are cheap enough now that you have no excuse not to have a nice one.

Even a 3700+ San Diego is $200 now. With a stock cooler you could easily OC that to 2.5 GHz+. Any A64 at 2.2 GHz or better will be good enough to make you GPU-limited at 1600x1200 with AA, even at 1280x1024 with AA in most cases.

To clarify, that nice table that you whipped up is useless. The situation we have today is much simpler than the one we had in 1999.

A64 CPU (any S939, preferably 2.2 GHz+) + the fastest GPU you can afford.

Who wants San Diegos when you can get OPERTOWNZ?
 
I was too busy last weekend; I still plan on doing it, though. I did manage to do 3DMark05.

I ran these at 200 FSB. RAM was at 2-2-2-5. The 7800 GTXs were stock.

I should have Far Cry and Quake 4 done this weekend. I also plan to put them in charts :)

2000 MHz = 10058
2200 MHz = 10720
2400 MHz = 11366
2600 MHz = 11998
2800 MHz = 12549
3000 MHz = 13126
3200 MHz = 13542
3400 MHz = 13867
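If anyone wants to eyeball the scaling, here's a quick Python sketch over those numbers (the dictionary just restates the list above as CPU MHz -> 3DMark05 score; nothing else is assumed):

Code:
# 3DMark05 score at each CPU clock (MHz), both GTXs at stock, from the list above.
scores = {2000: 10058, 2200: 10720, 2400: 11366, 2600: 11998,
          2800: 12549, 3000: 13126, 3200: 13542, 3400: 13867}

clocks = sorted(scores)
for lo, hi in zip(clocks, clocks[1:]):
    gain = (scores[hi] - scores[lo]) / scores[lo] * 100
    print(f"{lo} -> {hi} MHz: +{gain:.1f}% score")

total = (scores[3400] - scores[2000]) / scores[2000] * 100
print(f"2000 -> 3400 MHz overall: +{total:.1f}% score for a +70% clock")

The per-step gains shrink from roughly 6.6% at the low end to about 2.4% at the top, so the CPU is still contributing across the whole range, just less and less.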
 
Currently, isn't it true that with most games, higher resolutions on great video cards make CPU bottlenecking a non-factor?
 
Yes...

And I'm guessing the testing done above me was done at 1024x768, which means nothing, because low resolutions like that are very CPU-limited...

Do some real-world testing, or if you can't be bothered with that, just up the res in 3DMark to 1600x1200 and put on some AA. Then I bet you will see less of a difference between the different CPU speeds.
 
pakotlar said:
CPUs are cheap enough now that you have no excuse not to have a nice one.

Even a 3700+ San Diego is $200 now. With a stock cooler you could easily OC that to 2.5 GHz+. Any A64 at 2.2 GHz or better will be good enough to make you GPU-limited at 1600x1200 with AA, even at 1280x1024 with AA in most cases.

To clarify, that nice table that you whipped up is useless. The situation we have today is much simpler than the one we had in 1999.

A64 CPU (any S939, preferably 2.2 GHz+) + the fastest GPU you can afford.
The CPU does not restrict a game's playable resolution or graphics options.
 
In these testing conditions all CPUs provided more or less acceptable fps rate. Some processors were faster, some were slower, however, in real gameplay with real graphics quality settings any gamer would use all this advantage will disappear. This is because the graphics quality and other gaming settings are usually determined by the graphics card potential. By increasing the quality settings, the fps rate will drop down to 40-60 fps, which is ok for normal gaming experience. And you know, any Pentium 4 CPU with the actual working frequency of 3.0GHz and up and any Athlon 64 with the performance rating of 3000+ and up can process that number of frames per second, as we have already shown in our tests. In other words, in real gaming conditions the performance will still be limited by the graphics processor, and not by the CPU.

Oh my. That is such a massive leap in logic.

Here's what they forgot. Not everything affecting the performance in your game is directly related to the graphical settings. It's your CPU that calculates the physics and where everything is. When the shit's going down, you better damn well believe that a faster CPU will make a difference.

The testing they did in that article is completely worthless because they use timedemos instead of actual gameplay. I'd like to see a [H]ardOCP article on this using their testing methods for video card performance: take a game like Battlefield 2 with 2 GB of RAM and either of the top cards right now, keep the resolution and image quality settings the same, and run the gamut of CPUs on a 64-player server.
 
needmorecarnitine said:

Because with timedemos, all the information and number crunching the CPU has to do has been removed, so it's all about how quickly the CPU can send info to the video card. The fact is, in real gameplay your processor is doing hundreds of other things as well. This can mean major differences with faster CPUs. That's why Kyle wants to get rid of timedemo FPS benchmarking in favor of gameplay benchmarks, and why a dedicated PPU can unleash lots of performance and realism.
 
biohazard_nz said:
Yes...

And I'm guessing the testing done above me was done at 1024x768, which means nothing, because low resolutions like that are very CPU-limited...

Do some real-world testing, or if you can't be bothered with that, just up the res in 3DMark to 1600x1200 and put on some AA. Then I bet you will see less of a difference between the different CPU speeds.

You should actually read what I wrote before you pass judgement. I even said that I don't know what 3DMark05 would tell us, since it runs at 1024x768, but that I would do it anyway. As for 1600x1200, you need to pay to do that, and I ain't paying a single penny for a benchmarking tool...

I suggest you read the thread before posting... I even said in my last post that I'm going to do Quake 4 and Far Cry...
 
blaze24 said:


Interesting article. Looks like I won't need to do single-GTX runs :)

I'm more interested in SLI than anything. One thing that worries me about benchmarking, though, is that with demos the AI factor is taken out of games, and probably some other stuff too, which is obviously CPU-intensive. How much of a factor is this? I wonder if you can do "real" benchmarks.
 
btf said:
Interesting article. Looks like I won't need to do single-GTX runs :)

I'm more interested in SLI than anything. One thing that worries me about benchmarking, though, is that with demos the AI factor is taken out of games, and probably some other stuff too, which is obviously CPU-intensive. How much of a factor is this? I wonder if you can do "real" benchmarks.

You might want to try that one demo that [H] linked to a few weeks ago. I forget what it is; Dreadnaut or something like that. It has a built-in god-mode hotkey, so you won't die. That is, if you feel up to "real-world" benchies.

I think timedemos will be about enough, though. Can't wait.
 
OK, so I ran some timedemos with Far Cry. I started seeing very little difference between each speed, like barely 1 fps between 3400 and 3000. So now I'm really beginning to think timedemos don't work well.

The difference from 3400 to 2000 was 65.83 fps vs. 59.67 fps, a difference of about 6 fps, or 9%. This doesn't sound right at all. So, in an effort to show this, I loaded up an outdoor map and stood in the exact same spot looking out. I loaded the map several times and always had the same result. Is this exact? Probably not, but close enough for me.

At 3400 MHz I get 51 fps. At 2000 MHz I get 37 fps, a difference of 14 fps, or about 28%.

Correct me if I'm wrong here, but it would seem like running timedemos for benchmarking is pretty useless. The difference between 60 and 65 fps is not much, but the difference between 37 and 51 fps means playable or not for me.

BTW, these benchmarks were run at 1600x1200 with 4x SS AA / 16x AF. Far Cry was set to ultra-high quality.

Edit: The interesting thing here is that my comparison shows about a 28% difference, and 3DMark05 shows the same thing, 28%...
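For what it's worth, here's a tiny sketch of the arithmetic behind those figures. The numbers are just the ones from this post and the 3DMark05 run earlier, and the drop is computed relative to the faster result, which appears to be how the 9% and 28% figures were arrived at:

Code:
# Percentage drop relative to the faster result.
def drop(high, low):
    return (high - low) / high * 100

print(f"Far Cry timedemo,   3400 vs 2000 MHz: {drop(65.83, 59.67):.1f}% slower")  # ~9%
print(f"Far Cry fixed spot, 3400 vs 2000 MHz: {drop(51, 37):.1f}% slower")        # ~27%
print(f"3DMark05 score,     3400 vs 2000 MHz: {drop(13867, 10058):.1f}% lower")   # ~27%

So the fixed-spot test and 3DMark05 agree at roughly 27-28%, while the timedemo shows only about a third of that spread.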
 