Effects of CPU cache size on total system performance?

SEALTeamSix

Hi all,

This is a continuation of the discussion found in this thread... so what are the real effects of CPU cache size on total system performance? Does it scale up or down as you overclock?
 
Maybe I wasn't clear enough: you find the max overclock of a processor using the highest multiplier, then drop the multiplier and bump the FSB up. That way, you end up with the highest system FSB as well as the highest CPU overclock. AFAIK, that's the whole point of downward-unlocked multipliers on CPUs.
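To make the arithmetic concrete (the numbers below are just a hypothetical example, not from any particular chip), core clock is simply FSB x multiplier:

Code:
# Core clock = FSB x multiplier. A downward-unlocked chip lets you trade
# multiplier for FSB while keeping the same core speed -- hypothetical numbers:
def core_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(core_mhz(400, 9))  # 3600 MHz: max OC found at the highest multiplier
print(core_mhz(450, 8))  # 3600 MHz again, but now with a 50 MHz faster FSB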

As for performance decreases due to cache: if you've got a link to benchmarks showing worse real-world performance with a smaller CPU cache, I'd love to be proven wrong. Seriously, I love having the best information, and I'll admit I don't always get things right.

We are getting a bit off-topic, though, so I've made this thread to help the other one stay on topic.

Sorry, I didn't read your part about the downward multiplier. A higher FSB does give you better performance, though.

Either way, most E4x00s don't overclock as well as E6600s.

As for cache, I'm still looking for an article, but I haven't found one yet where they test at lower resolutions instead of maxing out at 1600x1200. It would depend on the game, but cache does make a bigger difference when the video card isn't the limit.

I wish somebody with an E4300 and an E6600 would overclock both to the same FSB and show us some results in games and applications where the video card isn't an issue.
 
Cache does make a difference in benchmarking programs. Real-life performance? Not much.
 

Agreed, higher FSB is better.

As far as OCing an Allendale core versus a Conroe core, here's some data: LINK. It does appear that the Conroe wins by about 200-300 MHz, which is roughly consistent with the OC data found elsewhere (for instance, the OC database on this forum). However, overclocking is also very dependent on the user's skill, and on plain luck; who knows whether users really get the max OC from their processors. Still, based on the limited data available, it does look like there's a small difference in OC-ability between Allendale and Conroe. Whether this is meaningful in a real-world setting is a good question.

And speaking of gaming, would you really play at 1280x1024 with an 8800 GTX? I mean, if you have a $500 video card, you'd want to see the eye candy... it just doesn't seem like a realistic setting to me. [H]ard|OCP ran an article recently which showed that once you get past a certain CPU speed, your real-world gaming performance is limited by your GPU instead. I also think anyone with an overclocked C2D processor who does any gaming at all will have a decent video card to match it.

I keep emphasizing real-world settings because of course there is a difference in L2 size, and perhaps a marginally higher OC, so given the right benchmark you can make it show up. The better question is whether the difference will actually show up in any real-world scenario.
 
[H]ard|OCP ran an article recently which showed that once you get past a certain CPU speed, your real-world gaming performance is limited by your GPU instead.

Yup, no gains in synthetic 3DMark06 past 3.2 GHz or so with my 8800 GTS. Although I believe FSX is rather CPU-dependent.

Expanding on FSB from the other thread... I'm kind of a noob when it comes to benchmarking games, but I could run a couple of benches between 423x9 and 475x8 with an E6600 if anyone has a timedemo to recommend. I'm kind of curious myself.
 
We were trying to figure out the difference in performance between cache sizes when the video card isn't the limit. Of course, I get your point about having a high-end video card and playing at low resolutions. Then again, what's the point of overclocking your CPU to 3.4 GHz when 2.2 GHz gives you the same frame rates once you're video-card-limited at those high resolutions? E-peen? :D
 
Here, I found a benchmark using an 8800 GTX @ 1280x1024 resolution.

http://channel.tomshardware.com/2007/08/25/cpu_charts_channel/page14.html#prey

Notice the E6300/E6320 and E6400/E6420.

E6300 = 87.3 fps
E6320 = 94.1 fps
With the E63x0 series, that's about a 7.8% increase in Prey.

E6400 = 97.1 fps
E6420 = 102 fps
About a 5% increase here.

You can see more benchmarks here.

http://channel.tomshardware.com/2007/08/25/cpu_charts_channel/page14.html#prey

It's roughly 5-10% in games. It just depends on the situation.
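For reference, those percentages are just (new - old) / old; a quick Python sketch to check the two figures above:

Code:
# Percent gain of the larger-cache model, using the Prey numbers posted above
def pct_gain(old_fps, new_fps):
    return (new_fps - old_fps) / old_fps * 100

print(round(pct_gain(87.3, 94.1), 1))   # E6300 -> E6320: ~7.8%
print(round(pct_gain(97.1, 102.0), 1))  # E6400 -> E6420: ~5.0%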

Marvelous said:
Did your bird tell you that the E6600 is at 1066 FSB and the E4400 is at 800 FSB with half the cache? That equates to roughly a 20-25% speed difference at the same clock speed.

Still looking for your source for that 20-25% increase in performance at the same clock speed.
 
If you look at the benchmarks in the link I gave and compare the E4300 vs. the E6320, you can get a rough idea of how FSB and cache make a difference.
 
400 x 9 and 450 x 8 both give me the same CPU score in 3DMark, SuperPi, and Aquamark... Q6600 G0 (within a 0.5% tolerance).
 
I remember some guy who posted benchmarks of his Q6600 at 333x9, 375x8, and 479x7 and got identical results. Can't seem to find it, though.
 
This is fairly normal. We're not using buses that are starved for RAM bandwidth these days, so there's no real loss of processor efficiency from changing either setting around, as long as the final frequency stays the same.
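A back-of-the-envelope check of that claim (assuming the usual C2D setup of a quad-pumped 64-bit FSB and dual-channel DDR2; treat this as a sketch, not a definitive model):

Code:
# Peak theoretical bandwidths in GB/s. The FSB transfers 4 times per clock
# (quad-pumped) over a 64-bit (8-byte) bus; DDR2 transfers twice per clock
# over two 64-bit channels.
def fsb_gbs(fsb_mhz):
    return fsb_mhz * 4 * 8 / 1000

def ddr2_dual_gbs(mem_clock_mhz):
    return mem_clock_mhz * 2 * 8 * 2 / 1000

print(fsb_gbs(400))        # 400 MHz FSB (1600 MT/s) -> 12.8 GB/s
print(ddr2_dual_gbs(400))  # DDR2-800 dual channel   -> 12.8 GB/s, a match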
 
Look, you don't have to troll by repeating the exact same thing like a robot. Look at the benches and draw your own conclusions, particularly what Abu Som3a posted.

http://www.legionhardware.com/Bench/Intel_Conroe_Cache_Performance/Gaming_02.png

Looks like 20% to me, just from cache and with no FSB advantage. :rolleyes:

It's about 12% in that test between 1 MB and 4 MB of cache.

Although I must question its validity, considering no other site reports the same difference. The general consensus is that doubling the cache increases performance by around 5% in calculation-intensive programs.
 
Are you not seeing the same results?

E4300 266x9 = 2.4 GHz -----> 113.4 fps
E6600 266x9 = 2.4 GHz ------> 134.7 fps

Okay, fine: 19%. If you add in the FSB difference, surely it can reach 20-25%.

This site was using an 8800 GTX, so the video card isn't too limited, while other sites use ridiculously high resolutions with a 1900 XT. That's the difference.
 
It's about 12% in that test between 1 MB and 4 MB of cache.

Although I must question its validity, considering no other site reports the same difference. The general consensus is that doubling the cache increases performance by around 5% in calculation-intensive programs.

Maybe that's because review sites compare the CPUs at stock speeds. The E2160 gives very similar performance to the E4300 at 1.8 GHz; at higher frequencies, however, the difference becomes much more significant.
 
We were trying to figure out the difference in performance between cache sizes when the video card isn't the limit. Of course, I get your point about having a high-end video card and playing at low resolutions. Then again, what's the point of overclocking your CPU to 3.4 GHz when 2.2 GHz gives you the same frame rates once you're video-card-limited at those high resolutions? E-peen? :D

... hence my recommendation of the E4400 as an alternative to the sold-out E6600s in the other thread.

If you are a gamer (and that's basically what we're talking about), you aren't going to find many (if any) real-world scenarios where an E4400 performs 20% worse than an E6600. And if you're into CAD, video editing, etc., where you might actually see a difference, you're most likely going for a quad-core anyway.
 
The difference is 5-10% depending on the application, which can be compensated for by a ~200 MHz clock increase. The 20% difference Marvelous pointed to is an anomaly that wasn't seen again in the review or on any other hardware site.
 
Maybe you should look at this again.

http://www.legionhardware.com/Bench/Intel_Conroe_Cache_Performance/Gaming_02.png

That's a real-world difference.

I understand that people with a GTX aren't going to play at this resolution, but cache and FSB can make quite a bit of difference if you aren't video-card-limited. I think I proved my original point.
 
Maybe you should look at this again.

http://www.legionhardware.com/Bench/Intel_Conroe_Cache_Performance/Gaming_02.png

That's a real-world difference.

Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.

Please contact the server administrator, [email protected] and inform them of the time the error occurred, and anything you might have done that may have caused the error.

More information about this error may be available in the server error log.

Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
Apache/1.3.37 Server at legionhardware.com Port 80

error
 
An extra 2 MB of cache is worth about 100-150 MHz of core clock speed, if you want to look at it that way. At a 2.4 GHz baseline, that works out to roughly a 4-6% clock bump, which lines up with the ~5% figures above.
 
From these gaming benchmarks:


E4300 @ 1600x1200:

PREY: 155
Far Cry: 117
COH: 100
X3: Reunion: 75
Average Frames: 111.75 FPS


E6600 @ 1600x1200:

PREY: 160
Far Cry: 135
COH: 96
X3: Reunion: 86
Average Frames: 119.25 FPS

Let's just suppose that this is an example of real-world use: there's nobody in the world who can tell the 6.7% difference between 112 FPS and 119 FPS. The framerates only go higher at 1280x1024, and as you turn on the eye candy, the framerates just get bottlenecked by the graphics card.

I'm not trying to bring you down, but you seem to be implying that you would see a real-world difference between an E6600 and an E4400 in gaming, and that's really not possible.
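If anyone wants to check the averaging, it's a simple per-game mean; a quick sketch:

Code:
# Recomputing the averages and the overall delta from the numbers above
e4300 = {"Prey": 155, "Far Cry": 117, "COH": 100, "X3: Reunion": 75}
e6600 = {"Prey": 160, "Far Cry": 135, "COH": 96, "X3: Reunion": 86}

avg_e4300 = sum(e4300.values()) / len(e4300)  # 111.75 FPS
avg_e6600 = sum(e6600.values()) / len(e6600)  # 119.25 FPS
print(avg_e4300, avg_e6600)
print(round((avg_e6600 - avg_e4300) / avg_e4300 * 100, 1))  # ~6.7%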
 
Now you're averaging the total fps?

Either way, I think I proved my point that in some situations you can surely get 20% better performance.

By your logic, why have a faster CPU? Why buy a Conroe, when a cheap $60 overclocked X2 processor can do the exact same thing?
 
First of all, let's just say that in most cases you will never notice any difference between a 2 MB and a 4 MB CPU. Also, the difference in FSB at the same clock speed will result in a 0-1% performance difference.

Trust me, the FSB increase doesn't do jack worth mentioning.

The reason I kept asking you the same question is that you said there's a 20-25% performance increase across all applications. Check what you wrote again, because that's flat-out laughable.

The only thing you proved is that you can barely get a 20% increase even in extremely rare and isolated cases.
 
You're a moron, because that's exactly what I posted in the other thread. :rolleyes:
When graphics cards aren't the limit, it's not 1 or 2%.
 
Now you're averaging the total fps?

Either way, I think I proved my point that in some situations you can surely get 20% better performance.

That's right, I took the average of the four game benchmarks to condense the data... but if you'd prefer, I can break it out individually:

PREY Benchmark:
E6600 @ 1600x1200: 160
E4300 @ 1600x1200: 155

Nobody on God's green earth can tell the difference between 160 and 155 fps. Plus, nobody in their right mind would run a game like this, unless to brag about how their CPU is 3.2% faster than a CPU with 2 MB less L2 cache...

FarCry Benchmark:
E6600 @ 1600x1200: 135
E4300 @ 1600x1200: 117

It's completely impossible to tell the difference between 135 fps and 117 fps. It's a completely unrealistic scenario to begin with, which is why [H]ard|OCP stopped doing this type of worthless benchmarking years ago...

Company of Heroes Benchmark:
E6600 @ 1600x1200: 100
E4300 @ 1600x1200: 96

Hey wait, if I look closely while I'm playing I think I can actually see some difference in these framerates... wait, no I can't. Crap.

X3 Reunion Benchmark:
E6600 @ 1600x1200: 86
E4300 @ 1600x1200: 75

Wow, that's an astonishing ~15% difference... now if I just overclock my retinas and volt-mod my pupils, I'll be able to see it... aww, fuck it, you get the point already. But yeah, you've totally proved your point, too: in completely and utterly unrealistic cases, you can force a 20% difference in framerates based solely on the size of a CPU's L2 cache. Rock on.

Marvelous said:
By your logic, why have a faster CPU? Why buy a Conroe, when a cheap $60 overclocked X2 processor can do the exact same thing?

Maybe that's why I've got a cheap $60 overclocked X2 processor in my box? Let me just double-check... yep, that's the reason.
 
Blah blah. Now you're trying to justify why you have a $60 processor, and being stupid about it, I might add. :eek:

My original point was proven. I don't need to waste my time with you any longer.
 
Calling people "moron", "stupid", and a "waste [of] my time" is sure a good way to make a point... maybe not the point you were trying to make, but whatever. :rolleyes:

You win at the internet!
 
volt-mod my pupils

If you don't think this is stupid, I don't know what is. :rolleyes:


You showed me year-old results with 1900 XT CrossFire @ 1600x1200. Do you think maybe if they had used a different video card they would have gotten better performance in games, or not? Lower resolutions? Yes or no? :rolleyes:

When a person misunderstands what I posted in the other thread and attacks me on a different point, I think the English language calls for a word like "moron". :eek:

Yup, I proved my point, but you want to prove yours when I didn't ask. That last comment? Yeah, I win. You lose. Get over it. That's what happens when someone actually has logic and evidence. :eek:
 
Marvelous said:
That equates to roughly a 20-25% speed difference at the same clock speed.

This means there would be a 20-25% performance increase across all applications. Every other person posting in this thread (and the other one, too) disagrees with you, and benchmark after benchmark has been linked showing a 5-10% difference in performance across a wide variety of apps. You found two examples of anything close to a 20% difference and declared your point proven. Sorry, but your point is not proved.

Then there's the other issue... you are personally attacking and insulting other members when we're all trying to have a civil discussion about a neutral topic. Once you opened your reply to SentToSchool by calling him a moron, I got a little sarcastic (which you apparently can't distinguish from stupidity), but look around: nobody in this thread has called you names or insulted you. So why can't you be civil, too?
 
No, it doesn't mean it's going to run 25% faster across all applications. It means it can get up to 25% at times, when it's not limited by the GPU or whatever.

Isn't this the reason I later said this: you showed me year-old results with 1900 XT CrossFire @ 1600x1200. Do you think maybe if they used a different video card they would have better performance in games, or not? Lower resolutions? Yes or no?

But you want to put it in a situation where the video card becomes the limit. At that point you are not testing the CPU; you are testing the GPU in conjunction with the CPU, which is a whole other subject. :rolleyes:
 
SentToSchool was trolling, or behaving like a 10-year-old asking mom for $5 to buy candy. He had nothing to add except trying to get on people's nerves by asking the same thing over and over again. I am civil; just because I use words like "stupid" or "moron" doesn't mean I'm not. Sometimes those words fit the bill. :rolleyes:

Volt-mod my pupils? I guess so. If that isn't stupid, I don't know what is. Maybe you want to stick 10,000 volts into your pupils. I've seen stupid things happen.
 
I'm not going to debate with you over what is or isn't civil behavior, or what the difference is between sarcasm and stupidity. I see the purpose of this thread as having been fulfilled, and it has since devolved into pettiness, so I'm asking a mod to close it.

I hope this thread will be at least mildly informative to those who come across it in the future.
 