CPUs & Real-world Gameplay Scaling

I like to see articles like this that remind me that my 2.0GHz A64 can still hold its own in the world of high-res gaming, even with all the new games. It seems like with every week that goes by, my 7800GT and my 3200+ feel more and more obsolete. It's nice to know that they can still hold their own.
 
Thanks for the informative review. However, with a title like "Real-world gameplay scaling", I was expecting more real-world system choices. Most of us do not live in the dual-core world yet. The sole AMD single-core CPU choice was the $1000 Athlon FX 60, with the slowest performer of the bunch being an X2 3800+.

As a consumer, I am more interested in a mix of the latest generation and last year's generation, not three of the latest CPUs, the cheapest of which sells for roughly $300 today. I need to know whether last year's model will no longer cut it, as well as seeing data from the newest models. For reference, I am running an A64 3000+ that I purchased about a year ago, when it was a very popular price point, and I fully expect it to last me at least another year. I understand that your sponsors will want you to demonstrate the newest hardware, and that you may consider an X2 3800+ to be slow, but those of us with slower machines are probably running single core and are interested in single-core data. The Intel consumers were given an older single-core Pentium 4 for comparison.

Again, thanks for the informative article, and please also provide older data for us, so that we know whether it has become time to upgrade. This article will become useful to me next year, not this year.
 
Maybe you could do a mini followup to this article using Oblivion, now that it's out? According to what I've read from the devs, there is a lot of multi-threading going on in the code because of the Xbox 360 support.
 
Wow, I've been looking around here trying to figure out if dual core is necessary, since I am also buying a new PC. This was a good article and showed me that there's not much difference. Does this also go for the dual-core Opterons, say the 170? Well, that might be better for another post, but hey, I just wanna play games and I think single core is just fine for me - spend the rest on a great card.
 
A few months ago I swapped my 7800GT into the Pentium D based Dell I bought my wife and (after a clean install and lots of tweaking) came to a very similar conclusion: besides seeing no real dual-core advantage, I was getting a lot of weird behavior across several games regardless of how many patches and hotfixes I applied. This led me to upgrade the CPU and GPU in my gaming PC, and I can say I have no regrets.

I have no doubt that we will eventually reap some gains from dual-core CPUs in games, but I don't think it will be this year, and likely not most of next year. Top that off with many gamers playing at higher resolutions, and I don't think we'll see big gains from multiple cores until late in 2007, and we'll probably still be GPU bound.

Great read, thanks for the effort guys!
 
Kyle and co.

Thank you. This is the FIRST and ONLY place online that I have been able to really get solid, real world answers to my questions.

This article simply kicks azz. Period. It could not have been more straight forward or to the point.

How does this relate to me? I stand at a crossroads hardware-wise. I have an older system that I am gaming with that is starting to show its grey hair. I know this is [H]OCP, not [M] (Mediocre) OCP, but I am hesitant to upgrade at this time until Vista hits the big time and its hardware requirements have stabilized. The burning question: do I need to jump now, or can I afford to wait until later?

Dual core systems are obviously the way of the future, but is that future a reality and more importantly a necessity now?

My current rig is as follows:

Abit IC7-Max3 w/ Prescott 3.0E core overclocked to 3.75GHz
1GB Corsair 4400 RAM running at 250MHz FSB (CPU and memory at a 1:1 ratio)
ATI X800 Pro overclocked to XT levels.
Audigy 2 ZS Pro
KOOLANCE PC3-426BK - cooling CPU, NB & Video + Ram sinks
Dell 2405FPW

This system has served me very well over the past year and a half. It is only recently, with the release of F.E.A.R., that I really hit my first wall with performance-related issues. (Bear in mind, too, that I'm gaming at 1920x1200 for all of my games - I mean, how could you NOT?) I simply do not feel comfortable shelling out the green right now when there is so much change in the coming months, with AMD going to DDR2 and Vista with its all-over-the-place hardware requirements.

The real questions I had were:

1) Is dual core essential for games right now? Am I missing the boat on anything other than PCI-E?
2) Will I really get my 'bang for the buck' by dropping major $$$ on a complete overhaul to the new tech now? (Unfortunately for me, my next upgrade is a major one due to my current config.)
3) Is my current rig realistically capable of getting me to the end of the year and keeping me 'in the game'?

This article knocked all those answers out of the park for me. My take on this article is this - dual core is great for games but not a 'MUST have' right now. If you can wait, wait, unless you have to be on the bleeding edge. (Don't get me wrong, I've been there before at times and I'm sure I will be back again, but new house payments are in the mix at this point in life.) :) Those of us still holding back in [M] land can still game with the best of them, granted that our hair will not be on fire or anything from being on the edge.

From what I see here, only my lack of access to PCI-E is hurting my gaming for the immediate future. I plan on helping this with a smaller interim upgrade to the BFG 7800 GS AGP, or perhaps with a soft-mod upgrade to the X800 Pro to push it to XT PE levels. (IF she can take it - the fun part is in the trying!) Either way, these solutions should tide me over until the next 'big thing' without having to commit RIGHT now.

This is EXACTLY what I needed to hear at this time. Thanks again Kyle and co. at [H]OCP. You guys are the reason I stay so committed to this hobby - great people like you lighting the way through the marketing bullshit and PR hype.

I simply cannot tell you guys enough. Keep doing it [H] style!

Loyal reader,

Scott :D
(Sorry about the long wind - it had to be said)
 
I just want to throw in my $.02 that this was one of my top 3 favorite articles at the [H]. Great info and much appreciated indeed!!!
 
F.E.A.R. looks like it's entirely GPU dependent in every AMD benchmark you ran for it.

Only a 2-3 frame difference from the fastest to the slowest on average, with only a slightly higher max FPS, not that it really means anything.

Especially with the dual-core CPU, it would've been nice to see you drop the CPU speed until it started to hurt the frame rate.

I mean, would a 1.8GHz A64 still pull in 48 fps? What about a 1.6?
And if a 1.6 were CPU limited, would dual 1.6s yield an improvement?

Granted, I suppose now we are talking theoretical and not real world. The real-world takeaway from the review is that modern game engines love GPU speed and don't need much CPU at all.
 
Great job, boys!

I've been looking for this article for 4 years! That's about how long I've been with SMP (still got the MPX chipset + dual MP 2800).

The key to SMP isn't gaming...yet (as you noted). The key is multitasking and rendering improvements. It's the silky smooth response of the UI that makes people convert.

Another reason I say I've been looking for this article is to have information that helps gamers and users make informed decisions. You give AMD 2.4GHz and Intel 3.2GHz as the threshold/optimal balance between performance and value. AFAIK, there hasn't been ANY journal/enthusiast site that has arrived at that advice by way of REAL hardcore data as you've done. There's nothing more scientific than doing actual, practical tests. It's a practical, REAL WORLD result that is really useful.

Think about it: say I'm upgrading next year (and I will be, due to the aging chipset I have), I gotta take bang/buck into account. Now I have REAL numbers to go with. Now I know to get AT LEAST a 2.4GHz AMD or a 3.2GHz Intel no matter what =). Anything beyond that is gravy. Now THAT is what I call good information!

This is the reason I keep coming back to [H]! That and the occasional gay p0rn references from Steve :D
 
Yeah, I appreciate the article very much since I'm preparing to build a new system. I was planning on getting an AMD X2 proc, but now I'm looking more closely at operating frequency. I'm not a big multi-tasker, so a single-core proc may be the ticket. My main concern is making sure the new rig lasts at least 3 years of good gaming.
So from what I gather, if I go single-core AMD64 and keep the operating frequency above say 2.4GHz, or better yet 2.6GHz, then I should have a pretty decent rig as long as I have a video card to match the system.

I guess the big question is how many developers will write games with the X2 in mind in the next year, two, or three. Chances are slim, I would think, since a lot of games can take 2-4 years to make.
 
Great work Kyle and Brent, much appreciated. I was really looking for something like this but I hadn't seen anything this good until now. Keep it up! :) :cool:
 
Very very nice article.

I've noticed a big difference in how my PC runs with high-end games after going from my (Single Core) AMD 64 3000+ to my (Dual Core) AMD 64 Opty 170.

My whole PC was much smoother during the games (FEAR, for example), and when the framerate dropped a bit it recovered very quickly.

A CPU will always make some kind of difference. A game's calculations do not all go through the GPU, because not everything is visuals.
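To put that in rough code terms - just an illustrative sketch of my own, with every function name made up for the example - a typical frame loop runs input, AI, and physics on the CPU before a single draw call is handed to the GPU, so CPU speed sets a floor on how fast frames can be prepared:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical per-frame CPU-side work; in a real engine these would be the
// game's input handling, AI, physics, and animation updates.
void update_input()   { std::this_thread::sleep_for(std::chrono::microseconds(200)); }
void update_ai()      { std::this_thread::sleep_for(std::chrono::milliseconds(2)); }
void update_physics() { std::this_thread::sleep_for(std::chrono::milliseconds(3)); }
// Stand-in for submitting draw calls; the GPU then renders on its own.
void submit_frame_to_gpu() { std::this_thread::sleep_for(std::chrono::milliseconds(1)); }

int main() {
    using clock = std::chrono::steady_clock;
    for (int frame = 0; frame < 5; ++frame) {
        auto start = clock::now();
        update_input();          // CPU
        update_ai();             // CPU
        update_physics();        // CPU
        submit_frame_to_gpu();   // CPU hands off; the GPU does the pixels
        double ms = std::chrono::duration<double, std::milli>(clock::now() - start).count();
        std::printf("frame %d: %.2f ms of CPU-side work before the GPU even matters\n", frame, ms);
    }
    return 0;
}
```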
 
Great read. This had to have been a ton of work!

I must say that, like others have mentioned, I found myself perturbed with the CPU choices until I got to the conclusion.

At first, as I was reading, I thought "how the hell do you do a CPU scaling article when the low end (AMD) is a 3800+ X2?" But based on the conclusion that no real-world benefit can be enjoyed in gaming with a dual-core CPU, in theory the 3800+ X2 marks should be on par with a 3200+, as they are both 2GHz AMD parts. This gives credence to calling it a CPU scaling article.

The thing is that there is almost a $150 difference between a 3200+ and an X2 3800+, and I think a lot of us (at least initially) see the CPU selections and think primarily in terms of cost. Maybe future CPU selections could give greater respect to price point.

That said, the lack of a lower-end CPU such as the 3200+ is okay, given the dual vs. single core findings and the X2 3800+ being present.

Once again, thanks for the great reference material!
 
Circuitbreaker8 said:
Why did I spend so much money on my opteron :( lol


Because it f**king rocks at everything......just wait.....you'll be glad you future-proofed your PC so well....
 
Khaydarin said:
Because it f**king rocks at everything......just wait.....you'll be glad you future-proofed your PC so well....

There is your subject matter for the next exhaustive research article. :D

After building gaming machines for going on 10 years or so now, I just don't buy into "future-proofing" anymore. Too many variables change. I'm a firm believer in building a machine to the specs you need today, then upgrading later if you can or building new again. I do not care to pay a premium for unused FPS today for a maybe tomorrow.

Maybe NewEgg could query their data to give monthly price data over some years on selected pieces. Pick some points in time and massage the data to see if future-proofed machines worked out better over time than built-for-today machines. Especially with your great For Sale/For Trade forum, you can really get a good chunk back on your used parts. Today's $500 card is next year's $200 resale, and today's $250 card is next year's $150 resale. You figure out which one you lost more money on. :D

/me is getting more frugal as the years roll by
 
FSCDiablo said:
There is your subject matter for the next exhaustive research article. :D

After building gaming machines for going on 10 years or so now, I just don't buy into "future-proofing" anymore.

<snip>


That's more than understandable. But I have found a lot of uses for my Opty 170. Everything is faster and smoother, not to mention the infamous ease of multitasking. I don't have the money to keep upgrading "when I need/want to," so I have to plan a bit ahead anyway when I upgrade. Future-proofing is subjective to an extent, yes, but generally it's somewhat practical.
 
...this just in from Mark Rein. (From the FiringSquad interview here: http://www.firingsquad.com/features/epic_games_rein_interview/page2.asp)

"Both Intel and AMD now sell dual-core processors. Apple has switched to Intel. NVIDIA’s SLI is taking hold and lots of game enthusiasts are starting to use it. These are all good things that Unreal Engine 3 is very qualified to take advantage of."

So like I mentioned, the gaming landscape is about to change...and I think from where I stand right now, pretty dramatically. So you gamers who have invested in your future with the X2s and Intel dual cores...I wouldn't sweat it...even on a secondary box (if you upgrade as frequently as I do) it will be worthwhile to have that dual core...

hifi
 
Very nice article - good to know my CPU isn't behind the times just yet.

Just one thing though: in the conclusion, isn't the phrase "dyed"-in-the-wool, not "died"-in-the-wool?
 
After a little more homework and discussion with others, and despite what the article says, I went ahead and ordered a dual-core 4600+.

I read a comment elsewhere that some games are currently making use of dual core, including Quake 4 and Serious Sam 2. It seems to me it's only a matter of time before other programs, along with games, will make use of the technology. And since I just bought a new CPU, I want it to work well for a few years, not just for the next 6 months or a year from now.

The article was definitely very helpful in making my decision so I appreciate the time you folks spent on it.
 
Khaydarin said:
I've noticed a big difference in how my PC runs with high-end games after going from my (Single Core) AMD 64 3000+ to my (Dual Core) AMD 64 Opty 170.

My whole PC was much smoother during the games (FEAR, for example), and when the framerate dropped a bit it recovered very quickly.

Wait a minute...are you saying your single core ran better than your dual-core? That is what I gather from your second sentence there...
 
THANK YOU FOR THIS ARTICLE Brent & Kyle
We have had many articles about this GPU/VPU or that CPU and heard so much BS about DC systems. What I read confirmed what I have been thinking for the past year: DC systems are there for multitasking. DC doesn't have much effect on current-generation games unless you heavily multitask in the background while gaming (and we still haven't seen much info regarding that) - i.e. burning DVDs or CDs, or having other processor-intensive things going on in the background, like re-encoding a DVD or something.

I have often thought about what would happen if you took a DC system and a single-core system, put them side by side clock for clock, and compared them. I am glad you have done this, and you provide the answers many people are looking for (provided they interpret the data and what you wrote about the various setups correctly).

Thank you! :D
 
Conroe at 2.66GHz should change that review completely. Only time will tell now, as I had issues with the pre-review...not getting into that now, but I want a review of the 2.66 with a non-ES CPU and a non-modified 975 chipset motherboard from Asus this time. Any word yet on a non-ES version of a 2.66 with maybe a P5WD2-E Premium?
 
scaryogre said:
Wait a minute...are you saying your single core ran better than your dual-core? That is what I gather from your second sentence there...


Oh....no no no.....

In FEAR the framerate obviously jumps around - for example, going from a quiet hallway to an intense firefight. My dual core recovers those lost frames much faster than my single core did, leading to an overall smoother experience.

Sorry for the misleading phrasing there.
 
Gotta love Firefox's search, makes looking for a topic a lot easier.


To the point! What, if anything, does this have to do with Hyper-Threading? I mean, do you know if HT exhibits the same behavior right now? I know that multiple processors don't actually mean you are going faster, just that you have more throughput, and that the only real speed increases would be seen with division of labor, but I was just wondering if there was (somehow) a difference with HT.


Also, just to drop something in the basket: if you are thinking about a follow-up, could you also explore this and the effects of HT on a lower-end processor? (I am aware that HT does not truly give you two full cores.)
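For what it's worth, here is a minimal sketch of the "division of labor" case in question - splitting one job across two threads - which a second core speeds up fully and HT only partially, since the two logical CPUs share one core's execution units. The sum-of-an-array job and the function name are made up purely for illustration:

```cpp
#include <cstddef>
#include <cstdio>
#include <functional>
#include <numeric>
#include <thread>
#include <vector>

// Sum part of a vector; two of these on separate threads is the simple
// "division of labor" case that a second core (or, partially, HT) helps with.
void sum_range(const std::vector<double>& v, std::size_t begin, std::size_t end, double* out) {
    *out = std::accumulate(v.begin() + begin, v.begin() + end, 0.0);
}

int main() {
    std::vector<double> data(10000000, 1.0);
    double first_half = 0.0, second_half = 0.0;

    // Each thread works on its own half of the data.
    std::thread t1(sum_range, std::cref(data), 0u, data.size() / 2, &first_half);
    std::thread t2(sum_range, std::cref(data), data.size() / 2, data.size(), &second_half);
    t1.join();
    t2.join();

    std::printf("total = %.0f\n", first_half + second_half);
    return 0;
}
```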
 
I just read the article today. This is one of the better articles I've seen comparing single and dual core, but it lacks what I would call real world. I sell, build, and repair computers for a living. In all these years I have never seen customers' computers that don't have things like anti-virus, MSN Messenger, anti-spyware, popup killers, and other CPU-robbing programs running in the background. If you want to do real-world testing, I think you need to have those types of programs running. I think in that environment the dual core should be a lot faster.

The articles that I see are always done on fresh installs, with the most current updates and drivers. It's just not real world.

Ron :(
 
Yes, this seems to confirm my observations on GPU/CPU use with games. :cool:

For the time being, at least, a powerful single-core CPU with a high-end video card still has a lot of life in it.

A single-core overclocked Opty is a very good path when combined with a powerful GPU.

The time of the DCs, otherwise known as X2s, will come, but there is going to have to be a lot of work done on drivers and the OS to get that right, and it may take a while. :eek:
Overall a good read and useful info.
 
I have to sit myself firmly in the dual-core camp. Although I do know a person or two who think a PC is a gaming machine with surfing, email, and IM on the side, for most of us it is far more than that. I recently upgraded my main PC from a P4 Northwood 3.2 @ 3.6 to an Opteron 165 @ 2.4; the upgrade meant my HTPC got a "sideways" grade from an A64 3200+ in an Asus K8V to the P4 3.2 in an Asus P4C800-E Deluxe, and as far as I am concerned, the P4 3.2 is nearly night-and-day better at everything but gaming than the single-core A64. Kyle has always been one of the very few to point out that there is an advantage to Hyperthreading for much real-world use, and my experience with a 3200+ and a P4 3.2 certainly bears that out. I would recommend that anyone but a truly obsessed gamer go for an X2 3800+ or an Opteron 165 (for the overclocker), or even a fast P4 of some sort, rather than saving a few bucks by getting a single-core A64.

I would compare the P4 3.2 Northwood to the Opteron 165, but the ease of a 600MHz overclock, and knowing the Opteron has a lot more left in it (my ASRock 939Dual-SATA2 has a 274 limit and I am running at 267 with the RAM at 1:1 for now), has me in such hardware lust that I can't really comment on the thing without getting silly. :D

Great Article!
 
Good read...interesting results. Seems to me that, like others, I have a good enough CPU for today's games and video cards, but am limited by the fact that my mobo does not support PCI-E. I am getting a stable 4GHz on my Prescott/IC7 combo, but I am pretty much limited to AGP and old video cards.

Is there enough bandwidth (theoretically) on the AGP 8x bus for these new video cards? If so, that almost suggests a conspiracy, as I am sure there are many people who would buy new video cards in the AGP 'flavour' if they only could - enough people to make offering AGP alternatives profitable, anyway. I am interested in comments on this...maybe I am missing something.
 
Herbus said:
<snip>

Is there enough bandwidth (theoretically) on the AGP 8x bus for these new video cards? If so, that almost suggests a conspiracy as I am sure there are many people who would buy new video cards in the AGP 'flavour', if they only could. Enough people to make offering AGP alternatives profitable anyway. I am interested in comments on this...maybe I am missing something.

No, you are not missing anything. I know that at least up to the 7800, and probably for higher GPUs, there is enough bandwidth in AGP 8x, but I too feel there is a conspiracy. Personally, I believe it is big companies trying to make you believe you need two GPUs and DCs to play a game at 1600x1200, but my question is: how many people game at that resolution? With all the 17" and 19" LCD panels out there, 1280x1024 should be a tested resolution. And what do we get when all of us AGP users don't need to upgrade because our CPUs are still going just fine? That neutered 7800 GS for ~$300 :(
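For reference, the rough theoretical peak numbers behind that bandwidth question (standard published spec figures, not measurements from the article):

```latex
% Theoretical peak bus bandwidth (published spec figures, not measured here):
\text{AGP 8x:}\quad 66.6\,\text{MHz} \times 8\ \text{transfers/clock} \times 4\ \text{bytes} \approx 2.1\ \text{GB/s}
\text{PCIe x16 (gen 1):}\quad 16\ \text{lanes} \times 250\ \text{MB/s} = 4\ \text{GB/s per direction}
```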

IMO, DCs are excellent but are not, and were not, designed for gaming - again, that's marketing, and it has obviously paid off. A DC will crush an SC in any SMP/SMT application - think audio/video rendering, where true CPU horsepower is needed.

Quite a few of my friends are rebuilding their rigs - DCs and SLI - and they can't understand why I run an SC (Opty [email protected]) along with an AGP card (X800 XT PE). I tell them the reality, but they don't want to listen and just believe the hype. Glad you guys proved everything I've said over the years correct: that ~2.0GHz is what is needed for AMD, and 3.0GHz for Intel. I know that benchmarking takes a lot of time, but why not include a single AMD SC, even something like a 3200+ Venice or a 3700+ San Diego?
 
Since Kyle and I talked about this a couple of weeks ago - in the sense that I said "Gee, I'd like to see. . " and he said "stay tuned" - I thought it incumbent on me to drop by and say thanks for making it happen.

I hope the obvious appreciation here from your regulars will encourage you to revisit the issue periodically. Once a year sounds about right to me, though possibly not to Kyle and Brent! :)

A side note on dual core: the recent MS GDC presentations included one on programming for dual core, and interestingly enough, one of the presenters was an XB360 guy. It seems clear that the next-gen consoles are going to help here significantly as time goes by. I found it hopeful that the XB guy identified "file decompression" as the #1 common heavy-CPU use on the XB360 - that seems like at least one easy win for dual core as devs take it up, sidestepping the well-documented thorny issues of implementing dual core without mucking up gameplay.
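As a minimal sketch of that kind of easy win - my own illustration, with a made-up decompress_chunk() standing in for whatever real decompressor an engine would actually use - the main thread keeps pumping frames while a worker thread chews on the file on the second core:

```cpp
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>
#include <vector>

// Hypothetical stand-in for decompressing a streamed asset; a real engine
// would call into zlib or a similar library here.
std::vector<char> decompress_chunk(int chunk_id) {
    std::this_thread::sleep_for(std::chrono::milliseconds(50)); // simulate CPU-heavy work
    return std::vector<char>(1024, static_cast<char>(chunk_id));
}

int main() {
    // Hand the decompression job to a worker thread (the second core, if present).
    std::future<std::vector<char>> pending =
        std::async(std::launch::async, decompress_chunk, 42);

    // Meanwhile the main thread keeps rendering frames.
    for (int frame = 0; frame < 3; ++frame) {
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // pretend this is one 60 Hz frame
        std::printf("frame %d rendered while the asset decompresses\n", frame);
    }

    std::vector<char> asset = pending.get(); // collect the result once it is ready
    std::printf("asset of %zu bytes is ready\n", asset.size());
    return 0;
}
```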
 
Stemcellio said:
I have to sit myself firmly in the dual core camp, although I do know a person or two who think a PC is a gaming machine with surfing, Email and IM, for most of us it is far more than that.
I used to think this way. Then I found the joys of the HTPC! :cool:

<snip>

Stemcellio said:
I would recommend to anyone but a truly obsessed gamer to go for an X2 3800+ or an Opteron 165 (for the overclocker) or even a fast something P4 rather than saving a few bucks by getting a single core A64.
I'm thinking Opty 165. Should make quite the difference, seeing as my current rig is an XP2500+. :D
 
P4 3.0E Prescott (Socket 478) / BFG 6800 GS AGP 8x 256MB
1GB DDR400 (PC3200) Mushkin Enhanced RAM

Do I really need to overclock this to 3.2GHz? Is there a bottleneck with a difference of 200MHz, or does that only matter in relation to a faster GPU?
 
Great article

Great site

thanks

I have two AMD64 3200+ processors, one in Socket 754 and one in Socket 939. Guess I will keep them for another year or two.
 
Wow...how did this write-up slip under the radar? This is a great article, and I am glad Kyle referenced it in the Conroe discussion.

I also think it makes a difference if you use an operating system made for multiprocessors. I saw an improvement on my dual system moving from Windows XP to Windows Server 2003.

Talk about future-proof: I can still game on my 5-year-plus system (pathetically, but I can).

Thanks, guys.
 