Intel Core 2 Gaming Performance

I have been a loyal [H] reader since 1999. I have had this site as my home page since 2000. I just wanted to put that information out front.

This article was highly insulting to those of us who have been computer enthusiasts for a long time. Did you really think we wouldn't realize that the system is GPU bound at high resolutions and high AA/AF settings??

You performed a video card review on a new CPU! Not surprisingly, we learned almost nothing about how this new CPU performs. The most insulting part is that this came from a site and reviewer that I have trusted for years.

I have defended your method for reviewing video cards and the "real-world" performance of those cards. But using that method to evaluate a CPU was idiotic, especially when you look at this quote from your recent review of the AM2 CPUs:

Gaming Benchmarks

As always, these benchmarks in no way represent real-world gameplay. They are all run at very low resolutions to try our best to remove the video card as a bottleneck.

This begs the question, why in the hell would you change your method of CPU testing now??

I'm sorry. I have never criticized the [H] or Kyle before today, but this was ridiculous. :(
 
Dosomo said:
This begs the question, why in the hell would you change your method of CPU testing now??
I was wondering the same thing myself. Why did you decide to change now?

By saying other benchmarks lied, aren't you saying that you lied, since you did the same benchmarks on AM2 only a few months ago?

Why, during your AM2 review, did you not use the high-res benchmarks, and therefore conclude that P4 and AM2 are equal?

Why did you not include a P4D in your testing this time? It would have gotten the same performance from the games, so you could have concluded that everyone should buy a P4D, since they will soon be among the cheapest processors available.

Personally, I think both types of testing (low-res or high-res) can make sense, but only if the person testing stays within the limits of the method he chooses.
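To make that concrete, here's a rough sketch in Python (made-up FPS numbers, nothing from the actual review) of the sanity check either method implies: if the frame rate barely moves as resolution climbs, the CPU is the limit and the numbers tell you about the CPU; if it falls off a cliff, the GPU is the wall and every CPU will score about the same.

[code]
# bottleneck_check.py - rough sketch; resolutions and FPS figures are invented.
def gpu_bound(fps_by_res, tolerance=0.10):
    """True if FPS falls more than `tolerance` from lowest to highest res."""
    fps = [fps_by_res[r] for r in sorted(fps_by_res, key=lambda r: int(r.split("x")[0]))]
    return (fps[0] - fps[-1]) / fps[0] > tolerance

cpu_run = {"640x480": 180.0, "1024x768": 176.0, "1600x1200": 172.0}
gpu_run = {"640x480": 180.0, "1024x768": 120.0, "1600x1200": 65.0}
print(gpu_bound(cpu_run))  # False -> CPU-limited, differences reflect the CPU
print(gpu_bound(gpu_run))  # True  -> GPU-limited, any fast CPU looks the same
[/code]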
 
Yraen said:
More troublesome is the fact that the review kit shipped with a Bad Axe motherboard, yet you chose to use the Asus. Yes, the Asus has the newer chipset, but it's not the chipset of choice for high-end gaming, simply because it does not support CrossFire. The Bad Axe has been shown to support the 7950 you wanted to use.


Wonder what the results with that board showed? Didn't they say they tested for 45 days? I would guess that board was at least tried out! :confused:
 
Chrissicom said:
AMD is dead! :p An Athlon FX-62 costs 1000 EUR in Germany, compared to the 534 EUR a 5700 C2D costs. And then someone mentioned an X2 3800+ for 170 EUR being good... well, the smallest C2D costs the same and will definitely show MUCH better performance in real-world applications, and in games where the graphics card is the limit it will at least show equal performance. I would never, ever buy any AMD CPU until they have reduced their prices by at least 75%, on the FX-62 for example.

At stock clocks the Core 2s are just as fast in normal gaming at the resolutions most of us play at. If by 'real world' you mean Super Pi, sure it's faster, but I'm not into seeing how fast I can compute pi to 1 million places.

Or if by 'real world' you mean word processing and web browsing, then I don't think I could tell which CPU I was using if it was over 2GHz, Celerons included. >.>

Now, yes, the Core 2s are a bit faster, 3-5 FPS and up to 10 or so in some cases. And yes, you can OC the hell out of the Core 2s, so if you OCed one to 4GHz+ it may be worth having, and I may think of getting one then. But I want to see if the retail silicon can do what the ES chips can first.
 
Tazzman said:
Wonder what the results with that board showed? Didn't they say they tested for 45 days? I would guess that board was at least tried out! :confused:

Intel cut their testing time so it was a lot less than 45 days.

Then again, they also cut the testing time of all the other sites too...
 
HOCP4ME said:
Intel cut their testing time so it was a lot less than 45 days.

Then again, they also cut the testing time of all the other sites too...

Somewhere in this mess, or in the review, I saw Kyle state it was 45 days. If not, oh well... I quoted about the Bad Axe board mainly, not the time.
 
Elios said:
At stock clocks the Core 2s are just as fast in normal gaming at the resolutions most of us play at. If by 'real world' you mean Super Pi, sure it's faster, but I'm not into seeing how fast I can compute pi to 1 million places.

Or if by 'real world' you mean word processing and web browsing, then I don't think I could tell which CPU I was using if it was over 2GHz, Celerons included. >.>

Now, yes, the Core 2s are a bit faster, 3-5 FPS and up to 10 or so in some cases. And yes, you can OC the hell out of the Core 2s, so if you OCed one to 4GHz+ it may be worth having, and I may think of getting one then. But I want to see if the retail silicon can do what the ES chips can first.

In the real world, many more folks play at a low res like 1024x768, and the average video card is not an X1900 XT or a 7900 GTX. Most of us do more than game. If it were just for games, I'd just upgrade my 3500 and go with SLI!
 
Elios said:
At stock clocks the Core 2s are just as fast in normal gaming at the resolutions most of us play at. If by 'real world' you mean Super Pi, sure it's faster, but I'm not into seeing how fast I can compute pi to 1 million places.

Or if by 'real world' you mean word processing and web browsing, then I don't think I could tell which CPU I was using if it was over 2GHz, Celerons included. >.>

Now, yes, the Core 2s are a bit faster, 3-5 FPS and up to 10 or so in some cases. And yes, you can OC the hell out of the Core 2s, so if you OCed one to 4GHz+ it may be worth having, and I may think of getting one then. But I want to see if the retail silicon can do what the ES chips can first.

Gaming is only a small segment of what people use PCs for. A lot of people run their entire business from a PC, and much of that work involves productivity software where, in almost every case I've seen so far, Conroe will have the performance advantage and consume less power. Those types of applications take a lot more CPU power than a measly word processor or web browser. Another large percentage of PC users do audio and video editing work. Time is money in the business world; not everyone just sits around playing PC games all day.

And even with gaming, Conroe is still the faster processor, no questions asked. The reason there is only a small difference in fps at higher resolutions is that current video cards aren't capable of keeping pace with the fastest processors today. In SLI/CrossFire benchmarks Conroe has a larger lead, and when DX10 cards are released Conroe will have an even greater one. Games are not programmed to unload all the work onto the processor; it is the graphics card that does most of the heavy lifting, so buying a CPU based purely on gaming performance is usually pointless. You're better off buying a lower-end processor and the most expensive graphics cards you can afford if all you care about is fps.

And the retail chips will do everything that the ES chips have done, because there is no difference between them beyond a newer stepping. At stock speeds the E6600 (2.4GHz) is faster than anything AMD currently has to offer, so even if you didn't overclock it, it would still be the better processor. At 4GHz it would be on a whole different level from the Opteron/FX/A64 line of processors, one in which they would not be competing, BTW.
 
Donnie27 said:
In the real world, many more folks play at a low res like 1024x768, and the average video card is not an X1900 XT or a 7900 GTX. Most of us do more than game. If it were just for games, I'd just upgrade my 3500 and go with SLI!

Exactly. How many real-world people would play F.E.A.R., Far Cry, Oblivion, or Doom 3 at 1600x1200 without turning the eye-candy options way down when those games had just been released?
 
zumzum said:
Exactly. How many real-world people would play F.E.A.R., Oblivion, or Doom 3 at 1600x1200 without turning the eye-candy options way down when those games had just been released?

Tons of them do.
 
burningrave101 said:
Gaming is only a small segment of what people use PCs for. A lot of people run their entire business from a PC, and much of that work involves productivity software where, in almost every case I've seen so far, Conroe will be much faster and consume less power. Those types of applications take a lot more CPU power than a measly word processor and web browser. Another large percentage of PC users do audio and video editing work; encoding is faster on Conroe. Time is money in the business world; not everyone just sits around playing PC games.

And even with gaming, Conroe is the faster processor here, no questions asked. The reason there is only a small difference in fps at higher resolutions is that the video card used wasn't capable, not that Conroe wasn't capable of pulling further ahead. In SLI/CrossFire benchmarks Conroe has a larger lead, and when DX10 cards are released Conroe will have an even greater one. Games are not programmed to unload all the work onto the processor; it is the graphics card that does most of the heavy lifting, so buying a CPU based purely on gaming performance is usually pointless. You're better off buying a lower-end processor and the most expensive graphics cards you can afford if all you care about is fps.

And the retail chips will do everything that the ES chips have done, because there is no difference between them beyond a newer stepping. At stock speeds the E6600 (2.4GHz) is faster than anything AMD currently has to offer, so even if you didn't overclock it, it would still be the better processor. At 4GHz it would be on a whole different level from the Opteron/FX/A64 line of processors, one in which they would not be competing, BTW.

Those people aren't going to be reading a review on a site like this; they're going to go to something like C|Net or maybe AnandTech. Most likely they're just going to read a review in an industry mag and go with whatever it says, if they haven't already switched to AMD. And if they haven't gone AMD in the last two years, they're just going to stay with Intel.

As for Joe Sixpack, well, the numbers are clear on that: he's just going to see that it's Intel and buy blindly. Intel is already back on top in retail sales.
 
"
Tons of them do"

can you back that thought up with some meaningful, statistically relevant data? :)

---
I also remember some games (just not which ones now) were released that could not be run at their full settings until future hardware came out. "Future-proofing" :)

I think it was the Unreal people, or was it id? I definitely could be wrong on this; it was a few years ago that I remember reading it. Not sure if it became a trend or not.

More power? Bring it on, I say :)

I know of a person who loves to play AND encode video at the same time. hehe :)
 
Elios said:
Those people aren't going to be reading a review on a site like this; they're going to go to something like C|Net or maybe AnandTech. Most likely they're just going to read a review in an industry mag and go with whatever it says, if they haven't already switched to AMD. And if they haven't gone AMD in the last two years, they're just going to stay with Intel.

As for Joe Sixpack, well, the numbers are clear on that: he's just going to see that it's Intel and buy blindly. Intel is already back on top in retail sales.

Those types of people are reading sites like HardOCP right now. I'm one of them, and there are thousands of others on forums across the net. We're not all teenagers playing video games all the time. I play PC games, but one of my business fields is the IT industry, which revolves around the productivity of computers, and I also like to work with audio and video content. One of the primary games I actually play is Oblivion, which is a rather CPU-intensive game because of its advanced AI, and with a faster video card like the X1900 XTX a fast processor does make a few fps of difference, and a few fps is everything when it comes to playability in a game like Oblivion. If a person is satisfied with the current performance of their PC, then there is no reason to upgrade now, no matter how fast the new hardware is.
 
Two years from now, when there are GPUs on the market that are 2-3 times as powerful as today's, are today's games still going to be GPU limited? No. Will a ~30% performance difference between the two current top-end offerings have an impact on gaming at that point? Yes. Say you've bought into the hype and think you might need an upgrade: is Kyle's "CPU review" useful in determining whether you need a new CPU today? Yes (you probably don't). But say you've already decided to upgrade: is Kyle's review useful in determining which CPU you want to buy? Not really.

I can't wait for [H]'s actual Core 2 CPU review.
 
Wow.

So much has been said in this thread, and it is easy to appreciate both sides of the argument.

From my perspective, I believe that the article did not quite show us what Core 2 is capable of. Yes, I understand that there were issues regarding the chipset used, as well as issues with getting SLI to work.

Does this represent "real world" gaming? Perhaps so. Most people have video cards less powerful than or equal to a 7900 GTX, so in that respect the article did show people what to expect, and it certainly enabled me to evaluate possible upgrades in a more educated light.

Nevertheless, in the "real world", the Core 2 is also capable of gaming performance that humbles the Athlon FX, given the right components. That is a fact. It's reality. That is "the real world", but due to limitations in the benchmark methodology and equipment, that aspect was not shown or explained properly. The article showed a performance metric that is only valid in the context of the video card and resolutions used, and in that sense, it wasn't a CPU benchmark at all.

I believe that [H]|OCP, in the attempt to avoid a "canned" benchmark, delivered exactly that: a canned benchmark result set with predictable numbers. If that could not be avoided, then I think the responsibility was to wait until appropriate equipment could be used to show what Core 2 is capable of. But the pressures of delivering new content when the NDA expired effectively turned [H]|OCP's article into an also-ran, which did not give readers a complete picture.
 
I think everybody understands these points, and in future articles Kyle will probably spell it out in kindergarten terms. The [H] review methodology does have value; it's just different from other sites'. As I see it, there are three major problems with this particular review:

1) Intel pushed up their NDA release, so he didn't have time to test CrossFire or SLI, giving an incomplete picture of performance scaling.

2) If you don't read the review carefully, you may come away thinking that Core 2 is no big deal. But of course it is. He needs to put it in those kindergarten terms, not because the text is deceptive (maliciously or otherwise), but because it isn't really fit for its consumers on the web, skimming and clicking past each page.

3) Kyle doesn't take criticism well, even valid criticism, and is brutally unapologetic. Of course, it is his site.
 
4) He changed his CPU review method starting with Conroe.

5) He advises overclocking the competition to make up the difference, without ever mentioning how Conroe can easily get a 1GHz overclock on air, furthering its performance as well.

6) He also advises holding off for price cuts. ATi and NVIDIA do it all the time, as do Intel and AMD throughout a CPU's life cycle; you'll be forever waiting on each other's price cuts. Since the article was aimed at so-called "enthusiasts", they really don't need to hear about price cuts. Price cuts matter more to the "real-world" gamer who doesn't have the latest and greatest hardware and is looking for the best bang per buck.


Intel shortening its NDA schedule looks to be an excuse. Why is [H] making it such a big deal when every other site had the same scenario?
That's not an excuse to turn a CPU review into a GPU review.

Are we to expect the next-gen GPUs to be reviewed at 2560x1600, with the conclusion that they are CPU bound because of their low FPS numbers?
 
I think Kyle did this just to get a lot of n00bs to register and play cry-babies... and then make some $$$ off them :p
And so far it looks like it is working, clever thinking ;)

Terra - I have NO respect for people that CLEARLY didn't read the review... :rolleyes:
 
You mean Freudian slip.

This thread has deteriorated into a flame-fest.
 
Tzzird said:
You mean Freudian slip.

This thread has deteriorated into a flame-fest.

Threads about new hardware often turn out that way, yes.

Terra...
 
Speaking of new hardware (and because I'm too damn lazy), I figure this will get some views because everyone is coming back to read who quoted them last to flame their post, so I guess I should have just quoted someone to get people looking... maybe I'll ask again later that way if no one addresses this...

Anywho, does anyone know what the first Core chipset to support SLI will even be? Or will there be CrossFire boards available first?

And hell, if there were CrossFire boards available first, would you even buy one?
 
thedude42 said:
Speaking of new hardware (and because I'm too damn lazy), I figure this will get some views because everyone is coming back to read who quoted them last to flame their post, so I guess I should have just quoted someone to get people looking... maybe I'll ask again later that way if no one addresses this...

Anywho, does anyone know what the first Core chipset to support SLI will even be? Or will there be CrossFire boards available first?

And hell, if there were CrossFire boards available first, would you even buy one?

A CrossFire board is already available in the form of the Intel 975 chipset. Boards from ATi, however, will come later... I hope. I'm waiting for the ATi ones.
 
Terra said:
Threads about new hardware often turn out that way, yes.

Terra...

That's rather interesting considering your troll-like behaviour with the noobie statement. Put a sock in it, pal, or stay on topic. You're not impressing anybody.
 
FiestaMan said:
That's rather interesting considering your troll-like behaviour with the noobie statement. Put a sock in it, pal, or stay on topic. You're not impressing anybody.

Ditto :)

Terra...
 
I have this nagging feeling that the selection of benchmarks, as well as the resolution and quality settings, was carefully designed to reach the foregone conclusion that Conroe isn't much better for gaming than the X2. Anyway, I'm not gonna repeat all the arguments that have been made before, but I feel that calling other review sites "liars" and Intel's "cronies" because they chose to test the CPU instead of the 7900 GTX was out of line. I hope that throwing whatever was left of his integrity out the window generated all the page impressions and ad-banner clicks Kyle had hoped for.

Anyway, if [H] is serious about evaluating Conroe gaming performance, I would suggest a follow-up article that evaluates Conroe vs. X2 performance in CPU-limited scenarios. For example:

1. A flight sim. Not the most popular genre, but they are notoriously CPU bound.

2. Turn-based strategy games. Set up some nice late-game saves on huge maps in Civ 4 and Galactic Civilizations 2 and use a stopwatch to determine how long it takes the CPU to calculate the next turn (a rough sketch of the stopwatch idea follows this list).

3. Real-time strategy games. The RoL benchmarks at AnandTech show a 50%(!) advantage for Conroe over the X2. How about benchmarking a nice Rome: Total War save involving two large armies with large unit sizes?


Let's forget the high-resolution and AA stuff for a second: wouldn't it make sense to benchmark a CPU with games that actually need a lot of CPU power?
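On point 2, you don't even need special tools. Here's a rough sketch in Python (purely illustrative, my own invention, not anything a review site actually uses) of the stopwatch idea; run several turns so your own reaction-time noise averages out:

[code]
# turn_timer.py - rough sketch of the "stopwatch" idea for turn-based games.
# Press Enter when you click End Turn, Enter again when control returns;
# repeat for several turns, then Ctrl+C to see the summary.
import time
import statistics

times = []
print("Enter to start timing a turn, Enter again to stop (Ctrl+C to finish).")
try:
    while True:
        input("start> ")
        t0 = time.perf_counter()
        input("stop>  ")
        times.append(time.perf_counter() - t0)
        print("turn took %.2f s" % times[-1])
except KeyboardInterrupt:
    if times:
        print("\n%d turns: mean %.2f s, min %.2f s, max %.2f s"
              % (len(times), statistics.mean(times), min(times), max(times)))
[/code]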
 
thedude42 said:
This quote from the article says it all:

"Having more CPU power is a very cool thing, but being able to utilize it is not an easy thing to do nowadays."
Yeah, it's a hard thing to do if you cherry-pick your benchmarks to not be CPU limited. There are more games out there than these five, and some of them are CPU limited, at least in certain situations.
 
RedStarSQD said:
Thus, I'm sure you meant real-world performance for the majority of users. Well, guess what: the majority of users have an NVIDIA 6600 GT video card. How does Conroe help them!?

Primary resolutions used: 1280x1024 and 1024x768.

Indeed, I would think that less than 2% of gamers have the video card used in your review :)
That's of course assuming that they go out and buy a brand-new CPU, mobo, and RAM, and don't upgrade their video card.

It seems that video cards retailing in the $150-250 price bracket are popular, so you may have a bit of a point... in that case, benchmarking at 1280x1024 and 1024x768 with a 7600 GT or an X1800 XT, with no AA and no AF, would have been closer to the gaming experience of the "average" user who might buy a new computer today. But that's not what [H] tested.
 
RedStarSQD said:
"
Tons of them do"

can you back that thought up with some meaningful, statistically relevant data? :)

---
i also remember some games(just not which ones now) were released that could not be run at their full settings until future hardware came out. "Futureproofing" :)

i think the unreal people or was it id? I definately could be wrong on this. It was a few years ago i remember reading . Not sure if it became a trend or not.

more power? bring it on i say :)

I know of a person who loves to play AND encode video at the same time. hehe :)

Yes I can back that up. Go to Steam via Valve and see the most common setup and resolution settings from the 2005 Game of the Year, Half-Life 2. By a large margin the most common setting is 1024x768. Polls on ZNet, GamePC, GameStop, SimHQ and others say the same thing.

So we buy a game and search for settings, going either high or low, adding or removing eye candy via resolution, AF, and AA.

Video card prices have increased to the point of silliness and many folks aren't going for it. Would I have enjoyed Half-Life 2 better at 1600x1200 rather than 1024x768 on an X800 XL? Hell yes! Even though I got my X800 XL on a Black Friday sale from CompUSA for $199, while it regularly sold for $299 to $339, I still feel it should have sold for $199 from the start LOL!

Point being, I'm NOT going to max out my system's GPU, and neither will many others.
 
AndyHill said:
Jumping into a conversation may be a bit rude, but I think a little constructive criticism may be in order.

First of all, I think your approach is justified in a way; it's good to show people what kind of effect new technology actually has on their gaming experience. There's a problem, however: you left out hundreds of games and tested only a few. Let's go through a couple of examples from the SimHQ.com Conroe review. I chose them because of the games they tested.

Scroll to the bottom of this page: http://www.simhq.com/_technology2/technology_090d.html for some Pacific Fighters tests (I won't blame you if you peek at the Falcon 4 tests on the way). Wow. Now that's what I call a landslide victory. You can see how at bigger resolutions the results approach each other, but there's another issue at play. Simulators are usually heavily CPU-bound, and they cut a lot of corners to achieve playable framerates. In fact, although I can't be sure, I'm pretty certain that the tests were run in an environment with relatively few units around. Running a dynamic campaign mission with hundreds of units on the ground and in the air MIGHT make 1600x1200 look like 640x480 does in favor of Conroe. Without seeing the CPU-limited scenario at 640x480 we wouldn't even know of this possibility.

The next page has another interesting example: http://www.simhq.com/_technology2/technology_090e.html ; note the Lock On: Flaming Cliffs scores. So much for modern games being GPU limited? Again, these scores are not from an environment with hundreds of units crawling around, AFAIK.

So what if a couple of simulators are a bit different from the norm; no one even plays them, right? The fact is that we don't know how most games behave. There might be some that absolutely thrive on a Conroe and some that don't. The biggest problem with the Conroe review is that you take a couple of games and a specific set of settings and try to extrapolate further than the data permits. It's even arguable that your chosen (high-resolution) settings might actually be far from the norm nowadays. Not to mention that you only tested a couple of games, and although they're popular ones, they're not the only ones around. That's why doing reviews like this isn't that useful.

The point of testing the CPU under different circumstances, especially under heavy CPU stress, is to give people a chance to figure out how the CPUs might work in other relatively similar situations. It's impossible to test all the games under all the settings, so you can never claim to know how every game works on everyone's system. Predicting the future is difficult only if you need to be precise; you don't need a crystal ball to tell that future games will require more from the CPU and GPU than modern games do. I can also tell right away that some games will be more CPU-limited while others are really demanding on the GPU; the worst cases will be demanding on both. If you're looking for a CPU that'll last longer, you need to know which one has the most power (per $) in gaming situations, and as far as that question goes, the review fails to answer it. Even if you only intend to answer the question "what will this CPU do for gamers right now", you have failed, because you only tested a couple of games under very specific circumstances. People who play other games or don't use your settings have little to gain from this review.

This'll probably just get buried in this monster thread, but at least I got to vent a little. I think this is a quality review site, and I had to voice my concerns in case they actually help the discussion or maybe even improve things for someone.

WOW! Great post!
 
"Yes I can back that up, go to Steam via Valve and see the most common setup and resolution settings from the Game of the year from 2005 Half Life 2? By a large margin the most common setting is 1024 X 768"

Yes i know..as i have already stated... the tons quote was from a guy saying that tons of people used 1600 res.
 
Wow. Can't say people aren't passionate about the hardware! It's like a civil war in here, kids, LOOK OUT! I had to stop halfway in; I didn't see anything really productive up to that point. But thanks for giving me something to compare a future upgrade against, even if these aren't the only benchies I will look at, just like with every other upgrade I have done.
You can always have an opinion, folks, but the ones who can eloquently convey theirs are just so much nicer. Just because you don't like the answer doesn't mean the wrong questions were asked. Stop the madness.
 
I don't know if the 1024x768 argument is a valid one on HardOCP. The majority of readers of this site are into spending top dollar on the fastest parts; I doubt any of them would prefer to play at 1024x768. I think a review testing the higher resolutions works for the typical [H] reader.

An "average" computer user is going to go with whatever CPU/GPU that Dell/HP/Gateway puts in the box.
 
enelson125 said:
I don't know if the 1024x768 argument is a valid one on HardOCP. The majority of readers of this site are into spending top dollar on the fastest parts; I doubt any of them would prefer to play at 1024x768. I think a review testing the higher resolutions works for the typical [H] reader.

An "average" computer user is going to go with whatever CPU/GPU that Dell/HP/Gateway puts in the box.

There is nothing wrong with benching at higher resolutions like 1600x1200 if you've got the graphics hardware to handle it. I like to see both lower and higher resolutions, though, so you can compare and tell whether the performance advantage is being held back by the GPU.

AnandTech, Tech Report, FiringSquad, and several of the other review sites chose to benchmark at higher resolutions as well, but in the AnandTech review they had an X1900 XT CrossFire setup, and Conroe showed larger gains in performance at those higher resolutions.

Anandtech said:
Our first test is the "Town" benchmark we used in our Oblivion performance guides. Here the Core 2 Extreme X6800 manages a 26% performance advantage over the FX-62. While the E6600 is still faster than the FX-62, the E6300 loses a few places and finds itself offering performance somewhere in between the X2 4600+ and the 4200+. Keep in mind that our Oblivion tests are hand run using FRAPS so the variance between runs is much higher than normal; differences of up to 5% should be ignored to be on the safe side.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2795&p=16
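That 5% rule is easy to apply mechanically. A quick sketch in Python (FPS figures invented; the second pair just mirrors the ~26% Town gap from the quote) of separating real gaps from run-to-run FRAPS noise:

[code]
# noise_check.py - sketch of the "ignore differences up to 5%" rule quoted above.
def within_noise(fps_a, fps_b, threshold=0.05):
    """True if the two results differ by less than `threshold`, relatively."""
    return abs(fps_a - fps_b) / min(fps_a, fps_b) < threshold

print(within_noise(62.0, 64.5))  # True  -> within run-to-run variance, a tie
print(within_noise(48.0, 60.5))  # False -> a real gap (~26%)
[/code]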
 
"The majority of readers of this site are into spending top dollar on the fastest parts"

Based on what figures? So far we know about 3% of gamers use a 1600 res or above.

I am quite prepared to agree there will be a higher weighting of power users looking at this site... so let's weight it at 7x the norm and give [H] a 21% power-user ratio. And then let's mention that if you are a power user at the supreme end, you probably have SLI as well.

I have no trouble with the article being called "Conroe gaming performance for the high-end users that visit [H]", if that was the target audience of this gaming review :) That, and a couple of other changes, is all that is required :)

Or even just an apology for the "liar" and "Intel cronies" comments :)
 
RedStarSQD said:
"The majority of readers of this site are into spending top dollar on the fastest parts"

Based on what figures? So far we know about 3% of gamers use a 1600 res or above.

I am quite prepared to agree there will be a higher weighting of power users looking at this site... so let's weight it at 7x the norm and give [H] a 21% power-user ratio. And then let's mention that if you are a power user at the supreme end, you probably have SLI as well.

I have no trouble with the article being called "Conroe gaming performance for the high-end users that visit [H]", if that was the target audience of this gaming review :) That, and a couple of other changes, is all that is required :)

Or even just an apology for the "liar" and "Intel cronies" comments :)

I am probably assuming that there are more people here with super high-end rigs than there actually are. I only game at 1280x1024, though I would go higher if my 19" LCD didn't limit me.
 
AndyHill said:
Jumping into a conversation may be a bit rude, but I think a little constructive criticism may be in order.

First of all, I think your approach is justified in a way; it's good to show people what kind of effect new technology actually has on their gaming experience. There's a problem, however: you left out hundreds of games and tested only a few. Let's go through a couple of examples from the SimHQ.com Conroe review. I chose them because of the games they tested.

Scroll to the bottom of this page: http://www.simhq.com/_technology2/technology_090d.html for some Pacific Fighters tests (I won't blame you if you peek at the Falcon 4 tests on the way). Wow. Now that's what I call a landslide victory. You can see how at bigger resolutions the results approach each other, but there's another issue at play. Simulators are usually heavily CPU-bound, and they cut a lot of corners to achieve playable framerates. In fact, although I can't be sure, I'm pretty certain that the tests were run in an environment with relatively few units around. Running a dynamic campaign mission with hundreds of units on the ground and in the air MIGHT make 1600x1200 look like 640x480 does in favor of Conroe. Without seeing the CPU-limited scenario at 640x480 we wouldn't even know of this possibility.

The next page has another interesting example: http://www.simhq.com/_technology2/technology_090e.html ; note the Lock On: Flaming Cliffs scores. So much for modern games being GPU limited? Again, these scores are not from an environment with hundreds of units crawling around, AFAIK.

So what if a couple of simulators are a bit different from the norm; no one even plays them, right? The fact is that we don't know how most games behave. There might be some that absolutely thrive on a Conroe and some that don't. The biggest problem with the Conroe review is that you take a couple of games and a specific set of settings and try to extrapolate further than the data permits. It's even arguable that your chosen (high-resolution) settings might actually be far from the norm nowadays. Not to mention that you only tested a couple of games, and although they're popular ones, they're not the only ones around. That's why doing reviews like this isn't that useful.

The point of testing the CPU under different circumstances, especially under heavy CPU stress, is to give people a chance to figure out how the CPUs might work in other relatively similar situations. It's impossible to test all the games under all the settings, so you can never claim to know how every game works on everyone's system. Predicting the future is difficult only if you need to be precise; you don't need a crystal ball to tell that future games will require more from the CPU and GPU than modern games do. I can also tell right away that some games will be more CPU-limited while others are really demanding on the GPU; the worst cases will be demanding on both. If you're looking for a CPU that'll last longer, you need to know which one has the most power (per $) in gaming situations, and as far as that question goes, the review fails to answer it. Even if you only intend to answer the question "what will this CPU do for gamers right now", you have failed, because you only tested a couple of games under very specific circumstances. People who play other games or don't use your settings have little to gain from this review.

This'll probably just get buried in this monster thread, but at least I got to vent a little. I think this is a quality review site, and I had to voice my concerns in case they actually help the discussion or maybe even improve things for someone.

An excellent post. I hope people keep bringing this up for Kyle to take a look at.
 
AndyHill,

So sims > FPS games in terms of taxing PC gaming systems? How so? Number of polygons plus larger maps? I'd like to know more about this.

I thought games like CSS, FEAR, and your usual round of FPS titles would be the maximum tax on PC systems... but sims pwn them all? Interesting. Any links from anyone giving me more info on this?
 
Kiste said:
Yeah, it's a hard thing to do if you cherry-pick your benchmarks to not be CPU limited.

Oh yeah, good one! The current top MMO, multiplayer FPS, single-player RPG, top racing game, and top tactical shooter... yeah, definitely cherry-picked from an obscure list guaranteed to favor AMD.
 
Elios said:
At stock clocks the Core 2s are just as fast in normal gaming at the resolutions most of us play at. If by 'real world' you mean Super Pi, sure it's faster, but I'm not into seeing how fast I can compute pi to 1 million places.

Or if by 'real world' you mean word processing and web browsing, then I don't think I could tell which CPU I was using if it was over 2GHz, Celerons included. >.>

Now, yes, the Core 2s are a bit faster, 3-5 FPS and up to 10 or so in some cases. And yes, you can OC the hell out of the Core 2s, so if you OCed one to 4GHz+ it may be worth having, and I may think of getting one then. But I want to see if the retail silicon can do what the ES chips can first.

I mean stuff like CAD and video encoding :D
 