metallicafan said:
Maybe the best post out of the 900 or so posts in this thread. When those of you complaining in this thread realize that Conroe is indeed faster in many applications except our highly graphical GPU-bound games, the quicker this issue can be put to rest.

I don't see many people denying that Conroe is faster in most applications outside of GPU-bound games. That's not the crux of the debate, as far as I can tell. The real question is why this "real-world" approach was chosen for the Conroe review (which, by the way, is consistent with his other reviews, video cards for instance), while for the AM2 review a couple of months ago he scaled the resolution down to 800x600 to eliminate the GPU as a factor. I'm not gonna jump the gun and say he's trying to put AMD processors in a better light like some will, but it's not exactly consistent, now is it? I see that he's planning to write an editorial to clear the air, and I hope he addresses this particular question: why do real-world now, and not two months ago for the AM2 as well? I'm eager to read his explanation, and I hope it's good.
Quote:
Oftentimes I see people complain that others put too much stock in synthetic benchmarks, and that 3DMark means nothing about in-game performance. It's more or less the same thing here. Kyle could have done what every other review out there did and shown what most people already believed to be true: that Conroe is indeed a faster CPU. But he chose to give the truth, which is that at the resolutions and graphical settings most of us play at, there aren't a lot of gaming benefits to buying a $1,000, top-of-the-line Conroe.

The [H] benchmarks certainly offer a different perspective on the performance of various computer parts, and that's a good thing. Taken on its own, the review is fine. But compared with the methodology used in the AM2 review a couple of months ago, it looks a bit strange. You can't deny that it looks just a little iffy. Once again, I'm not implying any deliberate wrongdoing, but you can see why it raises questions.
Quote:
Bottom line: measuring CPU performance in a game at 800x600 resolution is more or less a synthetic benchmark, like 3DMark. It sure can't be called a "real-world" benchmark like so many people have asked for.

Eh? Are people calling 800x600 a real-world benchmark? I certainly wouldn't.
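For what it's worth, here's why dropping to 800x600 isolates the CPU. As a rough sketch (a toy model with made-up numbers, not anyone's actual benchmark data), assume each frame takes roughly as long as the slower of the two processors, and that only the GPU's share grows with resolution:

```python
# Toy model of a game benchmark: frame time is limited by whichever of
# the CPU and GPU takes longer to finish its share of a frame.
def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU work and GPU work overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs: a faster CPU needs 4 ms, a slower one 6 ms.
# GPU cost scales with resolution (3 ms at 800x600, 12 ms at 1600x1200).
for res, gpu_ms in [("800x600", 3.0), ("1600x1200", 12.0)]:
    fast_cpu = fps(4.0, gpu_ms)
    slow_cpu = fps(6.0, gpu_ms)
    print(res, round(fast_cpu, 1), "vs", round(slow_cpu, 1), "fps")
```

Under these assumed numbers, the two CPUs look very different at 800x600 (250 vs ~167 fps, since the CPU is the bottleneck), and identical at 1600x1200 (the 12 ms GPU cost caps both). That's the whole argument in miniature: low-res runs expose CPU differences, high-res runs hide them behind the GPU.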
All I want to see is consistency. If real-world testing is how [H] does things, then do it every time. If I want reviews that test parts at identical rendering settings, I can go somewhere else; if I want "real-world" results, I come here. Just don't mix and match; it only makes [H] look bad.