Thanks for doing the footwork and getting us more links, UtopiA. It really gives people an idea of how little the CPU matters at high resolution in most games.
Neither should be a problem for gaming provided the Q6600 is overclocked to some degree, but since few games make good use of more than two threads, the E8400 is generally considered the better gaming chip, as it normally clocks higher.
This is precisely why I've been pushing quad-core in general (and the Q6600 in particular) since the SLACR stepping became the generally available part. If anything, it's even more true now given the availability issues with the E8xxx (Wolfdale) chips.
Further, the few game types that still scale meaningfully with CPU frequency (sims and RTS titles) are also among the most likely to take advantage of multiple cores (Supreme Commander is the most obvious example).
The SLACR Q6600 (even with stock cooling) reliably hits 3 GHz on most, if not all, motherboards that support it today. And past 3 GHz, a game that is *not* GPU-bound gains more from additional cores than from further overclocking (even Crysis darn near walls out at 3 GHz in terms of CPU-related performance gains on any multi-core CPU; in fact, the gains from the two extra cores of a Q6600 at that easily attainable 3 GHz outweigh those from a maxed-out air-cooled overclock on any dual-core available to date). So unless it truly is all about the e-penis, at current pricing quad-core (not just the Q6600, but also the Q9300/Q9450) wins on usable bang-for-buck over any dual-core.
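The GPU-wall argument above can be sketched as a toy model: if each frame costs max(CPU time, GPU time), then once the CPU is fast enough, extra clock speed buys nothing. All numbers here are illustrative assumptions, not benchmark data.

```python
# Toy bottleneck model: frame rate is limited by whichever of CPU or GPU
# takes longer per frame. Numbers are made up for illustration only.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frames per second when the slower of CPU/GPU sets the pace."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

GPU_MS = 30.0  # hypothetical GPU-limited frame time (~33 fps ceiling)

for clock_ghz in (2.4, 3.0, 3.6, 4.0):
    # Assume CPU frame time scales inversely with clock, from 50 ms at 2.4 GHz.
    cpu_ms = 50.0 * 2.4 / clock_ghz
    print(f"{clock_ghz} GHz -> {fps(cpu_ms, GPU_MS):.1f} fps")
```

Under these made-up numbers, fps climbs from 2.4 to ~3.6 GHz and then flattens against the GPU ceiling, which is the shape of the argument being made: past the wall, more cores can still help, more megahertz cannot.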
What is it that people aren't getting??? I mean seriously, folks!! Please stop giving ignorant advice. It's been proven over and over that almost all games are GPU-bound once you have approximately 3.2 GHz or better, yet people keep saying the E8400 clocks higher and is better for gaming.
I guess some people just don't get it. If it were me and I had to choose between a Q6600 at 3.6 and an E8400 at 4.0 (knowing that anything over 3.2 GHz won't bottleneck anything), the Q6600 would win hands down. Especially since it just hit $180!
Do some googling if you don't trust what I say; it's covered on almost every site that talks about hardware.
QFT. My Q6600 chews up anything, including Crysis, @ 3.0 GHz. Plus, when I turn back to my desktop to get work done, the quad rips through everything.
Q6600 FTW, especially @ $180.
That old myth about clock-frequency is still in people's heads (despite first AMD, with the Athlon64, then Intel itself with Core/Core 2, making a mockery out of that myth). If that old myth were even close to true, then a P4 Northwood 2.6 would still be ahead of most, if not all, stock Core 2 CPUs (instead, even the latest Core 2 derivative, the infamous Celeron DC E1200, smacks the 2.6C from pillar to post).
Given the utter lack of CPU bottlenecking at 3 GHz (pretty much regardless of what game you play) and that more and more titles are indeed taking advantage of more than two cores (and there's actually an operating system that also takes advantage of more than two cores), why bet on tall overclocks with questionable benefits as opposed to the very real and measurable benefits of two more processor cores?
(Those of you near a local Fry's and/or MicroCenter should be paying even stricter attention, as both chains are running in-store-only sales on the retail Q6600; in neither case is the price more than $200. This is not just quad-core for the price of dual-core; this is quad-core for *less* than dual-core.)
Good grief what an informative post. This post alone has won the argument.
Yes, but since the E8400 @ ~4 GHz gives at minimum a ~3-5 fps boost over a Q6600 @ ~3.6 GHz, it will make a huge difference if you're at a low framerate (20-30 fps, talking about Crysis here). At that framerate every little boost matters, but once you hit around 40 fps the same ~3-5 fps difference really won't be noticeable.
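The point about a fixed fps gain mattering more at low framerates is just relative arithmetic, and it's easy to put numbers on (using the ~3-5 fps figure from the post above; the baselines are illustrative):

```python
# A fixed ~4 fps gain, expressed as a percentage of various baselines.
# The gain is the ~3-5 fps figure cited above; baselines are illustrative.
GAIN_FPS = 4

for base_fps in (20, 25, 30, 40, 60):
    gain_pct = GAIN_FPS / base_fps * 100
    print(f"{base_fps} -> {base_fps + GAIN_FPS} fps: a {gain_pct:.0f}% jump")
```

At 20 fps, 4 extra frames is a 20% improvement you can feel; at 60 fps, the same 4 frames is under 7% and effectively invisible.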
Agreed with everyone who posted so far that the GPU is the main bottleneck for gaming, HOWEVER...
One point that hasn't been brought up is the L2 cache.
The Q6600 has 8 MB of L2 cache, or 2 MB per core. This is good.
The E8400 has 6 MB of L2 cache, or 3 MB per core. This is better.
With the E8400 you get 50% more L2 cache per core, which allows for a smoother experience, gaming or otherwise (each pair of Q6600 cores shares a 4 MB pool, while the E8400's two cores share 6 MB).
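The per-core cache comparison above is worth writing out, since the totals point one way (8 MB vs. 6 MB) and the per-core figures the other. A quick sketch of the arithmetic, using the publicly known cache layouts of the two chips:

```python
# L2 cache arithmetic for the two chips discussed above.
# The Q6600 is two dual-core dies, each pair of cores sharing 4 MB of L2;
# the E8400 is a single die whose two cores share one 6 MB pool.
q6600 = {"cores": 4, "l2_total_mb": 8, "cores_per_pool": 2, "mb_per_pool": 4}
e8400 = {"cores": 2, "l2_total_mb": 6, "cores_per_pool": 2, "mb_per_pool": 6}

for name, cpu in (("Q6600", q6600), ("E8400", e8400)):
    per_core = cpu["mb_per_pool"] / cpu["cores_per_pool"]
    print(f"{name}: {cpu['l2_total_mb']} MB total, "
          f"{per_core:.0f} MB per core "
          f"({cpu['mb_per_pool']} MB shared by {cpu['cores_per_pool']} cores)")
```

So the Q6600 wins on total cache, but the E8400's 3 MB per core is 50% more than the Q6600's 2 MB, which is the point being made.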
The Q6600 is more future-proof, as more games and other programs WILL become multi-threaded in the future.
It's quite the toss-up, though. For gaming, they're both rock-solid processors, and you should be happy with either one.
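For what the "software will become multi-threaded" point actually looks like in practice, here's a minimal, purely illustrative sketch of the same workload split across four worker processes, the way a game engine might farm out physics or AI per core (the work function and chunk sizes are made up):

```python
# Minimal illustration of why extra cores help once software is parallel:
# identical chunks of work handed to a pool of worker processes.
from multiprocessing import Pool

def busy(n):
    """Stand-in for a chunk of per-frame work (physics, AI, etc.)."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Four workers for a quad-core like the Q6600; a dual-core would
    # only be able to run two of these chunks at once.
    with Pool(processes=4) as pool:
        results = pool.map(busy, [200_000] * 4)
    print(f"{len(results)} chunks completed in parallel")
```

On a quad-core all four chunks run concurrently; on a dual-core they queue two at a time, which is the whole future-proofing argument in miniature.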
This is so misleading it makes my head hurt.
First off, a P4, an Athlon 64, and a Core 2 are not the same at all; they differ in many ways. So comparing clock speeds and saying a P4 at 2.6 GHz is slower than an Athlon 64 or Core 2 is not an argument at all. Compare Core 2s at different speeds: a 3.0 GHz one is going to cream a 2.4 GHz one.
So clock speed is still very important, but you can't use it to compare completely different chips that have nothing to do with each other. Unless you want to compare apples to corn chips.
The E8400 is a newer-generation chip than the Q6600. It runs cooler, overclocks better, and unless you can really leverage those other two cores, it's the faster chip. The real comparison will come when the 45 nm quad-core chips hit.
It's funny, but some of you guys think that with an E8400 you just roll over and BAM! 4.0!
Sorry, but I've been trying for two weeks to get a stable 3.8 with a Chilltec TEC cooler and 8 GB of RAM. The best I've gotten is 11 hours in Orthos before it blue-screens.
So now I'm down to a stable 3.6 at stock voltages, except for the RAM.
Performance-wise, I had a Q6600 and traded up to the E8400. Everything is better now, especially since I also upgraded my OS to 64-bit Vista.
Counter-Strike: Source stress-test scores jumped by 14.6 fps (274.6 now). Other than that, it's really hard to see any speed difference.
Video encoding has gone from an average of 35 minutes to roughly 25, but part of that is probably the 8 GB of DDR2 that's now recognized, plus a 4 GB ReadyBoost drive helping out.
In any case, if one of you has managed a stable 4.0 with the E8400 I'm certainly interested in your settings.
I take it you're running 4x2 GB sticks of RAM? Why did you go with 8 GB to begin with? 98% of folks will never use more than 4 GB. The memory might be the reason you're limited: Vista is pretty finicky about memory, and your motherboard is probably finicky about running four sticks as well. Take two out and see if you can go any higher.
He stated that video editing/rendering is a primary use (and video work is a notorious memory pig), and it's certain he's running an x64 version of Windows as well. Also, until recently, 4 GB sticks weren't exactly commonplace (and he likely upgraded from 4 GB).
The *motherboard's* sweet spot, however, may not match that of the user. For such work, I tend to prefer high-end, if not workstation, system boards (because they can swallow larger amounts of memory reliably).