Not being rich, I faced the choice of buying either RAM with decent timings rated at 3200, or RAM rated for a higher clock speed (3500, 4000, etc.). I chose the former. I've read all the memory FAQs I can lay my hands on and I believe I understand the basics, but can someone please explain what the difference is between a 3500 stick running slow timings at, say, 250 MHz (I'm talking Intel) and a 3200 stick running the same slow timings at 250 MHz? Admittedly I haven't tried this with my 3200 sticks yet. I have some 3rd-quarter 2003 Corsair 3200 LLPT TwinX factory-rated at 2,3,3,6, though I'm running it at 2,3,2,6, and I've just purchased some Kingston HyperX 3200 factory-rated at 2,3,2,6.
Should respectable 3200 like the above, on paper at least, do 250 MHz at the slow timings of a 3500 stick? I'm not too worried about the real-world application of this, but I am curious how people's mileage goes.
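For what it's worth, my understanding is that at the same clock and the same timings, the two sticks should behave identically; the PC3200/PC3500 rating is just the speed the vendor guarantees, not a difference in how the timings work. A quick back-of-the-envelope sketch of the arithmetic (my own numbers, not from any datasheet):

```python
# Back-of-the-envelope: absolute latency of a memory timing in nanoseconds.
# At the same clock, identical timings take identical time regardless of
# what the stick is rated (PC3200 vs PC3500 is a binning guarantee).

def cycle_time_ns(mhz: float) -> float:
    """Length of one memory clock cycle in nanoseconds."""
    return 1000.0 / mhz

def timing_ns(cycles: int, mhz: float) -> float:
    """Absolute time a timing of `cycles` clocks takes at a given clock."""
    return cycles * cycle_time_ns(mhz)

# CAS 2 at 200 MHz (the PC3200 spec clock) vs CAS 2 at 250 MHz (overclocked):
print(timing_ns(2, 200))  # 10.0 ns
print(timing_ns(2, 250))  # 8.0 ns
```

So running CAS 2 at 250 MHz actually demands the chips respond faster in absolute terms than CAS 2 at 200 MHz does, which is presumably why slower (higher-number) timings are needed at higher clocks, whatever the sticker on the stick says.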