GTX 570 overclocking: Am I going to see an FPS difference?

You will see an increase in performance, but it won't be as drastic as buying a different card, for example.
 
You will see an increase in performance, but it won't be as drastic as buying a different card, for example.
Again, it scales almost perfectly, so a 15% OC is almost a 15% performance increase. That is the same result as having a 15% faster card.
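For a rough sense of that arithmetic, here is a minimal sketch assuming near-perfect (linear) scaling, i.e. a fully GPU-limited scene; the baseline FPS is illustrative, not a measurement from this thread:

```python
# Rough FPS estimate after an overclock, assuming near-perfect (linear)
# scaling with core clock (which only holds when fully GPU-limited).
baseline_fps = 60.0          # illustrative starting point, not measured
oc_percent = 15.0            # the 15% OC mentioned above

estimated_fps = baseline_fps * (1 + oc_percent / 100)
print(f"~{estimated_fps:.0f} fps")   # ~69 fps in the best case
```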
 
You don't have to OC the CPU that much for BF3; as long as you're running a quad core of any kind, you should be good.

As for the GPU, I wouldn't OC stock video cards.

Should have bought an MSI 570; it runs cool even at 900 MHz.

Fermi was designed to run at high frequencies from the beginning, so expect a good FPS increase even from a small OC.
 
I have a reference Sparkle 570, and with the voltage turned up to a modest 1.025 V it seems really stable at 885/1940. Not spectacular on the memory side, I know, but it seems to make a difference to me. In a few games it makes the difference between being very playable at 4x MSAA and even at 8x MSAA, with otherwise maxed-out settings.
 
Yes, you will, till you switch V-sync on....

I mean NVIDIA uses Samsung's 1 ns GDDR5, which is rated at 1 GHz (4 GHz) stock.

GPUs run at?
 
Yes, you will, till you switch V-sync on....

I mean NVIDIA uses Samsung's 1 ns GDDR5, which is rated at 1 GHz (4 GHz) stock.

GPUs run at?
I know plenty of them use memory that is rated for 5 GHz. It is the controller or I/O issues that keep it from reaching those speeds.
 
Yes, you will, till you switch V-sync on....

I mean NVIDIA uses Samsung's 1 ns GDDR5, which is rated at 1 GHz (4 GHz) stock.

GPUs run at?

I'm confused by this comment. Are you trying to say that the GPU needs to run at the same speed as the memory? Or that enabling Vsync will tie the VRAM speed to the GPU speed? Because neither of those things is true.

Edit: Or are you saying that you won't notice a difference in overclocked FPS once you turn vsync on?
 
I'm confused by this comment. Are you trying to say that the GPU needs to run at the same speed as the memory? Or that enabling Vsync will tie the VRAM speed to the GPU speed? Because neither of those things is true.

Edit: Or are you saying that you won't notice a difference in overclocked FPS once you turn vsync on?
Just look at his other posts and you will see that he needs to be ignored.
 
Take it on a case-by-case basis. It's not very often that overclocking my video card makes a noticeable difference in performance. I almost always run my cards at stock speeds.
 
Take it on a case-by-case basis. It's not very often that overclocking my video card makes a noticeable difference in performance. I almost always run my cards at stock speeds.

I usually run a 5-10% overclock or so - just to show the card who's the boss.
 
Running my 570s at 900 MHz made a significant difference, and I was running tri-SLI. Much smoother overall and much higher minimum frame rates (as much as 40% higher), even though the max didn't increase by much.
 
Running my 570s at 900 MHz made a significant difference, and I was running tri-SLI. Much smoother overall and much higher minimum frame rates (as much as 40% higher), even though the max didn't increase by much.
There is no way OCing your GPUs from 732 to 900 gave you a 40% increase in minimum framerate. For one thing, it would be impossible for your minimum framerate with tri-SLI GTX 570s to be that bottlenecked at stock clocks. And even if those three cards were 100% bottlenecking the minimum framerate, a 23% OC would still not give you a 40% increase in minimum framerate. So no matter how you slice it, it's not logical at all for you to get what you claimed.
 
There is no way OCing your GPUs from 732 to 900 gave you a 40% increase in minimum framerate. For one thing, it would be impossible for your minimum framerate with tri-SLI GTX 570s to be that bottlenecked at stock clocks. And even if those three cards were 100% bottlenecking the minimum framerate, a 23% OC would still not give you a 40% increase in minimum framerate. So no matter how you slice it, it's not logical at all for you to get what you claimed.

I'm not sure why I would lie about it; does it make me look cool? Just stating what I observed. For example, my minimums in the Crysis 2 benchmark tool (DX11/high-res) went from an average of 32 fps over 5 runs to 54 fps over 5 runs. I'm not really sure where you're getting your math from; are you just guessing that x amount of OC translates in a linear manner to y amount of % increase? Because that is definitely not the case with any card, especially since adding voltage causes diminishing returns.
 
I'm not sure why I would lie about it; does it make me look cool? Just stating what I observed. For example, my minimums in the Crysis 2 benchmark tool (DX11/high-res) went from an average of 32 fps over 5 runs to 54 fps over 5 runs. I'm not really sure where you're getting your math from; are you just guessing that x amount of OC translates in a linear manner to y amount of % increase? Because that is definitely not the case with any card, especially since adding voltage causes diminishing returns.
I have NEVER seen a card scale more than the amount it is overclocked. If anything, it will be slightly less than the amount of the overclock. If I OC my GTX 570 by 10%, I see close to a 10% improvement but never more than that. But in your case, overclocking by 22% can result in 40% gains? Again, for it to even scale linearly, the cards had to be 100% of the limitation. So in your case the cards were 100% of the limitation and you still doubled your 22% gain by getting a 40% increase? Do you actually think that GTX 570 3-way SLI is so slow that it can simply be OCed by 22% and get a 40% increase? Really, just common sense would tell you that is not possible. Maybe during a flyby benchmark things are different, but certainly not during an actual game.
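For reference, a minimal sketch of the upper-bound arithmetic being argued here, assuming the minimum framerate is 100% GPU-limited and scales linearly with core clock; the 732/900 MHz clocks and the 32 fps figure come from the posts above:

```python
# Upper bound on FPS gain from a core overclock, assuming the frame rate
# is entirely GPU-limited and scales linearly with core clock.
stock_clock = 732.0    # MHz, GTX 570 stock (from the thread)
oc_clock = 900.0       # MHz, the overclock discussed above

clock_ratio = oc_clock / stock_clock
print(f"Overclock: {(clock_ratio - 1) * 100:.0f}%")                  # ~23%

# Best-case minimum framerate, starting from the 32 fps Crysis 2 figure above.
stock_min_fps = 32.0
print(f"Best-case minimum: {stock_min_fps * clock_ratio:.1f} fps")   # ~39 fps, not 54
```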
 
There is no way OCing your GPUs from 732 to 900 gave you a 40% increase in minimum framerate

No one ever talks about the proper way to measure minimum framerate. It really needs a histogram to show how much time is being spent towards the minimum, because a single slow outlier frame may not even be related to the graphics sub-system.

Or, for 10,000 frames, what was the average of the slowest 100 frames?
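A minimal sketch of that "average of the slowest 100 out of 10,000 frames" idea (a 1% low), assuming you have per-frame render times in milliseconds, e.g. parsed from a FRAPS frametimes log:

```python
# 1% low: average FPS over the slowest 1% of frames, given per-frame
# render times in milliseconds.
def one_percent_low_fps(frame_times_ms):
    slowest = sorted(frame_times_ms, reverse=True)
    count = max(1, len(frame_times_ms) // 100)    # slowest 1% of frames
    avg_ms = sum(slowest[:count]) / count
    return 1000.0 / avg_ms                        # ms per frame -> fps

# Example with made-up numbers: mostly 16 ms frames plus a few 40 ms spikes.
frames = [16.0] * 990 + [40.0] * 10
print(f"1% low: {one_percent_low_fps(frames):.1f} fps")   # 25.0 fps
```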
 
No one ever talks about the proper way to measure minimum framerate. It really needs a histogram to show how much time is being spent towards the minimum, because a single slow frame might not be related to the graphics sub-system.
And in a flyby benchmark it can be even worse as an indicator of performance. Sometimes it's just one spot in those, and people will try to base something off of that, which is silly. I especially love when people try to compare the nearly single-digit minimum framerates in the Metro 2033 benchmark as if they mean anything.
 
Obviously not every game responded the same way; some showed as little as a 2-3 fps increase on minimums. I'm not sure why you feel it isn't possible to have a higher FPS gain than the actual OC %; granted, it's not to be expected, but there were a few situations where I saw it. Don't forget, multi-GPU setups scale differently than single. Just take a look at the 6950, which can achieve over 100% scaling in CF. By your logic, a 100% gain in GPU power should not be able to result in over a 100% gain in FPS, no?

Personally, I tested a variety of cards using canned benches and actual gameplay. It was by no means scientific like an [H] review, because I was just doing it for my own enjoyment, but I did log FPS graphs and average them myself. I ran 6970s, a 6990, 580s, 6870s, even an old pair of 5870s I borrowed.
 
Please show me one review where a GPU has scaled more than the percentage it was overclocked. And multiple GPUs are not going to scale any better with overclocking than a single GPU; in fact, they are much more likely to scale worse than a single GPU due to more overhead and less likelihood of even being as GPU-limited in the first place.

As for some CrossFire setups getting more than twice the performance of a single GPU, that is very unlikely, and when it does happen, it's just barely over 100%.
 
And in a flyby benchmark it can be even worse as an indicator of performance. Sometimes it's just one spot in those, and people will try to base something off of that, which is silly. I especially love when people try to compare the nearly single-digit minimum framerates in the Metro 2033 benchmark as if they mean anything.

This is why I like custom benchmarks. It's not hard to find an area in a game that annoys you due to dips in framerate and just run through it a few times with FRAPS.
 
To see if there's a performance increase, OC the GPU engine first, then use this formula:

[amdhal.png: Amdahl's law, Speedup = 1 / ((1 - P) + P / S)]
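A minimal sketch of applying that formula to a core OC, where P is taken to be the fraction of frame time that is GPU-limited and S the OC-to-stock clock ratio; the 80% GPU-bound figure is illustrative, not measured, while the clocks are the 732/900 MHz values from this thread:

```python
def amdahl_speedup(p, s):
    """Amdahl's law: overall speedup when only a fraction p of the work
    speeds up by a factor s."""
    return 1.0 / ((1.0 - p) + p / s)

# Illustrative numbers: GTX 570 taken from 732 MHz to 900 MHz (from the
# thread), with an assumed 80% of frame time being GPU-limited.
clock_ratio = 900.0 / 732.0                 # ~1.23
gain = (amdahl_speedup(0.80, clock_ratio) - 1) * 100
print(f"Estimated FPS gain: {gain:.1f}%")   # ~17.6%, less than the 23% OC
```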
 
That's assuming that performance scales linearly with your overclock, which it does not.
 
I have an EVGA SC and am wondering what I could realistically expect in terms of core/mem. At the same time, I'm getting 60-70 fps in BF with ultra textures and everything else on high, and it looks amazing, so do I even bother?
 
I have an EVGA SC and am wondering what I could realistically expect in terms of core/mem. At the same time, I'm getting 60-70 fps in BF with ultra textures and everything else on high, and it looks amazing, so do I even bother?

I wouldn't.
 
Got my 570 to 905/1810/2226 and I did notice an increase in performance. I use MSI and have the voltage at 1075 mV without issue for many months now.
 