Good goobly goo, mine would probably blow up long before it could hold those clocks.
> Didn't really do much for performance.

That does not sound right. My performance scales almost perfectly when I OC my GTX 570.
> You will see an increase in performance but it won't be drastic, like buying a different card, for example.

Again, it scales almost perfectly, so a 15% OC is almost a 15% performance increase. That is the same result as having a 15% faster card.
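The linear-scaling claim is just proportional arithmetic. As a rough sketch (the 60 fps baseline and the helper name are invented for illustration, not taken from anyone's benchmarks):

```python
# Rough sanity check of the "perfect scaling" claim: if a game is fully
# GPU-core-bound, FPS rises in proportion to the core clock.
def expected_fps(base_fps, base_clock_mhz, oc_clock_mhz):
    """Best-case FPS after a core overclock, assuming perfect linear scaling."""
    return base_fps * (oc_clock_mhz / base_clock_mhz)

# A 15% core OC on a hypothetical 60 fps baseline:
fps = expected_fps(60, 732, 732 * 1.15)
print(round(fps, 1))  # 69.0 -> about 15% more FPS, and that's the ceiling
```

In practice the gain is a bit under the clock increase, since no game is 100% core-bound.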
> Maybe he's being limited somewhere else?

Yes, or maybe expecting too much.
> Yes you will, till you switch V-sync on....

I know plenty of them use memory that is rated for 5GHz. It is controller or I/O issues that keep it from reaching those speeds.
I mean Nvidia uses Samsung's 1ns GDDR5, which is rated at 1GHz (4GHz effective) stock.
GPUs run at?
> I'm confused by this comment. Are you trying to say that the GPU needs to run at the same speed as the memory? Or that enabling V-sync will tie the VRAM speed to the GPU speed? Because neither of those things is true. Edit: Or are you saying that you won't notice a difference in overclocked FPS once you turn V-sync on?

Just look at his other posts and you will see that he needs to be ignored.
Take it on a case-by-case basis. It's not very often that overclocking my video card makes a noticeable difference in performance. I almost always run my cards at stock speeds.
> Running my 570s at 900MHz made a significant difference, and I was running tri-SLI. Much smoother overall and much higher minimum frame rates (as much as 40% higher), even though the max didn't increase by much.

There is no way OCing your GPUs from 732 to 900 gave you a 40% increase in minimum framerate. For one thing, it would be impossible for your minimum framerate with tri-SLI GTX 570s to be that bottlenecked at stock clocks. And even if those three cards were 100% bottlenecking the minimum framerate, a 23% OC would still not give you a 40% increase. So no matter how you slice it, it's not logical at all for you to get what you claimed.
> I'm not sure why I would lie about it; does it make me look cool? Just stating what I observed. For example, my minimums in the Crysis 2 benchmark tool (DX11/Highres) went from an average of 32fps over 5 runs to 54fps over 5 runs. I'm not really sure where you're getting your math from; are you just guessing that x amount of OC translates in a linear manner to y% increase? Because that is definitely not the case with any card, especially since adding voltage causes diminishing returns.

I have NEVER seen a card scale beyond the amount it is overclocked. If anything, it will be slightly less. If I OC my GTX 570 by 10%, I see close to a 10% improvement, but never more than that. But in your case, overclocking by 22% can result in 40% gains? Again, for it to even scale linearly, the cards had to be 100% the limitation. So in your case the cards were 100% the limitation and you still nearly doubled your 22% gain by getting a 40% increase? Do you actually think GTX 570 3-way SLI is so slow that it can simply be OCed by 22% and gain 40%? Really, just common sense would tell you that is not possible. Maybe during a flyby benchmark things are different, but certainly not during an actual game.
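For what it's worth, the 732 to 900 MHz numbers in this exchange are easy to check. Under the same perfect-scaling assumption, the core OC alone caps the gain well short of 40% (a sketch of the arithmetic, not a benchmark):

```python
# How much FPS gain can a 732 -> 900 MHz core OC explain by itself?
base_clock = 732  # GTX 570 stock core clock (MHz), as stated in the thread
oc_clock = 900

speedup = oc_clock / base_clock - 1
print(f"{speedup:.1%}")  # 23.0% -- the best case if the cards are 100% the bottleneck

# A 40% jump in minimum FPS therefore can't come from the core clock alone;
# something else (a memory OC, a shifted bottleneck, run-to-run variance)
# has to account for the rest.
```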
> No one ever talks about the proper way to measure minimum framerate. It really needs a histogram to show how much time is being spent towards the minimum, because a single slow frame might not be related to the graphics sub-system.

And in a flyby benchmark it can be even worse as an indicator of performance. Sometimes it's just one spot in those, and people will try to base something off of that, which is silly. I especially love when people try to compare the nearly single-digit minimum framerates in the Metro 2033 benchmark as if they mean anything.
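The histogram point is easy to act on: log per-frame times and look at the slow tail instead of quoting one worst frame. A minimal sketch, assuming you already have a frame-time log (the helper name and the sample numbers below are invented for illustration):

```python
# Summarize a frame-time log by its slow tail rather than its single worst frame.
def low_percentile_fps(frame_times_ms, pct=1.0):
    """FPS at the frame time bounding the slowest `pct` percent of frames
    (roughly what review sites call the '1% low')."""
    ordered = sorted(frame_times_ms)                    # fastest -> slowest
    k = max(1, int(len(ordered) * (100 - pct) / 100))   # index of the cutoff
    return 1000.0 / ordered[k - 1]

# Invented log: mostly ~16.7 ms frames (60 fps) plus three slow spikes.
times = [16.7] * 97 + [33.3, 50.0, 100.0]
print(round(min(1000.0 / t for t in times), 1))   # 10.0 -- "minimum FPS", one bad frame
print(round(low_percentile_fps(times, pct=1.0), 1))  # 20.0 -- the tail tells a calmer story
```

A single 100 ms hitch (maybe a disk load, not the GPU) drags the raw minimum to 10 fps, while the percentile view shows the game otherwise holding up.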
I have an EVGA SC and am wondering what I could realistically expect in terms of core/mem. At the same time, I'm getting 60-70fps in BF with ultra textures and everything else on high, and it looks amazing, so do I even bother?
If it ain't broke...
Keep pushing it?