video card for 1280 x 1024

dinlee23

Currently running an OC'd 5770 @ 960/1400.

I want to get a faster card for under $300. Any suggestions?
 
That card is mainstream, but your resolution is low. What games do you have an issue with? What is your CPU usage? What video settings are you using?

Benchmark these games at 1280x1024 as well as 640x480 using the same visual quality and report back.
 
BFBC2 and Shogun 2: Total War in particular. I can play BC2 all high DX11 with 2xAA and 4xAF, and Shogun 2 all high DX9. I just want to max them out. Is a 6950 a good card, or is it plain overkill for my res? I'm not planning to upgrade monitors anytime soon.
 
It's running at 100%... I can't overclock it more because I'm limited by my motherboard, and it would be a waste to buy another s775 mobo, so I'll just ride it out until I upgrade my whole system.
 
That is your problem then: your CPU is bottlenecking your card. Since the res is so low, a lot of the processing gets pushed onto your CPU.
 
I don't think an HD 5770 has enough processing power to slow down a Q9550, even at stock settings.
 
Once your CPU hits 100%, your game starts lagging, since the GPU has to wait for the CPU to finish preparing each set of information; hence the term "bottleneck": everything starts getting backed up from that point on.
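To put the bottleneck idea in concrete terms, here's a toy model (my own sketch with made-up numbers, not measured from any real game): each frame needs some CPU time and some GPU time, and whichever takes longer sets the pace.

```python
# Toy model of the bottleneck idea (made-up numbers, purely illustrative):
# each frame needs both CPU work and GPU work, and whichever takes longer
# sets the pace, so frame time is roughly the max of the two per-frame costs.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frames per second when the slower component gates each frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Slow CPU, fast GPU: the CPU caps the frame rate (a "CPU bottleneck").
print(fps(cpu_ms_per_frame=20.0, gpu_ms_per_frame=5.0))  # 50.0
# A faster GPU changes nothing while the CPU is still the limit.
print(fps(cpu_ms_per_frame=20.0, gpu_ms_per_frame=2.0))  # 50.0
```

Under this sketch, upgrading the GPU only helps once the GPU's per-frame cost is the larger of the two.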
 
BFBC2 and Shogun 2: Total War in particular. I can play BC2 all high DX11 with 2xAA and 4xAF, and Shogun 2 all high DX9. I just want to max them out. Is a 6950 a good card, or is it plain overkill for my res? I'm not planning to upgrade monitors anytime soon.


What kind of fps are you getting in those two games with those settings?
 
Okay, just did benchmarks with Crysis and Shogun 2: Total War.

Crysis DX10, Very High settings except Objects at High, loops=3:

Max fps: 50.42, avg fps: 39.28

Shogun 2: Total War DX11, max settings, Battle of Sekigahara scene:

Avg fps: 40.42
 
What kind of fps are you getting in those two games with those settings?

In BFBC2, when there's stuff blowing up right in front of me and whenever people spam smoke grenades, fps drops to the mid-20s.

Shogun 2 DX11 stutters quite a bit on the battle map, and in battle I get 30s, and low teens when I zoom in to see the fighting up close.
 
That is your problem then: your CPU is bottlenecking your card. Since the res is so low, a lot of the processing gets pushed onto your CPU.

+1

Get a higher-res monitor, OP, and you will notice a huge difference. That resolution is tiny for gaming in this day and age, and even the best of CPUs would bottleneck at it, regardless of the GPU(s).
 
+1

Get a higher-res monitor, OP, and you will notice a huge difference. That resolution is tiny for gaming in this day and age, and even the best of CPUs would bottleneck at it, regardless of the GPU(s).



If he gets a higher-resolution monitor while keeping everything else the same, his fps will go lower, not higher.
 
The OP is correct: a card of the class of a 5770 CANNOT max out BF:BC2 at 1280x1024.

I was running my 4850 at 1280x960 4xAA 16xAF in DX10 mode, and it barely handled that. When I upgraded to a GTX 460 I was able to run DX11 mode with HBAO and 16xAA with transparency AA, and even with everything cranked it feels smoother now. There is a definite difference.

That quad Core 2 should keep both games fed just fine. I was running BF:BC2 on a Core 2 Duo 2.66.
 
I'm just asking if a 6950 is overkill for my res, or if I can cut corners and get a 6870... I just want to max everything out.
 
A 6870 might be too much at that res, though it's less overkill than a 6950.

You would be much better off with a new GPU than a new CPU at this point, if you can only do one or the other.

A 6850/460 would be perfect at that res. Your CPU usage will be pretty high at that resolution, but you should have enough power.
 
If he gets a higher-resolution monitor while keeping everything else the same, his fps will go lower, not higher.

No, if he's CPU bottlenecked by the low resolution, which is what it sounds like, the FPS will increase since the load will be taken off the CPU and transferred to the GPU at a higher resolution.
 
No, if he's CPU bottlenecked by the low resolution, which is what it sounds like, the FPS will increase since the load will be taken off the CPU and transferred to the GPU at a higher resolution.


That makes no sense at all. Zero.

If he were CPU bottlenecked, it would not be because of the "low resolution". It would be because of the CPU. Load isn't taken off of one and transferred to another. The CPU and GPU each do their own thing to the best of their ability, meaning they run at their maximum unless a "bottleneck" keeps them from being fed enough data.

Using your logic, he would have higher fps at 1680x1050, even higher at 1920x1080 and the highest fps at 2560x1600. How could you even begin to believe that?
 
At very low resolutions the CPU does get bottlenecked, regardless of how powerful the CPU or GPU is, at least in modern games.

You will get better performance at 1920x1080 than you will at 800x600 with that CPU and GPU combo.

If you don't believe me, then try it for yourself.

I've gotten huge increases in performance when I moved from a 1440x900 monitor to a 1920x1080 monitor.
Another person I know did the same thing and his fps jumped up by 20+.
Neither of us were GPU or CPU bottlenecked though.

On that same note:

If the user is CPU bound from the start but has a powerful GPU, the fps would be the same at all resolutions.

If the user is GPU bound from the start but has a powerful CPU, the fps will decrease as the resolution gets higher.
 
At very low resolutions the CPU does get bottlenecked, regardless of how powerful the CPU or GPU is, at least in modern games.

You will get better performance at 1920x1080 than you will at 800x600 with that CPU and GPU combo.

If you don't believe me, then try it for yourself.

I've gotten huge increases in performance when I moved from a 1440x900 monitor to a 1920x1080 monitor.
Another person I know did the same thing and his fps jumped up by 20+.
Neither of us were GPU or CPU bottlenecked though.

On that same note:

If the user is CPU bound from the start but has a powerful GPU, the fps would be the same at all resolutions.

If the user is GPU bound from the start but has a powerful CPU, the fps will decrease as the resolution gets higher.


:p So the computer will have lower fps when pushing half a million pixels than when it's pushing two million pixels?
 
It's not just the resolution (pixel count), it's the game that the CPU is driving. Yeah, that's about it. Look it up, it's very real.

Like I said, test it for yourself if you don't believe me.

Play a game at 800x600, then play it at 1920x1080 (assuming you are not CPU or GPU bound).
 
Play a game at 800x600, then play it at 1920x1080 (assuming you are not CPU or GPU bound).

They already do that at Tech Power Up for you.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/5.html

GTX 580, driven by Core i7 920 @ 3.8 = NOT particularly CPU OR GPU BOUND.

I don't see any of those benchmarks showing an FPS increase from 1024x768 up to 1920x1080. Explain to me again how this is supposed to work when you're asking the card to render four times as many pixels.

If you are CPU-LIMITED you might see no change in framerate when you increase your resolution, but that just means you need to turn up your graphics eye candy and enjoy the extra GPU power. As long as you have enough CPU power to deliver a playable framerate, you're fine.

If this effect you imagine were REAL, then you could point me to a real review showing it, right?
 
No, if he's CPU bottlenecked by the low resolution, which is what it sounds like, the FPS will increase since the load will be taken off the CPU and transferred to the GPU at a higher resolution.

At the very best, if he were CPU capped, his FPS would remain the same. Increasing the resolution will increase the load on the GPU, and if the resolution is increased enough it could get to the point where he is GPU capped instead of CPU capped, but the load on the CPU is not going to be reduced.
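That behavior can be sketched with a toy model (all numbers hypothetical, purely for illustration): assume the CPU's cost per frame is fixed while the GPU's cost scales with pixel count. fps stays flat while the CPU is the cap and falls once the GPU becomes the cap; it never rises with resolution.

```python
# Toy model with made-up numbers: CPU cost per frame is roughly constant,
# while GPU cost per frame scales with how many pixels it has to render.

CPU_MS = 10.0            # hypothetical CPU time per frame
GPU_MS_PER_MPIXEL = 6.0  # hypothetical GPU time per million pixels

def fps(width, height):
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1e6
    return 1000.0 / max(CPU_MS, gpu_ms)

# fps is flat while the CPU is the cap, then drops once the GPU becomes
# the cap; it never goes up as the resolution rises.
for w, h in [(800, 600), (1280, 1024), (1920, 1080), (2560, 1600)]:
    print(f"{w}x{h}: {fps(w, h):.0f} fps")
```

With these particular numbers the crossover happens somewhere between 1280x1024 and 1920x1080; with a stronger GPU or weaker CPU it would just happen at a different resolution.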
 
Already have my sights on a Diamond reference 6950 at my local Fry's; probably buying it tomorrow.
 
defaultuser, there's a ton of threads on this here. The search option is your friend. :)

dinlee, you will be happy with that card. Once you get a nicer monitor, you can really kick it into high gear.
 
defaultuser, there's a ton of threads on this here. The search option is your friend. :)

That's fine, you do the searching, because you know what keywords, posters, and subjects were involved. And when you can produce one of these mystical threads using your magic search skills, I will be happy to read it (and point out the obvious inconsistency you missed).

Either point me to these threads yourself, or they don't exist. I'll be happy to provide you DOZENS of links to reputable benchmark websites that show this never happens. If it's so widely documented, you won't need to spend much time searching will you?
 
I guess I wasn't explaining myself correctly: it's not so much a 'bottleneck' on the CPU as it is that the CPU is 'overloaded' with data to process.

At lower resolutions, more data is transferred between the CPU and GPU (due to the higher framerate), and the driver (which runs on the CPU) has to work harder managing the higher frame rates.

Every time a frame is completed, new driver commands must be sent for the next frame so the GPU knows what it is doing. At 60fps that happens 60 times per second, but at 120fps it happens 120 times per second, so it's not really the lower resolution causing the need for more CPU; it's purely the number of frames rendered.

At higher resolutions, the CPU does the same work per frame, so its per-frame workload doesn't really change... it's all about the GPU's driver.
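The per-frame driver cost claim above can be written out as a quick sketch (made-up numbers; real per-frame costs vary by driver and game): if the driver costs the CPU a fixed amount of time per frame, then total CPU work per second scales with the frame rate, not the resolution.

```python
# Made-up numbers, purely illustrative: the driver costs the CPU a fixed
# amount of time per frame, so the total CPU time spent per second scales
# with the frame rate rather than with the resolution.

DRIVER_MS_PER_FRAME = 2.0  # hypothetical per-frame command-submission cost

def driver_ms_per_second(frames_per_second):
    return DRIVER_MS_PER_FRAME * frames_per_second

print(driver_ms_per_second(60))   # 120.0 ms of CPU time per second
print(driver_ms_per_second(120))  # 240.0 ms: twice the frames, twice the work
```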

I hope this helps.

EDIT:

I've also proven this by trying it not only on one of my systems but on another as well, where giving the system a bigger-res monitor increased performance by 20+ fps.

It may not be something openly discussed or mentioned in reviews; they aren't perfect, nor do they tend to show many side issues or facts.
A quote from you:
It's pretty rare for reviewers to give a crap about CPU power when they run benchmarks of new games. They just run with some overclocked quad core and concentrate on the GPUs.
On that note, you want to show me benchmark websites that do just what you say: not focus on the CPU aspect of the benchmark.

Low-resolution CPU bottlenecking (or overload) is real and it can be and has been proven.
Don't believe everything you read on the 'net, like I said, try it for yourself.
 
It's not just the resolution (pixel count), it's the game that the CPU is driving. Yeah, that's about it. Look it up, it's very real.

Like I said, test it for yourself if you don't believe me.

Play a game at 800x600, then play it at 1920x1080 (assuming you are not CPU or GPU bound).


How about you look it up; it's not real. Specs in my sig: running Bad Company 2 at 1280x1024 (won't let me select lower), I get about double the fps I usually get at 1920x1080 :rolleyes:


I've also proven this by trying it not only on one of my systems but on another as well, where giving the system a bigger-res monitor increased performance by 20+ fps.

It may not be something openly discussed or mentioned in reviews; they aren't perfect, nor do they tend to show many side issues or facts.
A quote from you:
On that note, you want to show me benchmark websites that do just what you say: not focus on the CPU aspect of the benchmark.

Low-resolution CPU bottlenecking (or overload) is real and it can be and has been proven.
Don't believe everything you read on the 'net, like I said, try it for yourself.


I call bullshit on you having actually done this yourself.

You say this is a "secret" of the websites and that no one will discuss it. Well, prove it. Post which games and under what conditions it will happen.

Bottlenecks are real but one of your problems is that you don't understand what they are.
 