Is something wrong with this GTX 295 I just bought?

Neon01

[H]ard|Gawd
Joined
Jan 22, 2008
Messages
1,048
Hi all, my equipment is in my sig. Just bought a used GTX 295 from someone here on the forum. Smooth transaction. I loaded up some games and didn't see as much improvement as I was expecting over my old 8800 GT. I was really expecting that this thing would run just about any game (even Crysis) at max settings 1920x1200 with min 40-50 fps. Was I expecting too much?

I started running some benchmarks to see what I'm actually getting and was a little surprised at how low the numbers are.

Using the Framebuffer Crysis Warhead Benchmarking Tool (0.29), I ran the ambush demo (Enthusiast mode, DX10, 1920x1200, 4x FSAA) and then again with all the same settings but on Gamer mode. I got the following average FPS:

Enthusiast - 22.40 (min 7.6)
Gamer - 31.17 (min 1.12)

This really doesn't sound right. Plus, I noticed there were severe loading problems during the initial cutscene, where it zooms into the island on the black-and-white overland map. It stuttered and took at least 60 seconds to get into the actual demo. Is that normal?

Also, I didn't have a benchmark for Sacred 2, but it's definitely not completely fluid with everything maxed at 1920x1200, and I suspect this is really not normal either, considering it's much less demanding than Crysis Warhead.

Is there some test I can do to determine if my card is working correctly?

Any advice would be appreciated
 
You didn't run these benchmarks with your old video card? That'd be your best comparison.
 
Unfortunately, no. I guess my alarm came from many of the benchmarks I read about online.

Here's a screen grab from someone who ran the same Crysis Warhead benchmark program I ran:

http://www.overclock.net/nvidia/440673-my-gtx-295-benchmarks.html (see about 2/3 of the way down the page).

Now I'll admit that he's got a much beefier processor than mine (and more RAM), but his numbers are MUCH higher. He's showing an average framerate of ~62 fps on the exact test I ran, with a min rate of ~25 fps. For this same test I got the results above - 22 fps average, 7.6 fps min.

Here's another benchmark review that shows results with the GTX 295 to be 40 FPS average for Crysis Warhead (same settings except 0xAA instead of my 4x).

I know it's a bit of apples to oranges, but 40 fps vs 22, or 62 vs 22, are _significant_ differences that probably aren't attributable to processor or RAM speed.
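
Just to put numbers on those gaps (my own scratch math, nothing more):

Code:
# Quick ratio check on the results quoted above.
mine = 22.40          # my Enthusiast average
ocn_result = 62.0     # the overclock.net run on the same test
review_avg = 40.0     # the review's average (0xAA instead of my 4x)
print(round(ocn_result / mine, 1), "x my result")   # ~2.8x
print(round(review_avg / mine, 1), "x my result")   # ~1.8x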

Could it be that my card isn't working in SLI mode or something? Any other possibilities?
 
Dude, don't forget you're only running that 295 with an E8400 at 3.0 GHz. It'll probably be wayyy different if you ran that proc at 4 GHz. Just OC it and you'll get some pretty good results.
 
Most C2Ds past 3 GHz perform the same, anyway.

A 4 GHz OC will not be a long-term one.

I guess the questions now are:

Does the second GPU do anything? (I guess FurMark or 3DMark Vantage can check this.)
Is multi-GPU or multi-monitor enabled in your NVIDIA control panel?
Do you have the same ForceWare version as your reference bench installed? Some games do better in one ForceWare version, and newer isn't always better. (There's also a rough sketch below for dumping what Windows reports about the adapter and driver.)
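
If it helps, here's a rough sketch (my own, not part of any of those tools) for printing what Windows reports for the adapter, driver version, and memory, so it's easy to compare against whatever a reference bench lists. It only shows what's detected, not whether SLI is actually kicking in; FurMark or Vantage are better for that.

Code:
# Rough sketch: print adapter name, driver version, and reported memory via WMI.
# Assumes Python 3 on Windows (wmic ships with Win7); it shows what's detected,
# not whether the second GPU is doing any work.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion,AdapterRAM"],
    capture_output=True, text=True,
)
print(result.stdout)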
 
Crysis is one of the few games that would take advantage of a quad core CPU. If you are all that worried about Crysis benchmarks, that is what you should upgrade.
 
Whoops, forgot to update my sig. My cpu is actually OCed at 3.6 right now. I didn't want to go much higher than this to avoid changing voltages.
 
You are expecting too much from the GTX 295.

Crysis with 40-50 min fps on max settings is completely impossible with that card.

Aside from the stuttering issue on some maps, this card doesn't do especially well in Crysis in general; even the 4870X2 outperforms it in this game, though partly because the max settings seem to favor ATI for some odd reason.
 
You are expecting too much from the GTX 295.

Crysis with 40-50 min fps on max settings is completely impossible with that card.

Aside from the stuttering issue on some maps, this card doesn't do especially well in Crysis in general; even the 4870X2 outperforms it in this game, though partly because the max settings seem to favor ATI for some odd reason.

Wow, I'm really glad I didn't get the HD 4890 like I was planning on, then. I had no idea Crysis was so tough on video cards. From all the reviews it looked like the GTX 295 could handle just about anything thrown at it at 1920x1200.

I'm not looking for 40-50 min, but 40-50 average is still roughly double what I'm currently getting. If you look at my numbers, on maxed settings I'm getting a 22 average and a 7.6 minimum.

Does anyone have any objective bench tests I could do to see if the card is defective? If it is, I'd like to know about it now, just after receiving it from the seller.
 
I think your CPU is holding you back.

When I first installed my GTX 295, I was running a Q6600 at 3.6 GHz. About a month later, I upgraded my CPU to a Q9650, which I have been running at a stable 4 GHz overclock for the past 9 months or so ... I noticed an immediate increase in frames in games like Crysis and Far Cry 2.

A lot of games these days hit all 4 cores. Unless you have a VERY high clock on your dual, I'd recommend picking up a Q9550 or Q9650, clocking it to around 4 GHz, and having fun with nice frames.

You could always change to an X58 or P55 platform too - but then you have the cost of the CPU and motherboard.
 
How can you guys achieve 4 GHz on an E8400? lol

I can barely get 3.6 GHz using a TRUE and a Rocketfish case w/ awesome airflow
 
How can you guys achieve 4 GHz on an E8400? lol

I can barely get 3.6 GHz using a TRUE and a Rocketfish case w/ awesome airflow

Seriously? Going to 3.6 didn't do anything to my temps at all. I never even had to change voltage. I could easily see 4.0 being possible without much trouble.
 
I think your CPU is holding you back.

A lot of games these days hit all 4 cores. Unless you have a VERY high clock on your dual, I'd recommend picking up a Q9550 or Q9650, clocking it to around 4 GHz, and having fun with nice frames.

It's interesting, though: looking at benchmark tests comparing the Q9550 to the E8500, like this one http://www.tomshardware.com/reviews/core-2-overclock,2146-12.html , it doesn't appear that the Q9550 has much of an edge in Crysis. In fact, at 1920x1200 they are arguably tied (with the same video card, an HD 4870X2).

Also of note, they are getting MUCH better numbers in Crysis than I was with that E8500 (~32 fps vs my ~22 fps). Of course, they're using a 4870X2, not a GTX 295.

I'm usually quick to open up the checkbook if I know I can get some good gains with a new piece of equipment, but I'm not seeing the proof in the benchmarks I'm reading online.

I suppose I'll just have to turn some of the settings down and deal with it.
 
Seriously? Going to 3.6 didn't do anything to my temps at all. I never even had to change voltage. I could easily see 4.0 being possible without much trouble.

Yah, the temps are still fine... but the system won't boot or be stable above 3.6.

I guess I am just getting obsolete and don't have enough time to educate myself on these new boards and all their settings.

I miss the days with unlocked Athlons that required 2-3 BIOS settings to get insane OCs. :(
 
I had almost the exact same setup, but my E8400 hit 4.3 on air. I probably impacted the lifespan of that chip, and it was hot as Hades in my case, lol. My 295 now rests at home with an i7 860 and an EVGA P55 LE board. That unlocked a bit of the potential left in the card.
 
It's well known that all C2D/C2Q CPUs bottleneck a GTX 295; even an i7 at stock speed does the same. I think it's past 3.8 GHz on the i7 where you're able to see the card's full potential.
 
It's well known that all C2D/C2Q CPUs bottleneck a GTX 295; even an i7 at stock speed does the same. I think it's past 3.8 GHz on the i7 where you're able to see the card's full potential.

this

edit:

Here's what I get with two GTX 285s in SLI and a Core i5 at 4 GHz in Crysis Warhead, ambush, 1920x1080, 4xAA, 16xAF, Enthusiast:

DirectX 10 ENTHUSIAST 3X @ Map: ambush @ 0 1920 x 1080 AA 4x
==> Framerate [ Min: 32.67 Max: 95.11 Avg: 67.77 ]
 
Crysis is one of the few games that would take advantage of a quad core CPU. If you are all that worried about Crysis benchmarks, that is what you should upgrade.

Categorically false. At most it will put a 75% load on a dual core. It has been tested many times, and any apparent 4-core usage is just the game's threads being shuffled across cores. It does not use 4 cores.

Maybe I missed it, but I don't remember seeing anyone suggest making sure SLI is enabled in the control panel.
 
My 4870X2 did benefit from overclocking the E8400. For every 100 MHz of OC, I got about 1 fps more on average in the Crysis bench: 3 GHz --> 4.2 GHz = 12 fps more on average. But beyond 4 GHz, my CPU needed too much voltage, and even then it wasn't 100% stable :p
 
My guess is it's drivers all around. When I had my GTX295, Warhead @ 2560x1600 sucked, the drivers were just awful, as well as in a few other games (Fallout 3 is another). That said, sorry, I haven't read the whole thread, but did you try using Driver Sweeper and doing a clean install of the drivers?
 
My 4870X2 did benefit from overclocking the E8400. For every 100 MHz of OC, I got about 1 fps more on average in the Crysis bench: 3 GHz --> 4.2 GHz = 12 fps more on average. But beyond 4 GHz, my CPU needed too much voltage, and even then it wasn't 100% stable :p

Uh, really? 12 fps?

I did OC my Q6600 up to 3.71 GHz, and so far I only saw an fps increase up to 2.8 GHz; beyond that there is no more fps increase... maybe 0.5, but that doesn't count.
 
Uh, really? 12 fps?

I did OC my Q6600 up to 3.71 GHz, and so far I only saw an fps increase up to 2.8 GHz; beyond that there is no more fps increase... maybe 0.5, but that doesn't count.


If I remember correctly, I had about 32 fps average in the Crysis bench with the E8400 running at 3 GHz.
In return, I got 40 fps average at 4 GHz.
The benchmark configuration was 1680x1050, all Very High, 32-bit (you get higher fps than 64-bit), and DX10.
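
For what it's worth, a straight line through those two points works out to about 0.8 fps per 100 MHz, which is in the same ballpark as the 1 fps per 100 MHz quoted above. A throwaway estimate only; real scaling flattens out once the GPU becomes the limit:

Code:
# Back-of-the-envelope fit of the two numbers above: 32 fps @ 3.0 GHz, 40 fps @ 4.0 GHz.
fps_lo, ghz_lo = 32.0, 3.0
fps_hi, ghz_hi = 40.0, 4.0
fps_per_ghz = (fps_hi - fps_lo) / (ghz_hi - ghz_lo)   # 8 fps per GHz, ~0.8 per 100 MHz

def rough_fps(ghz):
    # Linear estimate only; ignores the point where the GPU takes over as the bottleneck.
    return fps_lo + fps_per_ghz * (ghz - ghz_lo)

print(round(rough_fps(3.6), 1))   # ~36.8 fps for a 3.6 GHz chip, by this crude estimate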
 
It's well known that all C2D/C2Q CPUs bottleneck a GTX 295; even an i7 at stock speed does the same. I think it's past 3.8 GHz on the i7 where you're able to see the card's full potential.

x1000

I, too, was hugely disappointed by the performance of the GTX 295 in many titles when I got it. I was only getting slightly higher FPS (albeit at higher detail settings) than I was with my 260, but not as much as I had hoped.

After doing some reading online, it seems that most CPUs (especially at stock) will bottleneck the hell out of that card. I started upping my front-side bus and finally maxed it out at 3.3 GHz (so far... I plan on taking another stab at it soon), and it made a pretty large difference in performance, to say the least. On a Q6600, it seems that the drop-off point, where OCing the processor doesn't really provide much more gain, is around the 3.6-3.8 GHz range.
 
Wow, I'm not sure what happened to the instant email notification for subscribed threads... 'cause I got nothing on this one.

Well, I decided to go for the Core i7 920. So I just picked up an Asus P6T, 6GB (3x2GB) of OCZ Gold 1600 RAM, and the i7 920 itself to complement the rest of my system. :) I should have known I couldn't get away with upgrading one or two things without feeling the need to upgrade it all :eek:

I did check to make sure SLI is enabled in Nvidia control panel, and I did a clean install of Win 7 x64 when I upgraded to the Intel SSD and GTX295. That was with the newest drivers (195.62 WHQL I think?), so I don't think that's the issue.

After I looked around on the web for some proof that more CPU would translate to more FPS, I also came upon that Tom's Hardware article independently, and that was what convinced me. However, even looking at the figures for what they're getting with the E8400 (33-34 FPS average at Enthusiast for 1920x1200), that's at least 5-10 FPS more than I'm getting, with pretty much the same system. And that's with me on an OC!

I did recently figure out that my RAM was bad; could that have affected it? I got >350 errors in Memtest86+ after about an hour. The weird thing is, if I hadn't gotten so many CRC errors during game installations and torrents, I probably never would have known.

Anyway, I'll post back once I get the new setup running and see if I'm more in line with what the benchmark reviewers show.

Thanks again for all the really good discussion here, it's been a big help (mainly in convincing me to buy a new CPU!)
 
By the way, does this look right:

Processor: Intel(R) Core(TM)2 Duo CPU E8400 @ 3.00GHz @ 3600 Mhz
CPU ID: Intel64 Family 6 Model 23 Stepping 6
Operating System: Microsoft Windows 7 Professional
Physical memory: 4.00 GB
Display adapter: NVIDIA GeForce GTX 295 896 MB
Driver version: 8.17.11.9562 (20091120000000.000000-000)

That's what I got for a diagnostic from the Framebuffer Crysis Warhead benchmarking tool. Of particular note to me was the memory listed for the GTX 295. It says only 896. Shouldn't it be double that? Or does that have to do with the fact that it's actually two GPUs?

Just ran the numbers again and here's what I got

DirectX 10 ENTHUSIAST 3X @ Map: ambush @ 0 1920 x 1200 AA 0xx
==> Framerate [ Min: 0.40 Max: 36.87 Avg: 26.29 ]
 
To overclock higher (rough FSB math sketch below):

-Up your voltage little by little until you finally get it
-Check your RAM speeds and also loosen (increase) the timings
-Keep the GPU voltage at 100
-HAVE FUN!
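
Since the multiplier is locked on these chips, the core and RAM speeds are just FSB multiplication. A scratch sketch for an E8400 (9x multiplier), assuming a 1:1 DDR2 memory divider:

Code:
# Scratch FSB math for an E8400 (locked 9x multiplier). Assumes a 1:1 DDR2 divider;
# other dividers change the RAM numbers, and none of this says anything about stability.
multiplier = 9
for fsb in (333, 400, 445):          # stock, 3.6 GHz, and roughly 4.0 GHz
    core_ghz = multiplier * fsb / 1000.0
    ddr2_rating = fsb * 2            # effective DDR2 speed at a 1:1 divider
    print("FSB %d MHz -> core %.2f GHz, DDR2-%d at 1:1" % (fsb, core_ghz, ddr2_rating))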
 
By the way, does this look right:

Processor: Intel(R) Core(TM)2 Duo CPU E8400 @ 3.00GHz @ 3600 Mhz
CPU ID: Intel64 Family 6 Model 23 Stepping 6
Operating System: Microsoft Windows 7 Professional
Physical memory: 4.00 GB
Display adapter: NVIDIA GeForce GTX 295 896 MB
Driver version: 8.17.11.9562 (20091120000000.000000-000)

That's what I got for a diagnostic from the Framebuffer Crysis Warhead benchmarking tool. Of particular note to me was the memory listed for the GTX 295. It says only 896. Shouldn't it be double that? Or does that have to do with the fact that it's actually two GPUs?

Just ran the numbers again and here's what I got

DirectX 10 ENTHUSIAST 3X @ Map: ambush @ 0 1920 x 1200 AA 0xx
==> Framerate [ Min: 0.40 Max: 36.87 Avg: 26.29 ]

The report is technically correct. Although the 295 has double the total memory, each GPU is independently allocated its own 896 MB of frame buffer. The memory is not 'shared': it's split evenly, with each GPU getting its own half to work with, so tools report the per-GPU 896 MB rather than the combined 1792 MB.
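
Restating the arithmetic (assuming, per the above, that each GPU only works out of its own 896 MB):

Code:
# The card's total memory vs. what a single GPU -- and so the benchmark tool -- sees.
per_gpu_mb = 896
gpu_count = 2
print("physical total:", per_gpu_mb * gpu_count, "MB")   # 1792 MB on the card
print("per GPU:", per_gpu_mb, "MB")                      # what the tool reports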

Now let's see the Fraps benchmarks with the i7 system, please. =)
 