GTX 295 - My Impressions

Mr. K6

UPS delivered my GTX 295 at 12:35 today :D. I've always enjoyed reading user impressions and reviews on forums. This time I actually bought a product on release, so I thought I'd return the favor. Here's some pics to start it off -



My initial impression is DAMN, this thing is heavy. Picking up the box felt like there was a brick in it. It's the standard 10.5" footprint of most NVIDIA high-end cards. The heatsink is coated in some kind of gummy paint; I don't really know what it is, I'll have to look into it. Installed in my Antec 900, you can see it just fits, with ~1 in. of room between the drive cage and the card. You can also see that it blocks the hard drive bay it sits across from. I also had to move my mobo/RAM fan back, as the intake for the card is on the top and bottom, which was an interesting change.

On first boot-up, I noticed the card is very quiet at stock. I run all my fans at 5-7V, so my PC is silent, and the fan at stock wasn't loud and has a comfortable tone.

I installed the newest WHQL drivers from the NVIDIA site (181.20, released yesterday, Jan. 8), Rivatuner, and Furmark. It's interesting to be back on the green team. Now on to clocking this sucker :D
 
So is the stock fan speed 40% at the minimum? You should get Rivatuner and try running the fan at 20% and see what your 2D idle temps & noise are like. If they are decent, I have a good way to set up your card to idle at 20% fan speed but ramp up nicely for 3D action via the low-level fan settings.

dan
 
Congrats on the card, let us know real-world performance. I like your setup. I was wondering where you got that silver adhesive that appears to be holding the fan on? Or the black ties holding the other one? I looked all over www.xoxide.com but couldn't find those mods anywhere.
 
So is the stock fan speed 40% at the minimum? You should get Rivatuner and try running the fan at 20% and see what your 2D idle temps & noise are like. If they are decent, I have a good way to set up your card to idle at 20% fan speed but ramp up nicely for 3D action via the low-level fan settings.

dan
I've used Rivatuner many times before, don't worry, I'll definitely be setting up a custom fan curve once I figure out where my clocks stand. However, if something's changed or I can't find an option, you'll see me posting :). I tried out 30% fan once I got Rivatuner installed, and it's dead silent. Idle temp rose from 41C to 44C, which isn't a big deal considering it's still on factory clocks and hasn't been undervolted.
Congrats on the card, let us know real-world performance. I like your setup. I was wondering where you got that silver adhesive that appears to be holding the fan on? Or the black ties holding the other one? I looked all over www.xoxide.com but couldn't find those mods anywhere.
That's because it's good old fashioned duct tape and cable ties that come with various electronic products (I've collected a whole stash of them over the years) :D.

I use the duct tape on the TRUE to seal the fan edges so it actually pushes most of its air through the heatsink rather than having it run off to the sides (due to how thinly spaced the fins in the TRUE are).

EDIT: So far so good, currently running 702MHz core, 1548MHz shaders, and 999MHz (stock) memory. I'm going to keep pushing the core and then I'll start on the memory :).
 
The heatsink is coated in some kind of gummy paint, don't really know what it is, I'll have to look into it.

This is what NVIDIA says about the board design concerning that paint:

The GeForce GTX 295 is shielded by a metal front cover for maximum
board protection. Soft touch paint is used for a matte look and feel.
 
EDIT: So far so good, currently running 702MHz core, 1548MHz shaders, and 999MHz (stock) memory. I'm going to keep pushing the core and then I'll start on the memory :).

The card is memory bandwidth limited, so IMO it is better to overclock the memory as high as it can go first, then the other stuff; the memory OC is going to be more important than the shaders and core unless you are folding or something.
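To put numbers on that: memory bandwidth scales linearly with memory clock, so every MHz on the memory pays off directly. A quick sketch, assuming the commonly listed GTX 295 specs (448-bit bus per GPU, GDDR3 at 2 transfers per clock); treat the specs as assumptions worth double-checking:

```python
# Per-GPU memory bandwidth for a given memory clock. Assumes the
# commonly listed GTX 295 specs: 448-bit bus per GPU, GDDR3
# (2 transfers per clock).

def bandwidth_gbs(mem_clock_mhz, bus_width_bits=448, transfers_per_clock=2):
    """Per-GPU memory bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

stock = bandwidth_gbs(999)    # stock memory clock from this thread
oc = bandwidth_gbs(1242)      # the overclock tested later in the thread
print(f"stock: {stock:.1f} GB/s, OC: {oc:.1f} GB/s (+{oc / stock - 1:.0%})")
```

With those assumptions the 999 -> 1242MHz memory overclock is roughly a 24% bandwidth gain, which is why it matters more than core/shader clocks on a bandwidth-limited card.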
 
This is what NVIDIA says about the board design concerning that paint:
lol, so much for it being some space-age heat-transfer coating :p. Thanks for finding that.
The card is memory bandwidth limited, so IMO it is better to overclock the memory as high as it can go first, then the other stuff; the memory OC is going to be more important than the shaders and core unless you are folding or something.
Yup, that's the plan. Out of habit I always start on the core/shaders. Once I find the max on those, I clock them back to stock and find the max on memory. Finally I try to combine the two, and you're right, in this case the preference is going to be for the memory clock.

EDIT: 702MHz started crashing after 5 minutes, so on to memory!
 
Ok, here's what I got. First I'll do individual testing, then combined. For testing I used Furmark, set to windowed, stability mode, 1920x1200 res with 4x AA. I used Rivatuner to adjust the clocks and fan. I've been using the same methodology for a while, and it's quick and easy. First I fire up Furmark and let the card heat up to its stable temp (temp doesn't change for a minute). Then I start increasing the clocks. Once I get artifacting or a crash, I move on to the next component.

Shaders - Started artifacting @ 1620MHz
Core - Driver crash/recover after 5 minutes of Furmark @ 702MHz (1512 shader)
Memory - Driver crash/recover @ 1296MHz

To test a combined overclock, I just estimate from the above data, subtracting 3-5% based on the evidence of crashing I saw. I tried out 679MHz core/1515MHz shader/1242MHz memory and it ran Furmark fine for 15 minutes, not a single problem. I now need to do some real-world, long-term testing to see how it works.
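The procedure above boils down to: step each clock up until the stress test fails, keep the last stable value, then derate by a few percent for the combined overclock. A rough sketch; `is_stable` stands in for a Furmark run and is a placeholder, not a real API:

```python
# Sketch of the per-component overclocking procedure: raise the clock
# in steps until instability, then back the per-component max off by
# a few percent for the combined OC.

def find_max_clock(start_mhz, step_mhz, is_stable):
    """Raise the clock in steps until instability; return last stable value."""
    clock = start_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

def combined_target(max_mhz, derate=0.04):
    """Back a per-component max off by ~3-5% for the combined overclock."""
    return int(max_mhz * (1 - derate))

# Illustrative numbers only: if shaders artifact at 1620MHz, the last
# stable 18MHz step below that is 1602, derated to ~1537 for combined use.
shader_max = find_max_clock(1476, 18, lambda mhz: mhz < 1620)
print(shader_max, combined_target(shader_max))  # 1602 1537
```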

Temps - This thing runs hot, no way around it. Using 100% fan, Furmark was bringing the core temps to 87C and 84C. GPU1 is ~3C lower than GPU0 consistently. 100% fan isn't that bothersome, with headphones I could probably game with it (although I won't).

Also note that I removed the black rubber grommet next to the card in the back of the case. Those holes were made for water cooling, but that hole lines up right with the auxiliary exhaust on the side of the card. The amount of air that comes through the hole after removing the grommet is quite substantial.

I have Left 4 Dead installed, so I'll try that now, next I'll install Crysis v 1.21 and give that a go. If anyone has any games they want to see tested or benchmarked, lemme know :D.
 
Any chance you have a 30"er?
That I do

Just played through Left 4 Dead No Mercy Campaign single player. 2560x1600, all options cranked. I started out at 4xMSAA, got 120+ FPS in most areas. I then tried 16xQ CSAA, still 120+ FPS :D. I think the game is mostly CPU limited then. Time to install Crysis :p.
 
run GTA IV and tell us how much memory it shows you allocated (display properties)
 
run GTA IV and tell us how much memory it shows you allocated (display properties)
Didn't buy it yet, based on the poor user reviews. I'd imagine it would say 896MB though :confused:.

Also an update, just switched the multi-GPU rendering option from "NVIDIA recommended" to AFR 1 and my FPS shot up 25% XD. Now playing L4D at 150FPS+ in most areas with 16xQ CSAA
 
That I do

Just played through Left 4 Dead No Mercy Campaign single player. 2560x1600, all options cranked. I started out at 4xMSAA, got 120+ FPS in most areas. I then tried 16xQ CSAA, still 120+ FPS :D. I think the game is mostly CPU limited then. Time to install Crysis :p.


:eek: Nice....Though I think you're right, L4D seems to be, overall, CPU limited.


Subbing this thread so I can see some more numbers later on. :cool:
 
I'm excited to hear about your general impressions of how smooth gameplay is and how stable it really ends up being overclocked. Thanks!
 
:eek: Nice....Though I think you're right, L4D seems to be, overall, CPU limited.
Subbing this thread so I can see some more numbers later on. :cool:
I'm excited to hear about your general impressions of how smooth gameplay is and how stable it really ends up being overclocked. Thanks!
Glad to help :D

Tested Crysis some, big frowny face there :(. Seems the drivers need some overall work with this game/engine. I first tried 2560x1600, everything set to Very High. I loaded the first level, just as Nomad walks onto the beach after falling, with that big tortoise walking by. Sadly, the FPS are all over the place. The average reads 38-40FPS, but the high is 60 and the low is 5-7, and it alternates many times a second, so it's very jumpy. I originally thought "crap, must be paging," so I lowered everything to High. The average FPS went higher, but same problem. Then I tried Medium, same problem. Then I thought "screw it, I'll lower the resolution." To sum it up: 1920x1200, same problem; 1680x1050, same problem. Note that at 1920x1200 the FPS fluctuates between 15FPS and 100FPS, but the same signs really; 1680x1050 is even higher, but same signs. Anyway, seems the drivers need a little tweak somewhere. I tried using all the different multi-GPU rendering methods, no real change.
 
Crysis is well known for not utilizing multi-GPUs well.
Any tips or tricks? :D

EDIT: Running clocks of 690/1476/1242 currently. I started artifacting a bit while playing L4D, and determined it to be the shaders, so I tried pushing the core a little more to compensate. So far so good.
 
Temps - This thing runs hot, no way around it. Using 100% fan, Furmark was bringing the core temps to 87C and 84C. GPU1 is ~3C lower than GPU0 consistently. 100% fan isn't that bothersome, with headphones I could probably game with it (although I won't).

84C @100% fan, the card is hawt :eek: My X2 also works at that temp but with 40% fan speed in games. The card is inside a case right? I guess that the hot air released back into the case does play a role there.
 
To the OP: have you noticed an increase in temps within your case (CPU, NB)?
Also, is there an option to run just one of the GPUs in the NV control panel?
 
Ok, got Far Cry 2 installed. Settings are 2560x1600, DX10, all settings as high as they can go (be it high, very high, or ultra high). I'm generally getting 40-70FPS with no AA, 35-60FPS with 2xAA and 25-55FPS with 4xAA. Everything is running very fluid. Right now I'm keeping my clocks at 675/1476/1242, so far so good.

What about microstuttering? Has that improved since the 9800 GX2?
From what I saw of microstuttering on a 9800GX2, no, I haven't seen anything like that on this GTX 295 so far. I've tested L4D, TF2, Far Cry 2, and Crysis, and with the exception of Crysis all games have been completely smooth.

84C @100% fan, the card is hawt :eek: My X2 also works at that temp but with 40% fan speed in games. The card is inside a case right? I guess that the hot air released back into the case does play a role there.
That's also because it's FurMark though XD. I'm running 60% fan in games and I haven't seen the card come close to 80C. The card is also quieter and has a slower/not so vacuum-sounding fan when compared to my 4870's fan; it's more of a whoosh.

To the OP: have you noticed an increase in temps within your case (CPU, NB)?
Also, is there an option to run just one of the GPUs in the NV control panel?
Nope, one of the first things I started monitoring every now and then. I haven't seen a single temp change really, but I'll keep monitoring it. I think after removing that grommet, pretty much all the hot air gets pushed out the back of the case anyway. Also, the Antec 900 has fantastic airflow, even with low speed fans.

Hope I answered your questions alright :D. Also, I've noticed that after running a game and going back to Windows, the card isn't downclocking itself. Is that a problem with the NVIDIA drivers, Rivatuner, or what? Anyway, more to come :).
 
Nice clocks and read :cool:

This card looks better every second I look at it. For those that are into the extreme side of overclocking, check this out :cool:
 
Great thread MR K6. Do you think leaving the "Nvidia multi rendering" method set to AFR1 is optimal?
 
Nice clocks and read :cool:

This card looks better every second I look at it. For those that are into the extreme side of overclocking, check this out :cool:
I was reading that the night before my card came, that is insane. I like how they jimmied it together to fit the pot XD.

Great thread MR K6. Do you think leaving the "Nvidia multi rendering" method set to AFR1 is optimal?
In my limited testing, AFR1 has been better than split screen, especially in source games. I don't know if AFR1 or AFR2 is better from there though, I haven't really tested it much but there didn't seem to be much of a difference.
 
Quote: "That's also because it's FurMark though XD. I'm running 60% fan in games and I haven't seen the card come close to 80C. The card is also quieter and has a slower/not so vacuum-sounding fan when compared to my 4870's fan; it's more of a whoosh.

Nope, one of the first things I started monitoring every now and then. I haven't seen a single temp change really, but I'll keep monitoring it. I think after removing that grommet, pretty much all the hot air gets pushed out the back of the case anyway. Also, the Antec 900 has fantastic airflow, even with low speed fans."

thank you Mr.K6
 
thank you Mr.K6
Glad to help :cool:

After some more testing last night, I think 675/1476/1242 are going to be my final clocks, as I haven't had a hiccup with them yet. Considering it's two PCBs running so close together, I'm impressed with the overclocking capabilities of the card(s).

I also found out that the culprit for the card NOT downclocking after gaming was Rivatuner. It seems that for some reason, once an app goes full screen Rivatuner messes with 3D/2D detection. Anyway, since I'm at stable clocks I decided to do my mods to the BIOSes and get rid of Rivatuner completely.

As far as BIOS flashing goes, it was very easy. Actually, it was incredibly easy. My thanks to Johan (aka Mavke) at mvktech.net for providing me with the newest Nibitor. It read the BIOSes and modded them without a single issue. I ended up adding in the clocks of 675MHz core, 1476MHz shaders, and 1242MHz memory. I also edited the fan control to have a minimum of 30% and to ramp the fan up sooner but over a longer range, to keep the noise low but the card cool.
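A fan curve like that is just a temperature-to-duty mapping with interpolation between a few set points. A minimal sketch of the idea; the temperature/duty points below are illustrative only, not the values in the actual BIOS:

```python
# Illustrative fan curve: 30% minimum duty for quiet 2D idle, ramping
# early and over a long range so 3D load stays cool without a sudden
# jump to full speed. Curve points are made-up examples.

def fan_duty(temp_c):
    """Map GPU core temp (C) to fan duty cycle (%)."""
    points = [(40, 30), (60, 50), (75, 75), (85, 100)]  # (temp, duty)
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if temp_c <= t1:
            # linear interpolation between adjacent curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]  # clamp at full speed above the last point

print(fan_duty(40), fan_duty(80))  # 30 87.5
```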

Anyway, everything is completely automatic now, no programs, just all through the BIOS. Ramping up and down, 3D/2D detection, everything works very well now. Time to test some more games :D.
 
Hey K6,
Nice card, great thread
Thanks :D

Just an update of what I've been seeing. Everything has been going great with this card. After updating the BIOS, having everything be automatic is extremely convenient, I'm very happy with the results. I also had to change my card clocks again. The memory just wasn't completely stable at 1242MHz, and after ~2 hours of gaming I would get the occasional glitch or driver recover. I backed the clocks down to 1224MHz and everything has been good since, so fingers crossed :D. Also a nice update - after patching Far Cry 2 to v1.02, the game runs even better. I'm playing at 2560x1600, DX10, every option at the highest setting, with 4xAA. I'm getting on average 60-70FPS, with highs in the 120s and lows of ~45. Absolutely amazing performance from this card.
 
A quick update for anyone else with this card: Crysis now runs like butter!

I was doing a bit of googling and bit-tech reported in their GTX 295 review that Crytek told them how to fix the problem: r_DynTexMaxSize=130

http://www.bit-tech.net/hardware/2009/01/08/nvidia-geforce-gtx-295-quad-sli-review/9

Setting this in the console makes Crysis run smooth as butter now, everything very high, 2560x1600, 2x edge AA, 30FPS. It's absolutely gorgeous and there is absolutely no stuttering.

Anyway, back to playing :D.
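For anyone who doesn't want to retype that each session: console tweaks like this can usually go in an autoexec.cfg in the Crysis install folder (a common CryEngine convention; the exact path/filename is worth double-checking for your install) so the setting applies at every launch:

```
r_DynTexMaxSize = 130
```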
 
Congrats! So you are running this with the 620HX? That is what I have and I just got in line for the step up to the 295GTX - sounds like I am making the right decision!
 
Temps - This thing runs hot, no way around it. Using 100% fan, Furmark was bringing the core temps to 87C and 84C. GPU1 is ~3C lower than GPU0 consistently. 100% fan isn't that bothersome, with headphones I could probably game with it (although I won't).

Those temps are with the overclock, right? What are your temps when running at stock settings in Crysis Warhead? I'm sure you've got adequate case airflow, but what case fans are you running, and how hot is it in the room you're testing this card in?

Do you have a watt meter so you can tell us how many watts this thing pulls from the wall at load and idle? Can you test GRID, CoD5, and Warhead, and comment on the overall gaming experience? Congrats and nice card / have fun.
 
Congrats! So you are running this with the 620HX? That is what I have and I just got in line for the step up to the 295GTX - sounds like I am making the right decision!
That I am :D. My CPU and RAM overclocks were stable before with my 4870, and I haven't noticed any problems yet.
Thanks for the info! I was thinking about picking two of these up for quad-SLI.
No problem :cool:, and wow, that'd be quite a purchase :eek:
Those temps are with the overclock, right? What are your temps when running at stock settings in Crysis Warhead? I'm sure you've got adequate case airflow, but what case fans are you running, and how hot is it in the room you're testing this card in?

Do you have a watt meter so you can tell us how many watts this thing pulls from the wall at load and idle? Can you test GRID, CoD5, and Warhead, and comment on the overall gaming experience? Congrats and nice card / have fun.
Thanks, the temps you quoted were from testing my overclocks using Furmark. Really, the whole purpose of using Furmark there is to heat up the card to get more accurate overclocking results. The card doesn't get nearly that hot in actual gaming. For example, playing Crysis Warhead, which is probably as intensive as it gets, the GPUs were hovering between 75-80C with ~65-70% fan (since fan speed is tied to GPU temp). I use an Antec 900 case with 88CFM Yate Loons running at <5V (so they're silent). I also have the big 200mm fan on "low." Room temp is normal, probably 21-24C (70-75F).

I don't have a watt meter unfortunately, but I don't think it's anywhere near enough to tax my 620HX. I also don't have GRID or CoD5 (might pick up CoD5 later though), but I have played the following:

Left 4 Dead - 2560x1600, 16xQ CSAA, everything maxed, 120FPS+ (completely CPU limited), runs perfect
Far Cry 2 - had some issues until I updated to v1.02, now 2560x1600, everything as high as it goes (ultra high, etc.), 4xAA, probably 50-60FPS average, lows of ~40, highs of 120+
Crysis Warhead - 2560x1600, everything on Enthusiast, 2x Edge AA (really brings out the foliage), ~30FPS average, lows of 20, highs of 60+. However, the game still has driver problems. One issue I had was loading the "Frozen Paradise" level: at about 80% the load bar stopped, and it took ~5 minutes to load the last 20%. When the level finally loaded, FPS was ~4-5 :confused: :eek:. Tried reloading, same long load time, same issue. Closed the game, re-opened it, reloaded the level: no problem, the whole level loaded in <30 sec and frames were back in the 30FPS range. Go figure.
Fallout 3 - 2560x1600, everything maxed, 8xAA, 60FPS+ always.

My overall gaming experience with this card has been fantastic, I'm very pleased with it. Let me know if you want more information or details.
 
Thanks, the results seem impressive, and those numbers are only going to get better as drivers mature. Temps seem very good. My only quarrel would be the possible loud fan noise from that card, as you seem to have to crank up the fan. Definitely the new king in town. ATI has 9.1 drivers slated for this week, but the gap I'm seeing from graphs/review sites, coupled with users from this forum, is way too big for a driver update to put the 4870X2 ahead. Hopefully this will result in price wars, i.e. cheap 4870X2s and cheaper GTX 295s, in the coming months. :)
 
Thanks, the results seem impressive, and those numbers are only going to get better as drivers mature. Temps seem very good. My only quarrel would be the possible loud fan noise from that card, as you seem to have to crank up the fan. Definitely the new king in town. ATI has 9.1 drivers slated for this week, but the gap I'm seeing from graphs/review sites, coupled with users from this forum, is way too big for a driver update to put the 4870X2 ahead. Hopefully this will result in price wars, i.e. cheap 4870X2s and cheaper GTX 295s, in the coming months. :)
Great products from both companies just make everything better for us consumers :D. As far as the fan goes, it's got one of the best "tones" or "sonic signatures" I've heard in a while. Like I said before, it's mostly a "whoosh" rather than a vacuum cleaner sound. It's not that loud, and even when it's cranked it's only white noise while wearing a headset.
 
Wait wait, you're running a GTX 295 on an HX620!?

Wtf, I have that PSU and they told me a GTX 280 would come close to maxing it out...

I thought the minimum for the 295 was 680W?
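NVIDIA's 680W figure is a conservative whole-system recommendation, not the card's own draw. A rough back-of-envelope; the GTX 295's TDP is commonly quoted at 289W, and the other figures below are ballpark estimates for an overclocked quad-core rig, not measurements:

```python
# Ballpark DC power budget against the HX620's 620W rating.
# Only the GTX 295 TDP (commonly quoted 289W) is a published figure;
# the rest are rough estimates for illustration.

budget = {
    "GTX 295 (TDP)":        289,
    "CPU (overclocked)":    130,
    "motherboard + RAM":     60,
    "drives + fans + misc":  50,
}

total = sum(budget.values())
print(f"estimated DC load: {total}W of 620W "
      f"({total / 620:.0%} of the PSU's rating)")
```

Under those assumptions a quality 620W unit with strong 12V rails can plausibly carry the load, which lines up with the OP's experience, though the headroom is thin.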
 
A quick update for anyone else with this card: Crysis now runs like butter!

I was doing a bit of googling and bit-tech reported in their GTX 295 review that Crytek told them how to fix the problem: r_DynTexMaxSize=130

I did post that same info with the same source to you...

http://www.hardforum.com/showpost.php?p=1033580502&postcount=101
 