XFX 5870: My Impressions

Mr. K6

Supreme [H]ardness
Joined
Mar 23, 2005
Messages
5,077
UPDATE: Back in full swing, baby :D

So I finished up my work and exams for the week, and finally got home to tinker with my new toy, an XFX 5870 fresh from NCIXUS :cool:. I enjoy reading user experiences with new hardware on the forum, so, since I picked up a 5870 on launch, I thought I'd return the favor and post mine. I'll try to post information beyond what you'll find in official reviews from enthusiast websites, so I'll include impressions, thoughts, personal opinions, and more. I'm currently installing Crysis (oh you knew that was coming :D), so I'll throw up my initial impressions and installation experiences as well as some pics:


OK, so going through the pictures. First, the box is HUGE. Easily the biggest retail video card box I've had. It measures ~14"x10"x4.5". XFX really has a nice presentation with its packaging. The card is well packed and surrounded by foam; included with the card are installation literature, a driver CD, a DIRT 2 coupon (not really a fan, probably will sell it), a "Do Not Disturb" gag doorknob hanger, 2 PCIe 6-pin connectors, a Crossfire bridge, and a DVI->VGA connector. The card itself has a very different feel to it, both literally and aesthetically - this full shroud is a big leap for ATI. The card is not as heavy as I would have thought (feels lighter than the 4870, IIRC, but maybe it's because the weight is distributed over a larger area), and the plastic really is enclosing a lot of air. I know there was a lot of talk about the small vent - well, aha, no worries. If you look at the quick picture I took (excuse the quality), the heatsink actually ends a good inch before the vent. So you really have a pocket of air back there, and as air comes through the fins it can either exit through that vent in the back OR through the vent on the side in the back.

Installed, there's not much to say. You can see by the pics it is as long as it's been reported, and overhangs my mATX board by about an inch. Also note there's pretty much zero clearance with the DIMM slots, as well as the SATA ports in the front. But it still fits, so it works for me :). The V351 made wiring this bugger a bit difficult, but I was still able to hide most of the wires and it's snug, but there's still plenty of room. I was happy I didn't have to take out the 120mm fan :).

I grabbed the latest drivers off the ATI website and they installed without issue. One thing that's annoying me: where the hell is color saturation in these new drivers? I thought maybe my HD4350 wasn't reading the 9.9's correctly, so I installed 9.4 to get that option back. Now I don't see it on these new drivers with the 5870 - where did ATI put that option? Anyway, currently idles at 35C with 21% fan speed, which is ridiculous. The fan is incredibly quiet, I can't hear it next to my 5V Yate Loons. Also, power consumption of my entire PC went up a whole 10W at idle (according to my Kill-a-Watt) after moving from the HD4350 to the 5870 - just plain ridiculous.

That's it for now, I'll keep posting through the night as I test the card. Next up will probably be some overclocking and then gaming impressions :). Also, post your requests and questions, I'd be happy to help. If you want some pictures or information, or are really dying to see some game benchmarked, let me know :).

Cheers,
Mr. K6
 
Which driver are you using now? From what I hear, the one on the CDs bundled with the cards is a modded 9.9 driver for the 5870, so how are you using 9.4, and why? The Cat 9.10 driver the press received (release 8.66 RC6) supposedly has performance improvements over 9.9. I hope AMD releases it quickly and gets it out there to you guys, but sadly it won't be until at least October.
 
Which driver are you using now? From what I hear, the one on the CDs bundled with the cards is a modded 9.9 driver for the 5870, so how are you using 9.4, and why? The Cat 9.10 driver the press received (release 8.66 RC6) supposedly has performance improvements over 9.9. I hope AMD releases it quickly and gets it out there to you guys, but sadly it won't be until at least October.

Looks like MSI has released the 9.10 8.660 RC7 according to the rage3d forums:

http://www.rage3d.com/board/showthread.php?t=33952980

Direct links to the drivers:

Vista/Windows 7

http://download1.msi.com/files/downloads/dvr_exe/ATI_8.66RC7_Win7_Vista.zip
or
http://download2.msi.com/files/downloads/dvr_exe/ATI_8.66RC7_Win7_Vista.zip

XP
http://download1.msi.com/files/downloads/dvr_exe/ATI_8.66RC7_XP.zip
or
http://download2.msi.com/files/downloads/dvr_exe/ATI_8.66RC7_XP.zip
 
Wow, I appreciate everyone's enthusiasm, but it looks like the review stops here :(. I think I got a bad card. After a couple of reboots in between game and app installs, the memory started corrupting on the desktop (standard multicolored checkerboard pattern). God damn it, it figures. It seems it only does it when the card clocks down into 2D mode (the RAM can't handle the low speed?). I've been troubleshooting for about the last two hours with no luck. I guess I'll call up XFX support and see how great their 5 star customer support is (it says so on the box :D).

Ah well, I'll update it with the status and go from there. We'll put this on a hiatus for now.
 
I hope you get a quick RMA. It may be quicker sending it back to NCIXUS.
 
I hope you get a quick RMA. It may be quicker sending it back to NCIXUS.
I considered the exact same thing, and if it was Newegg, I'd try them first. However, they're in Vancouver, Canada and I'm in MA. Also, their phones aren't open until Monday; XFX's are tomorrow (I already sent in a support ticket so they have the info when I call tomorrow). I'm hoping for a quick turnaround/advance replacement to see if their support really is all it's cracked up to be. It'll be a learning experience :).
What drivers were you using?
The ones dated Sept. 21, 2009 on the ati.amd.com website and the ones from the CD. Is it a known driver issue? I'm googling atm.
 
The ones dated Sept. 21, 2009 on the ati.amd.com website and the ones from the CD. Is it a known driver issue? I'm googling atm.

Don't know if it is or not, but it can't hurt to try the newer ones. Some initial driver releases can be a little quirky. I know when I had my 3870X2 it took them a few tries to get things settled. Especially when simultaneously dealing with a new hardware release.
 
Don't know if it is or not, but it can't hurt to try the newer ones. Some initial driver releases can be a little quirky. I know when I had my 3870X2 it took them a few tries to get things settled. Especially when simultaneously dealing with a new hardware release.
Good point, worth a try.
Woke up to take a leak and thought I'd check the thread :D. I'm downloading the new driver now and I'll re-install the card in the morning and hope for the best :cool:.
 
Lols. No chance the X2 will fit in the V350 I guess. Hope this "newer tech = bigger cards" trend falls off somewhere.
 
Sorry to hear it, I bought mine from the Cdn. side. I'm using the cd drivers on XP32, and the RC6 from AMD on Win7RC. I haven't noticed the issue you're having though. I'm using the top DVI connector. Smooth sailing so far :)
 
I looked at your photos.

Before you give up, I'd check the card temps, you don't have much space for the card to breathe.
Maybe take out that fan, or run it outside the case to check on artifacts.

Also, look for some other drivers. I bought the 4870 X2 when it was released and had similar experiences... the thing was shit with the release drivers... I spent several hours searching out drivers and finally found a beta that cured my ills.

I hate to say it, but these new releases from ATI can sometimes challenge your patience.

Chin up man.:D
 
WOOOOOOOHOOOOOOOOOOOOOOOOOoooooo! It was a driver issue. I'm not sure if it was due to the newer driver fixing it, or the fact that I went through and thoroughly purged my system of old AMD drivers (used Driver Sweeper and manual searching), but she's back to working perfectly. Of course, now everything says MSI on it, but w/e :p. And there's still no damn saturation option in the drivers :(. Anyway, we're back on schedule for a full day of benchmarking, gentlemen :D. I appreciate all the help and support :).
Lols. No chance the X2 will fit in the V350 I guess. Hope this "newer tech = bigger cards" trend falls off somewhere.
Did the 4870X2 fit, though? I mean, in its defense, it is a small form factor case.
I looked at your photos.

Before you give up, I'd check the card temps, you don't have much space for the card to breathe.
Maybe take out that fan, or run it outside the case to check on artifacts.

Also, look for some other drivers. I bought the 4870 X2 when it was released and had similar experiences... the thing was shit with the release drivers... I spent several hours searching out drivers and finally found a beta that cured my ills.

I hate to say it, but these new releases from ATI can sometimes challenge your patience.

Chin up man.:D
Temps aren't a problem, I'm idling at 35-36C @ 21% fan speed. I haven't even put her into 3D yet =o. Like I said, I don't know if it's ATI's drivers or my screwed up computer. I guess now is a good time to put in this preface - I'm running a new system (Core i5 based) off the Vista install of my old system (Q6600 based). I don't know how drastically this affects performance, but I guess glitches will happen if I don't make sure everything is clean as a whistle.

That said, I'm rebooting to double check that it still works afterwards, and I'm downloading the AMD GPU Clock Tool to start clocking this beast :).

EDIT: Rebooted fine, starting up Furmark to get her hot. I've been reading on XtremeSystems that due to the new memory error correction algorithm, one has to watch for the FPS when clocking the RAM, rather than for artifacts. Should be interesting :).
 
Overclocking testing:

I'm just going to log things as I see them:

FurMark is loading the card at stock speeds (850/1200) to 86C at 40% fan speed. To be honest, this is downright amazing. I haven't owned a video card from either camp that could weather FurMark this well. I'm very excited to start clocking.


EDIT 1: It hardlocked at 920MHz on the core. However, the fan was only at 44% and the temps were about 86-87C. That said, it looks like a voltage limitation vs. a cooling one. That makes this card an overclocker's dream come true; I can't wait for the new RivaTuner :D. For the remainder of testing, I'll be leaving the card at 900MHz core for a stable gaming clock. Onward to the memory :).
 
FurMark, at least in my testing, is pretty much the hottest, highest-power load you will ever put on a video card. In typical gaming, temps/power draw are lower than what you will see in FurMark. FurMark takes the card to its maximum, but in gaming you'll really never pull the same temps/power continuously.
 
FurMark, at least in my testing, is pretty much the hottest, highest-power load you will ever put on a video card. In typical gaming, temps/power draw are lower than what you will see in FurMark. FurMark takes the card to its maximum, but in gaming you'll really never pull the same temps/power continuously.
That's my experience too. I used to overclock based on the assumption that if it's stable in FurMark, it'll be stable in pretty much anything else. That usually saves me from crashing once I get to game testing and having to go back and retry different clocks. That said, the great temps I'm getting are even more promising.
 
We use FurMark for overclocking too; like you said, if it's stable there, it's stable everywhere. And you're right, you now have to watch the framerate when overclocking the memory, and the point right before the framerate decreases is your highest stable overclock. This only applies to the memory, though; it doesn't affect the core clock.
 
We use FurMark for overclocking too; like you said, if it's stable there, it's stable everywhere. And you're right, you now have to watch the framerate when overclocking the memory, and the point right before the framerate decreases is your highest stable overclock. This only applies to the memory, though; it doesn't affect the core clock.
That's exactly what I noticed, although it's tougher to tell than I would have thought. 1310MHz on the RAM hardlocked, so I know that's the upper limit. 1300MHz was running fine, but not noticeably better than 1275MHz, which could be the nature of the beast, or could be the error-checking kicking in. One thing I did notice is that the core ran cooler (81-82C as opposed to ~85-86C) at 1300MHz during the FurMark benchmark. I'm wondering if it would be better to do Crysis benchmark runs to get a quantitative result (with a couple decimal places) to compare the performance of different memory clocks. I'm going to try it now.
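Since the new error correction hides artifacts, the "watch the framerate" approach really boils down to stepping the memory clock up and stopping just before FPS drops. Here's a toy sketch of that search - a pure simulation with made-up numbers, not real clocking code:

```python
def find_best_mem_clock(fps_at, start, stop, step):
    """Step the memory clock up; keep the last clock whose FPS didn't drop."""
    best_clock, best_fps = start, fps_at(start)
    clock = start + step
    while clock <= stop:
        fps = fps_at(clock)
        if fps < best_fps:
            break  # error-correction retries are eating bandwidth: back off
        best_clock, best_fps = clock, fps
        clock += step
    return best_clock

def simulated_fps(mem_mhz):
    # Toy model: FPS scales with clock until ~1280MHz, then retry overhead hurts
    return mem_mhz / 50 if mem_mhz <= 1280 else 1280 / 50 - (mem_mhz - 1280) / 25

find_best_mem_clock(simulated_fps, 1200, 1400, 25)  # -> 1275
```

With a real card you'd replace `simulated_fps` with an actual FurMark or Crysis benchmark run at each clock step.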
 
So I forgot to save the 1.2 Crysis patch on my computer, and I'm downloading that now. While waiting, I found the saturation option :D. You have to right-click on the display itself in the "Desktop and Displays" section, and then select Configure. It's there under AVIVO Color. Man, everything is so bright and happy now :D.
 
OK, first gaming update. I'd like to preface all of these results with a couple of caveats, especially if you're looking for comparisons. My last gaming config was a Q6600 @ 3.6GHz, DDR2 @ 1080MHz with 5-5-5-15 timings, and my GTX295 at various overclocks. I'm currently using my new rig, which is Core i5 based. One of its drawbacks is that I'm still stuck using the stock Intel cooler, which limits my overclock. Still, at stock voltage (ish, the Gigabyte BIOS won't read my VID correctly), I made it to 3.4GHz stable. The specs I'm running are: Core i5 750 @ 3.4GHz, G.Skill PC16000 @ 2000MHz 8-8-7-21, and the 5870 @ 900/1275.

So, Crysis :D

I'm running DX10 32-bit, 2560x1600, everything on Very High, with 2x Edge AA enabled. Overall, framerates are comparable to my overclocked GTX295. Generally 20-30FPS throughout the game, usually in the 25-27FPS area. If I had to point out any differences, I'd say the 5870 is smoother. Now this might be due to several factors, like the new CPU and RAM, but generally the minimum FPS is not nearly as low as I remember from the GTX295, though the maximum FPS isn't as high either. For instance, looking straight up at the sky used to get me over 60FPS; now it's at ~45FPS. There's a lot less jerky/loading motion for sure. Temps are great. The card's loading at 77C and only 31% fan speed. That's absolutely fantastic. I think I'll try out Warhead and see if it's a better comparison.
 
Thanks for sharing your experiences! I always enjoy reading them.

Crysis with AA on Very High at 2560x1600? That's very impressive! That must look awesome; is it perhaps possible to show us a screenshot of how it looks in-game? You don't have to do this if you don't want to.

Congratulations on your new card and I hope you have a great time with it.:)

I'm probably going for a HD 5850 when I buy a new computer; I hope it will be able to play Crysis at "Enthusiast" without AA at 1920x1200. But depending on how things go regarding my budget, I might try to get a HD 5870.
 
We use FurMark for overclocking too; like you said, if it's stable there, it's stable everywhere. And you're right, you now have to watch the framerate when overclocking the memory, and the point right before the framerate decreases is your highest stable overclock. This only applies to the memory, though; it doesn't affect the core clock.

How long do you have to let Furmark run to know it will be stable in games?
I ran FurMark for an hour with my 9600 GSO 512MB overclocked to 650/1600, stable. But when I actually played games, it crashed out. I found a stable gaming setting at a more conservative 600/1500. And obviously FurMark also passes that lower setting.
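That's a good argument for treating FurMark as necessary but not sufficient: a candidate overclock should pass every workload you care about before you call it stable. A minimal sketch of that idea, with a toy pass/fail model mirroring the 9600 GSO numbers above (the workload names and thresholds are made up for illustration):

```python
# Hypothetical stability sweep: an overclock is "stable" only if it survives
# every workload, not just the synthetic stress test.
WORKLOADS = ["furmark_1hr", "crysis_bench", "l4d_timedemo"]

def stable_everywhere(run_workload, clocks):
    """True only if every workload passes at the given (core, mem) clocks."""
    return all(run_workload(w, clocks) for w in WORKLOADS)

def simulated_run(workload, clocks):
    # Toy model of the post's experience: FurMark passed at 650/1600,
    # but games only held up at 600/1500.
    core, mem = clocks
    if workload == "furmark_1hr":
        return core <= 650
    return core <= 600

stable_everywhere(simulated_run, (650, 1600))  # False: games crash
stable_everywhere(simulated_run, (600, 1500))  # True
```

In practice `simulated_run` would be a real benchmark loop that returns False on a crash, hang, or artifact.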
 
Thanks for sharing your experiences! I always enjoy reading them.

Crysis with AA on Very High at 2560x1600? That's very impressive! That must look awesome; is it perhaps possible to show us a screenshot of how it looks in-game? You don't have to do this if you don't want to.

Congratulations on your new card and I hope you have a great time with it.:)

I'm probably going for a HD 5850 when I buy a new computer, I hope it will be able to play Crysis at "Enthusiast" without AA at 1920x1200. But depending on how things go regarding my budget, I might try to get a HD 5870.
I'm happy to share :). I think it's very impressive as well, considering it's only a single GPU. Ask and you shall receive: here are a couple of screenshots from the "Relic" level (I think it's called "Recovery" in-game) :D.


Your question about the 5850 piqued my curiosity. For the hell of it, I also "mimicked" a 5850 by dialing the clocks down to a 725MHz core and 1000MHz RAM. Granted, this isn't a direct comparison, since the 5850 has fewer shader clusters, but how different can it be? Anyway, I used the same settings (everything on Very High, 2x Edge AA) at 1920x1200 with the 5850 clocks, and I was getting ~5-8FPS more on average than my 2560x1600 runs. So I'd say you'll be running 30-35FPS no sweat, even higher if you don't use Edge AA (I love this setting, it makes the vegetation look so much more real). For those of you at 1920x1200 who want the 5870, I also tested my current overclock of 900/1280 and I was getting 35-42FPS or so. The 5870 is definitely a 1920x1200 killer.
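For a rough sense of how close that "mimic" is to a real 5850, peak shader throughput is just shaders x clock x 2 (one MADD per stream processor per clock). A quick back-of-envelope using the public spec numbers (1600 SPs for the 5870, 1440 for the 5850) - this ignores the memory clock difference entirely:

```python
def gflops(shaders, core_mhz):
    # Peak single-precision throughput: 2 flops (one MADD) per shader per clock
    return shaders * core_mhz * 2 / 1000

hd5870 = gflops(1600, 850)  # 2720.0 GFLOPS
hd5850 = gflops(1440, 725)  # 2088.0 GFLOPS
mimic  = gflops(1600, 725)  # 2320.0 GFLOPS - a 5870 downclocked to 725MHz

round(hd5850 / mimic, 2)  # -> 0.9: a real 5850 has ~90% of the mimic's shader power
```

So the downclocked 5870 is a slightly optimistic stand-in, but close enough for ballpark FPS estimates.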

I also tested some more in Warhead, with pretty much the same results. I want to say the FPS is comparable, maybe a little bit lower than my overclocked GTX295 (~95% or so). However, it's SOOOOOOO much smoother. Honestly, it feels like playing at 60FPS when I'm at 25FPS. Another interesting thing I noticed is that Crysis quick loads are instantaneous. I've never had that happen before, and it's quite amazing.

More to come :).
 
Thank you for showing us your results. By Monday, I'll have my Asus 5870 and a nice 26" monitor arriving, and this thread is making me all the more excited. My big question is how much of a real-life difference voltage tweaking will bring. My goal is to get the core to 1000MHz and the memory to 5200 effective (1300*4), and to see if this card really is bandwidth starved as some have speculated. Have fun with your new card, and please keep us updated.
 
Thank you for showing us your results. By Monday, I'll have my Asus 5870 and a nice 26" monitor arriving, and this thread is making me all the more excited. My big question is how much of a real-life difference voltage tweaking will bring. My goal is to get the core to 1000MHz and the memory to 5200 effective (1300*4), and to see if this card really is bandwidth starved as some have speculated. Have fun with your new card, and please keep us updated.

My XFX is coming Monday too... can I ask which 26" monitor you are getting? Did you review all the current monitors? Thanks, Mark
 
The monitor is actually 25.5", but I didn't want to be bothered with two extra keystrokes. I didn't really shop around all that much and chose this one because I was already at Newegg ordering the 5870, and my cousin bought one about 3 weeks ago and has been very happy with it. It's a TN screen, but the lower quality compared to high-end monitors doesn't bother me. Maybe if I got into photography or video work it wouldn't be good enough, but for gaming and general work, it's more than adequate.

http://www.newegg.com/Product/Product.aspx?Item=N82E16824236047
 
Quick update - Tested some L4D - no surprise it handles it fine :p. Using 2560x1600, everything cranked, 8x MSAA and 16x AF, getting 80-160FPS depending on the area. However it's SMOOOOTH. Markedly different experience when compared to playing with the GTX295. I don't know if it's how the game handles multi-GPU, or where the issue is, but there's a major difference in the "feel" of the game.
 
Quick update - Tested some L4D - no surprise it handles it fine :p. Using 2560x1600, everything cranked, 8x MSAA and 16x AF, getting 80-160FPS depending on the area. However it's SMOOOOTH. Markedly different experience when compared to playing with the GTX295. I don't know if it's how the game handles multi-GPU, or where the issue is, but there's a major difference in the "feel" of the game.

Wait... so you own both a GTX295 and a Radeon 5870, and you'd recommend the 5870 over the 295 for 30" gaming?
 
Quick update - Tested some L4D - no surprise it handles it fine :p. Using 2560x1600, everything cranked, 8x MSAA and 16x AF, getting 80-160FPS depending on the area. However it's SMOOOOTH. Markedly different experience when compared to playing with the GTX295. I don't know if it's how the game handles multi-GPU, or where the issue is, but there's a major difference in the "feel" of the game.

hehe, guess I did the right thing to sell GTX 295..

btw, when you test Crysis, did you use 64bit binary?
 
Wait... so you own both a GTX295 and a Radeon 5870, and you'd recommend the 5870 over the 295 for 30" gaming?
Well, more so owned a GTX295 (sold it to buy the 5870). Like I said earlier, I don't know if it's proper to use my results to compare the 5870 to the GTX295 because my hardware has so drastically changed. I mean, I can DEFINITELY recommend a 5870 + i5 over a GTX295 + Q6600, but which upgrades are adding to which parts of the experience is more difficult to determine. I'd like to say that the 5870 contributes mostly to the smoothness, just from past experience with single and multi-GPU set-ups. However, other things, like Crysis quick-loading instantaneously, might be more due to my RAM running at 2000MHz 8-8-7-21 (compared to 1080MHz 5-5-5-15) rather than any video card change.
hehe, guess I did the right thing to sell GTX 295..
btw, when you test Crysis, did you use 64bit binary?
Not yet; I never really saw any appreciable difference between the two (sometimes even a performance loss when going to 64-bit). However, that doesn't mean that's the case now, so let me fire it up :D. I'm also installing STALKER: CS and Fallout 3, so those will both be coming soon :).
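As a quick sanity check on that RAM comparison above (back-of-envelope math, not a benchmark): absolute CAS latency is CAS cycles divided by the real clock, and since DDR transfers twice per clock, the real clock is half the effective rate. So the new kit actually has slightly lower latency despite the much looser timings, on top of nearly double the bandwidth:

```python
def cas_latency_ns(effective_mts, cas_cycles):
    # DDR moves data twice per clock, so the real clock is half the effective rate
    real_clock_mhz = effective_mts / 2
    return cas_cycles / real_clock_mhz * 1000  # nanoseconds

old = cas_latency_ns(1080, 5)  # DDR2 @ 1080MHz effective, 5-5-5-15 -> ~9.26 ns
new = cas_latency_ns(2000, 8)  # DDR3 @ 2000MHz effective, 8-8-7-21 -> ~8.0 ns
```

Which is consistent with the faster RAM, not just the video card, contributing to those instantaneous quick loads.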
 
Good to know it was a driver issue.;)

Sounds like you have a very nice deal there... please let me know if you ever try out Eyefinity. I am very interested in pursuing a 5870 X2 and three 24" Dell monitors... when the 5870 X2 is released.
 
I'm happy to share :). I think it's very impressive as well, considering it's only a single GPU. Ask and you shall receive: here are a couple of screenshots from the "Relic" level (I think it's called "Recovery" in-game) :D.


Your question about the 5850 piqued my curiosity. For the hell of it, I also "mimicked" a 5850 by dialing the clocks down to a 725MHz core and 1000MHz RAM. Granted, this isn't a direct comparison, since the 5850 has fewer shader clusters, but how different can it be? Anyway, I used the same settings (everything on Very High, 2x Edge AA) at 1920x1200 with the 5850 clocks, and I was getting ~5-8FPS more on average than my 2560x1600 runs. So I'd say you'll be running 30-35FPS no sweat, even higher if you don't use Edge AA (I love this setting, it makes the vegetation look so much more real). For those of you at 1920x1200 who want the 5870, I also tested my current overclock of 900/1280 and I was getting 35-42FPS or so. The 5870 is definitely a 1920x1200 killer.

I also tested some more in Warhead, with pretty much the same results. I want to say the FPS is comparable, maybe a little bit lower than my overclocked GTX295 (~95% or so). However, it's SOOOOOOO much smoother. Honestly, it feels like playing at 60FPS when I'm at 25FPS. Another interesting thing I noticed is that Crysis quick loads are instantaneous. I've never had that happen before, and it's quite amazing.

More to come :).



Thank you for the screenshots and detailed reply. :)

The screenshots look really beautiful, and I'm very happy to hear about the performance you got with a lower-clocked HD 5870 at 1920x1200. Even though the HD 5850 has, like you said, fewer shader clusters, it makes me very hopeful about its performance. I barely use AA, so I'm quite excited at the moment that the HD 5850 can get good framerates even at "Enthusiast" settings.

I'm also really happy to read about the smoothness you experience in the game; the HD 5870 is really a beast of a card.

Thanks again. :)
 
Quick update - Tested some L4D - no surprise it handles it fine :p. Using 2560x1600, everything cranked, 8x MSAA and 16x AF, getting 80-160FPS depending on the area. However it's SMOOOOTH. Markedly different experience when compared to playing with the GTX295. I don't know if it's how the game handles multi-GPU, or where the issue is, but there's a major difference in the "feel" of the game.

Agreed. L4D is much smoother than with my 4870X2. I can't believe it. I'm sure the X2's Crossfire was working, but the 5870 gives a much better experience at 30" with everything cranked.
 
FurMark, at least in my testing, is pretty much the hottest, highest-power load you will ever put on a video card. In typical gaming, temps/power draw are lower than what you will see in FurMark. FurMark takes the card to its maximum, but in gaming you'll really never pull the same temps/power continuously.


Completely agree. My card never exceeds 55C in gaming (8800GT 760/1890/1015 w/ DuOrb cooler) but will easily break 80C in FurMark within the first 10 minutes before it finally hard locks. The program's a beast, but great for stress testing.
 