AMD ATI Radeon HD 4870 X2 @ [H]

That fuzzy cube is more stressful than anything I've ever put on my X850. No idea how it performs on multi-GPU setups though. Maybe I should go download it and find out.

The fuzzy ATITool cube looks to do pretty well while keeping CPU load to a minimum. What I have used in the past for power testing is FurMark. You can control the resolution and MSAA from the control panel, and in full-screen mode it appears to load both cores on the 4870 X2. It will, however, fully load one CPU core in cycles, which should not be that big of a deal but surely needs to be noted.
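If anyone wants to sanity check that one-CPU-core behavior on their own box, here is a minimal sketch (my own quick hack, not anything official) that samples per-core CPU utilization with Python's psutil while FurMark runs full screen; the 30-second window and 1-second interval are just assumptions to tweak.

Code:
# Rough per-core CPU load logger to run alongside FurMark.
# Requires psutil (pip install psutil).
import psutil

SAMPLE_SECONDS = 30   # assumption: how long to watch; match your test run
samples = []
for _ in range(SAMPLE_SECONDS):
    # percpu=True returns one utilization figure per logical core,
    # and interval=1.0 makes each call block for one second.
    samples.append(psutil.cpu_percent(interval=1.0, percpu=True))

for core in range(len(samples[0])):
    avg = sum(s[core] for s in samples) / len(samples)
    print(f"Core {core}: {avg:.1f}% average load")

If one core sits near 100% while the rest idle, that is the FurMark behavior I was describing.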
 
yay!!! <claps>

thank you, it will make my days surfing the video card forums so much more efficient now without having to go through all the wasted bandwidth that B.C. created.

thanks again Kyle!

He had a short record here and already had multiple trolling / personal attack infractions. Now let's stay on topic please.
 
So from what I'm seeing, I should save the cash, get a 280, and have at least the same performance as a 4870x2?
 
So from what I'm seeing, I should save the cash, get a 280, and have at least the same performance as a 4870x2?

Not at all. The 4870 X2 is a faster card with rare exceptions. Crysis runs a little better on the NVIDIA cards, for example. The 4870 X2 also can handle 8xAA with almost no reduction in performance compared to 4xAA, which is something the NVIDIA cards do not do.

Take Call of Duty 4 for example. These are the results I got.

ATI Radeon 4870 X2

CoD4 Results: 2560x1600 4xAA, 16xAF, V-Sync Disabled

Minimum 45
Maximum 173
Average 83.018


CoD4 Results: 2560x1600 8xAA (Adaptive), 16xAF, V-Sync Disabled

Minimum 42
Maximum 164
Average 82.471


Geforce GTX 280 OC

CoD4 Single Player Bog Results: 2560x1600 4xAA, 16xAF, V-Sync Disabled

Minimum 35
Maximum 147
Average 63.468


Take these results with a grain of salt of course.
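If anyone wants to pull the same Minimum/Maximum/Average numbers out of their own runs, here is a rough sketch assuming a FRAPS-style frametimes CSV (frame number, then timestamp in milliseconds, with one header row); the file name and column layout are assumptions, so adjust to whatever your capture tool writes. Note the per-frame instantaneous min/max will read a bit spikier than per-second counts.

Code:
# Compute min/max/average FPS from a FRAPS-style frametimes log.
# Assumed format: CSV with a header row, then "frame,time_in_ms" lines.
import csv

times_ms = []
with open("frametimes.csv", newline="") as f:   # hypothetical file name
    reader = csv.reader(f)
    next(reader)                                # skip the header row
    for row in reader:
        times_ms.append(float(row[1]))

# Instantaneous FPS over each frame-to-frame interval
intervals = [b - a for a, b in zip(times_ms, times_ms[1:])]
fps = [1000.0 / dt for dt in intervals if dt > 0]

print(f"Minimum: {min(fps):.0f}")
print(f"Maximum: {max(fps):.0f}")
print(f"Average: {len(intervals) * 1000.0 / (times_ms[-1] - times_ms[0]):.3f}")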
 
So from what I'm seeing, I should save the cash, get a 280, and have at least the same performance as a 4870x2?

I wrote this line of the evaluation:

The fact is that gamers have never had a wider range of choices in quality 3D gaming video cards that span the $150 to $450 price points. One thing we hate to do at HardOCP is ride the fence, but after accounting for your video card budget, it is hard to make a bad choice currently.

And I think it is about 100% true. I do not think the GTX 280 is a bad choice in video cards. In fact, it is likely a good choice for future games that are shader heavy.
 
OK. For GPU power testing we will no longer use games. I do not like the results shown above, and we need to do a better job of rendering a static load across our cards for power consumption. This is one of those areas where real-world testing falls short in providing a metric.

We will use FurMark unless we find some other reason not to use it. (This is what I was previously using here to power test cards, but I ran into a snag with the 4870 X2 preview and moved away from it; I don't recall why at the moment. Brent has just moved out of state, so I no longer have easy access to the bevy of cards tested, and he will have to take on this task himself now due to his geographic location.)

Here are a couple of things I came up with this morning on 4870 X2 power testing:
  • The "GPU Activity" gauge in CCC is basically worthless for 4870x2. It will read ~100% when the card is not using both GPUs. Maybe it is reading one GPU and assuming an evenly distributed load.
  • The 4870x2 does NOT scale across both GPUs when the rendered scene is in a window. ATI has previously stated this.
  • When 3D is in a window, it seems that the 4870x2 uses it 2D clock settings.
  • ATI Tool seems a bit worthless in stressing a 4870x2 for the above reasons, even though it shows ~100% GPU Activity in CCC.

All of this information was gleaned from my personal 4870x2 card I have in my box this morning.

Setting MSAA at any level in FurMark seems to choke the shader process and bring down the overall wattage. The default 1280 resolution setting at full screen seems to generate the same wattage load you would see at 2560 (obviously the shaders are loaded on this card), but it would seem prudent to run FurMark at the highest resolution possible.

If you care to share any of your thoughts, I am certainly listening, as we need a better defined process than what we have had.
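One thought on the bookkeeping side, purely as a sketch and not our settled process: if you log wall-socket readings from whatever meter you trust into a simple "seconds,watts" CSV while FurMark runs full screen, something like this spits out the idle-versus-load delta. The file name, column layout, and the 60-second idle lead-in are all assumptions.

Code:
# Average wall power before and after FurMark goes full screen.
# Assumed input: "power_log.csv" with "seconds_since_start,watts" lines.
import csv

LOAD_START = 60.0   # assumption: FurMark launched 60 s into the log

idle, load = [], []
with open("power_log.csv", newline="") as f:    # hypothetical log file
    for t, w in csv.reader(f):
        (idle if float(t) < LOAD_START else load).append(float(w))

idle_avg = sum(idle) / len(idle)
load_avg = sum(load) / len(load)
print(f"Idle average: {idle_avg:.1f} W")
print(f"Load average: {load_avg:.1f} W")
print(f"Load delta:   {load_avg - idle_avg:.1f} W")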
 
Brent or Kyle, I guess it's better directed @ Brent since he did the testing.

Did you find any issues with "PowerPlay", i.e. downclocking of one of the cores?

I noticed in the screenshots on page 8 that the "current clock settings" were 507 and 500. (http://enthusiast.hardocp.com/article.html?art=MTU0OSw4LCxoZW50aHVzaWFzdA==)

The reason why I ask: I can't seem to get both of my cores running at the stock clocks of 750 & 900, and that's with CrossFire and without.
 
Brent or Kyle, I guess it's better directed @ Brent since he did the testing.

Did you find any issues with "PowerPlay", i.e. downclocking of one of the cores?

I noticed in the screenshots on page 8 that the "current clock settings" were 507 and 500. (http://enthusiast.hardocp.com/article.html?art=MTU0OSw4LCxoZW50aHVzaWFzdA==)

The reason why I ask: I can't seem to get both of my cores running at the stock clocks of 750 & 900, and that's with CrossFire and without.

Mine stay at 507/500 when I am in 2D or windowed 3D. How exactly are you trying to load the GPUs to get them to scale? Also, you say "both of my cores"; are you monitoring them separately, and if so, how?
 
And to be clear, mine DO scale to 750/900 when in fullscreen 3D. You can see this if you have your desktop extended to a second monitor.
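If it helps anyone watch for that 507/500 to 750/900 transition without juggling a second monitor, here is a rough sketch that walks a GPU-Z style "log to file" sensor CSV and prints every clock change. The log file name and the exact column headers are assumptions since they vary by tool and version; any monitor that writes clocks to a CSV would work the same way.

Code:
# Print clock transitions from a GPU-Z style sensor log CSV.
# Column headers vary, so we just look for ones mentioning the clocks.
import csv

LOG = "GPU-Z Sensor Log.txt"   # assumption: adjust to your log file name

with open(LOG, newline="") as f:
    rows = [[c.strip() for c in r] for r in csv.reader(f) if r]

header = rows[0]
core_i = next(i for i, h in enumerate(header) if "Core Clock" in h)
mem_i = next(i for i, h in enumerate(header) if "Memory Clock" in h)

last = None
for row in rows[1:]:
    if len(row) <= max(core_i, mem_i):
        continue
    clocks = (row[core_i], row[mem_i])
    if clocks != last:
        print(f"{row[0]}  core/mem -> {clocks}")   # row[0] assumed to be the timestamp
        last = clocks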
 
Brent or Kyle, I guess it's better directed @ Brent since he did the testing.

Did you find any issues w/ "power play?" ie. downclocking of 1 of the cores?

I noticed in the SS's on page 8 that "current clock settings" were 507 and 500. (http://enthusiast.hardocp.com/article.html?art=MTU0OSw4LCxoZW50aHVzaWFzdA==)

The reason why I ask, I can't seem to get both of my cores to be running @ the stock clocks of 750 & 900, and that's w/ cross fire and w/ out?

I was seeing this as well after I'd first installed: the clocks would at times seem to stay at 507/500 even when running a 3D app. I noticed this when I was getting sporadic low FPS and scores in Vantage while monitoring with ATITool. It was strange: sometimes I would get full throttle, and sometimes the clocks would stay at 2D. I'd restart and get a different result.

The official 8.8 drivers seemed to have solved this for me.
 
I was seeing this as well after I'd first installed: the clocks would at times seem to stay at 507/500 even when running a 3D app. I noticed this when I was getting sporadic low FPS and scores in Vantage while monitoring with ATITool. It was strange: sometimes I would get full throttle, and sometimes the clocks would stay at 2D. I'd restart and get a different result.

The official 8.8 drivers seemed to have solved this for me.

I'm wondering if that's the issue I'm having. (Sporadic low performance.) Though the 8.8 Release drivers haven't solved that for me.
 
i.e. I run 3DMark Vantage & 06 just to test whether both GPUs are running @ stock clocks, and I get sporadic GPU scores, 14K to 19K, in both 06 and Vantage

btw I am using official 8.8's as well

also, I've tried this with XP 64 and Vista 64; going to try XP 32 tonight and see if I get the same downclocking issue
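One quick way to put a number on "sporadic" (just a suggestion, and the scores below are made up for illustration): run the benchmark a handful of times, write the scores down, and look at the spread between best and worst. A card that sometimes stays at 2D clocks shows a much bigger gap than normal run-to-run noise.

Code:
# Quantify how "sporadic" repeated benchmark scores are.
scores = [14100, 18900, 19050, 14300, 18800]   # made-up example runs

mean = sum(scores) / len(scores)
spread_pct = (max(scores) - min(scores)) / mean * 100

print(f"Runs: {len(scores)}  mean: {mean:.0f}")
print(f"Best: {max(scores)}  worst: {min(scores)}  spread: {spread_pct:.0f}% of mean")
# A spread far beyond a few percent suggests the clocks are not
# ramping up consistently between runs.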
 
This might be a stupid question.... but... how does a single 4870 X2 compare to (2) 4870's in CrossFire mode? I would like to get a single 4870 now and upgrade in Jan or something like that. Will the performance be similar?

-JRW
 
hmm, I just run a test and then exit out, just to check to see if the clocks have changed.

I guess I can run a 2nd monitor to make sure.

Thanks for the suggestion. I'm not too familiar with ATI/CCC, and coming from NVIDIA/EVGA Precision I could see how the clock speed downclocks and then goes to the stock/overclocked clocks when 3D mode is initiated.

I can see where it might change too quickly to catch with a single monitor, and remember, they will not change with 3D in a window currently, from my testing.
 
This might be a stupid question.... but... how does a single 4870 X2 compare to (2) 4870's in CrossFire mode? I would like to get a single 4870 now and upgrade in Jan or something like that. Will the performance be similar?

-JRW

[H] hasn't done it yet that I know of, but other sites put them on par until you get to the higher resolutions and AA. It seems that there is a slight performance hit with the 1GB until then. If you're on less than a 30" monitor, I would consider them pretty well equal.
 
Stupid question, but have you perchance tried opening up two windows of ATITool? It should be obvious from the power draw or the FPS counter whether they are loading the two different cores.

Been there, tried that. I don't see any way to run two instances easily, but I have not researched it either. But again ATI has told me directly that 3D in a window will NOT scale across two cores.
 
This might be a stupid question.... but... how does a single 4870 X2 compare to (2) 4870's in CrossFire mode? I would like to get a single 4870 now and upgrade in Jan or something like that. Will the performance be similar?

-JRW


That is up next. Tell Brent to hurry up!

The big deal there is the 4870 not having 1GB of RAM. Given that "special" builds are coming out with 1GB, I would suspect them to be very much on par with each other at high resolutions with AA.
 
I'm not sure what the average gamer has for monitors but I have two HP w2207h monitors that have a native resolution of 1680x1050. I'd like to see more reviews that have lower resolutions and what card I could buy that would give me the most bang for my buck with upcoming titles like Far Cry 2. I plan on upgrading my entire system when the Core I7 processor hits the $500 mark.
 
I'm not sure what the average gamer has for monitors but I have two HP w2207h monitors that have a native resolution of 1680x1050. I'd like to see more reviews that have lower resolutions and what card I could buy that would give me the most bang for my buck with upcoming titles like Far Cry 2. I plan on upgrading my entire system when the Core I7 processor hits the $500 mark.
well, the i7 is supposed to be well under $300 for the 2.66. Also, for 1680x1050 the 4870 X2 is basically overkill, and you would be getting the same framerates at 1920 anyway.
 
Yeah, well the idea was it would run 1 core per window, not that it would scale 2 1/2 cores per window.

Yes, I get your idea. But typing it and doing it are two different things.
 
Mine will be here tomorrow :p

Funny thing is, I have a 680i motherboard, and I still wanted this over rehashed, last-gen GTX 260's in SLI.
 
I found with my 4850 that AA set in game (HL2 in my case) felt like it dropped the frame rate so the game was not as smooth. But if I turned off "Use application settings" in the control panel and manually set it to 8x Edge detect then I got great AA with no performance hit.

When you tried AA in games like Crysis, did you do that in-game or the above method?
 
I found with my 4850 that AA set in game (HL2 in my case) felt like it dropped the frame rate so the game was not as smooth. But if I turned off "Use application settings" in the control panel and manually set it to 8x Edge detect then I got great AA with no performance hit.

When you tried AA in games like Crysis, did you do that in-game or the above method?

Apparently AA through CCC does not work with Crysis; you have to use the in-game AA options.
 
Why was the X2 tested vs. a pre-overclocked GTX? Maybe because they thought it was unfair to test a multi-GPU card vs. a single-GPU card, maybe because they wanted the GTX numbers to be higher than they would be at stock?

I don't know, that's why I am asking..?

If you want overclocked comparisons, it is only fair to OC both cards by equal percentages.

What would have made the review better is if they had done the tests with stock X2's and stock GTX's, and then done the tests with both cards overclocked by, say, 15% each, or 20% each. Or whatever..

So long as the overclocks were equal in percent, it would have been fair.

But to benchmark a stock card vs. an overclocked card is unfair.
Edit: Or maybe they could have added a stock GTX to the review to show a stock GTX, GTX OC, and stock X2 side by side..

meh.

From the other thread. It's a good question....
 
lol, I don't even understand where people got the notion that a 4870 X2 could match GTX 280 SLI. Hell, a single HD 4870 is equal to a GTX 260, so unless SLI scaling completely ****s up in a game, GTX 280 SLI reigns supreme.

i lol'ed too. First, it wasn't mine, but there were so many people pushing that opinion that you started to give it... well, a little credit. I thank [H] deeply for clearing the cloud.
 
Well, I certainly hope 280 SLI will outperform an X2 considering the huge amount of cash invested in the setup... but... there are several instances where the X2 rips through 280 SLI, and that's not bad either :)
 
Nice review, but I would have liked to see the data for two 280's operating in SLI vs. the 4870 X2 too.
 
Nice review, but I would have liked to see the data for two 280's operating in SLI vs. the 4870 X2 too.

The numbers for the 280 GTX SLI: Min: About 20 Max: HOLY SHIT Average: About 50% higher than the 4870x2

Clearly I'm talking out my ass, but I bet I'm pretty close ;) That transfers over to all games in the [H] arsenal. If the 4870 X2 on its own was trading blows with the GTX 280, clearly the SLI setup would slaughter the X2.
 
That is the 'Special Featured Article' we need to see, done in [H]'s style of review and using current drivers for both... :)

The 4870 X2 vs. (2) 280's in SLI data is frequently requested on the net, but it's hard to find a decent review.
 
From the other thread. It's a good question....

Do you know how many GTX 280's come overclocked? Almost all of them. And you don't pay extra for it if you look in the right spot. So why take away a natural advantage?
 
Probably already said, but why no 280 GTX SLI comparison... I mean, you are comparing the flagship of ATI to the 2nd tier (SLI) of NVIDIA.

Sorry, but the 4870 X2 is 2 GPUs; you need to compare it apples to apples.
 
I hope not.

Answer my question..



Reviews, especially a [H] review, are about objectivity and fairness. Testing an OCed card vs. a stock card is hardly that.

So, once again, why?

What is fair about neutering a card? You might as well have said "disable the extra shaders on the 6800GT when comparing it to the vanilla" back in the day. Like, seriously, whatever it has, it gets. Here's another example: car X costs $30,000 and has ~370 hp; car Y costs the same and only has ~270. What do you do? You say car X slaughters Y in terms of performance, because it does. You don't say "well, car Y looked a bit better, and if car X were slowed down to its level it would be better"

Run what you brung. Stop complaining because nV vendors don't have to ship stock cards ;)

Probably already said, but why no 280 GTX SLI comparison... I mean, you are comparing the flagship of ATI to the 2nd tier (SLI) of NVIDIA.

Sorry, but the 4870 X2 is 2 GPUs; you need to compare it apples to apples.

The X2 ends up being VERY close to a single GTX 280 in performance; I see no issue with the comparison :confused:

But to answer WHY it wasn't done other than that: an SLI/CrossFire showdown is in the works, a few posts above this ;)
 
While I wouldn't presume to speak for Kyle on the subject, the BFG Technologies Geforce GTX 280 OC card has a minuscule overclock. It has a 13MHz GPU core overclock, a 54MHz shader overclock, and the memory is run at stock speeds. That overclock doesn't bring the performance up an appreciable amount over the reference-clocked Geforce GTX 280.

I have two of those cards and they aren't any better than the reference-clocked cards. If we were talking about the BFG Technologies Geforce GTX 280 OC2 or OCX cards, that would be another matter, but the standard OC card has an overclock that's not even worth mentioning.
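To put rough numbers on how small that is, here is a quick back-of-the-envelope calculation against the commonly quoted GTX 280 reference clocks of 602MHz core / 1296MHz shader (treat those baselines as approximate and check your own card's defaults):

Code:
# Rough size of the BFG GTX 280 OC factory overclock, as a percentage.
reference = {"core": 602, "shader": 1296}   # MHz, commonly quoted defaults
overclock = {"core": 13, "shader": 54}      # MHz bumps mentioned above

for domain, bump in overclock.items():
    pct = bump / reference[domain] * 100
    print(f"{domain}: +{bump}MHz on {reference[domain]}MHz = +{pct:.1f}%")
# Roughly +2% core and +4% shader, which lines up with it not being
# an appreciable real-world difference.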
 
Not a good answer; in fact, there is no good answer anyone could give.
So I won't ask anymore.

Testing a stock card vs. an overclocked one, without data for overclocking the X2 equally, was simply a bad call. Period, done, no more..

/thread.
 
While I wouldn't presume to speak for Kyle on the subject, the BFG Technologies Geforce GTX 280 OC card has a minuscule overclock. It has a 13MHz GPU core overclock, a 54MHz shader overclock, and the memory is run at stock speeds. That overclock doesn't bring the performance up an appreciable amount over the reference-clocked Geforce GTX 280.

I have two of those cards and they aren't any better than the reference-clocked cards. If we were talking about the BFG Technologies Geforce GTX 280 OC2 or OCX cards, that would be another matter, but the standard OC card has an overclock that's not even worth mentioning.

It doesn't matter how much it was OCed; all that matters is that it was not at stock clocks, so the comparison is not fair.

Don't think I'm trying to defend the X2 here, because I already have one and I am happy with it.

All I'm trying to say and understand is why on earth, in all the fairness and objectivity that reviews are SUPPOSED TO BE about, especially from [H], you would do the benchmarks without data for an overclocked X2 as well.

Even if it was 1-2%, it would have still made the comparison fair, added more data to the review, and made the review better and more complete.

I did look forward to the [H] review more than any other, but once I saw it wasn't stock vs. stock, and there was NO DATA for the X2 being overclocked by the same percentage, I stopped reading. Useless.

I would be saying the same thing if [H] had done the reverse and kept the GTX 280 @ stock while OCing the X2. I'm not defending the cards or results; I'm saying what's fair and what's not with respect to what is expected from reviews.
 
Probably already said, but why no 280 GTX SLI comparison... I mean, you are comparing the flagship of ATI to the 2nd tier (SLI) of NVIDIA.

No. The Geforce GTX 280 is the flagship NVIDIA card. The ATI Radeon 4870 X2 is ATI's flagship card.

Sorry, but the 4870 X2 is 2 GPUs; you need to compare it apples to apples.

I can't believe people still think this way. The 4870 X2 is a single card. The Geforce GTX 280 is a single card. The 9800GX2 is a single card. Why does it matter if the card has one or two GPUs on it? The Geforce GTX 280 and the Radeon 4870 X2 are both single cards. That's apples to apples.
 