ATI Radeon HD 3870 X2 @ [H]

So many of you are telling me the "cut scene" runs fast while the actual gameplay is slow. WTF, does that even make sense? I'm so confused.

Well, is [H] using the same driver as AnandTech does?
At least in Unreal Tournament III with my 8800 GTS 640MB @ 650/2000, "cut scenes" run much faster than actual gameplay.
 
You're basing this on a presupposition, one that is unproven: that in-engine cutscenes have no performance relation to real gameplay.

I suspect they do; in fact, I suspect the relation is strong. It certainly bears a hell of a lot more investigating than is going on here.

Otherwise you're suggesting that somehow the X2 is running the COD4 cutscene much faster than it runs the actual game, relative to the NVIDIA card. And since AMD couldn't have known what Anand would bench, they couldn't have optimized for it, so this performance discrepancy of 60% or more would have had to just happen naturally.

Yeah, that's a whole other kettle of highly unlikely fish.

Oh, and I almost forgot: have Anand's and other sites' benchmarks historically been so far off from [H]'s? I strongly suspect not, so once again one wonders why they are suddenly only now being labeled as drastically inaccurate... using the same methods.

Also, Kyle's entire thesis vis-à-vis Crysis performance strongly suggests that AMD is cheating the benchmark. But where is that proof? Shouldn't such cheating be fairly easy to discover if it exists?

We'll never get to the answer here, but I suggest we take both sides with a grain of salt.

It kinda seems like you think [H] is biased toward NVIDIA. If you read a couple of threads you'll see that, before this review, a couple of guys from [H] were interested in the 3870 X2.
 
Wow, I usually don't even bother looking at [H] reviews anymore, but this one was particularly bad. They only benched two cards, and over four games? Methinks this "best playable settings" nonsense is so difficult that Kyle got lazy and just drastically cut down the amount of benchmarking he actually does. Most other sites benchmark at least a half dozen cards over 8-10 games, giving much more data. For example, Kyle's review doesn't even tell us if the X2 is faster than a 3870 alone; we just have to take his word for it. Let alone any other cards. I pity anybody who would try to make an X2 buying decision on Kyle's review alone, as it tells us almost nothing. Luckily Kyle must know myriad other sites with real benchmarks are just a mouse click away, otherwise I doubt he would have done such an unenlightening, stripped-down review.

I have always said that if [H] wants to do something different from the umpteen sites doing benchmarks, then he should use a format similar to the other sites but with only manual FRAPS benchies. This best-playable-settings, apples-to-oranges crap has needed to go from day one.

Thanks for still reading. Sorry you can't wrap your head around it being done the right way. Don't feel bad; there are still a lot of people that do not grasp the value of our evaluations, or even have the ability to understand that we spend tremendously more resources doing it the way we think is right. I wish I could run a bunch of canned benchmarks and look in the mirror in the morning and know I did right by our readers, but I can't. It would be far cheaper to do it the easy and fast way.

But I am glad that you come back and give us the page hits, because that is exactly what we need in order to support our efforts. So, thanks for hating us and coming by to tell us how bad we are, because it still puts money in our pockets to help us pay for the next POS article we will do for you. :D
 
You're basing this on a presupposition, one that is unproven: that in-engine cutscenes have no performance relation to real gameplay.

I suspect they do; in fact, I suspect the relation is strong. It certainly bears a hell of a lot more investigating than is going on here.

Otherwise you're suggesting that somehow the X2 is running the COD4 cutscene much faster than it runs the actual game, relative to the NVIDIA card. And since AMD couldn't have known what Anand would bench, they couldn't have optimized for it, so this performance discrepancy of 60% or more would have had to just happen naturally.

Yeah, that's a whole other kettle of highly unlikely fish.

Oh, and I almost forgot: have Anand's and other sites' benchmarks historically been so far off from [H]'s? I strongly suspect not, so once again one wonders why they are suddenly only now being labeled as drastically inaccurate... using the same methods.

Also, Kyle's entire thesis vis-à-vis Crysis performance strongly suggests that AMD is cheating the benchmark. But where is that proof? Shouldn't such cheating be fairly easy to discover if it exists?

We'll never get to the answer here, but I suggest we take both sides with a grain of salt.

Oh, and finally, Anand did at least one game test that fully lived up to everything [H] demands: a FRAPS bench of live gameplay, in BioShock, and the X2 won handily. Of course, since Brent didn't bench that game, we can't compare anyway.


You can make these arguments all day long, but all I have to say is that we played the actual game to evaluate the card's performance in said game. You can argue all day long about your unproven results, but we know we are right, and our evidence suggests others are wrong. I can live with that. Now go on arguing about what you do not have the ability to prove.
 
I wouldn't call it laziness. I would, however, like to see how it scales across multiple resolutions. And a Quad Core test bed would be nice too :)
Would be nice to have both Apples to Apples and Apples to Oranges.
Bottom line though, canned benchmarks suck.

HEAR, HEAR...!

Now, there's a thought!!!

With and without AA, please.

And please, more 'APPLES to APPLES' comparisons!!!
Without them it won't be complete!!
 
I would reserve judgement on this card until the 8.45 v1.3 drivers are released, because AMD said those drivers would increase performance across the board. Until then, you cannot come to any conclusions about this card.
 
Anyway, my contention with these cards was 2x 3870 X2s vs. 2x 8800 Ultras, and it seems the Ultras would beat the former easily.

Yes, and they would also cost over twice as much, as well as not being a "single card," for that matter.

Anyway, dual-GPU cards always kind of suck, whether it's NVIDIA's (the 7950 GX2, which sucked) or AMD's. Lots of driver problems, terrible inefficiency, etc. They typically only seem to exist as a stopgap. The question now is where the true next-gen cards are.
 
Usually cutscenes are not doing the same types of things that happen in-game. An easy comparison many of you can see for yourselves: check out the Video Stress Test in Counter-Strike: Source. Is that at all like the actual gameplay of CS:S? Where's the gunfire, grenades, barrels flying everywhere, bots, etc.?
 
Thanks for coming out and being a good ol' fanboy. LOL

I've been reading [H] reviews for God knows how long, but I just became a forum user this week. LOL

Now let me ask you: why exactly is it stupid that they benched two cards? It simply states that ATI did not catch up to the 8800 GTX or 8800 Ultra (something that came out a year and a half ago).

We compared the two cards we did because they are close in price, which makes for a great comparison. I saw little reason to compare anything else. ATI was aiming at NVIDIA's Ultra, but the cost is still out there. You can argue the Ultra is the flagship, but it is still so expensive that I don't think many people give it honest buying consideration. The Ultra is a non-product in my mind that just does not carry the value for the price. When you move down to the GT/GTS, the 3870 X2 is just too expensive to be compared there. It does not carry value at that price point.

And thanks for registering btdvox, glad to have you aboard.
 
I would reserve judgement on this card until the 8.45 v1.3 drivers are released, because AMD said those drivers would increase performance across the board. Until then, you cannot come to any conclusions about this card.

Meh, there's always some "miracle" driver waiting in the wings from both camps. My question is why these miracle drivers never find their way to the launch date. Surely when the boards go off to be reviewed, that's when you want the card to be performing at its absolute best? I know performance generally does increase, but I doubt it's to the degree that companies like to claim.
 
I'm not gonna get into the methodology debate, but rather the choice of benchmarked products:

Why was the 3870 X2 tested against the 8800 GTX only? It should have been tested against the 8800 GT or GTS 512, as they offer much higher value than the GTX; then against 2x 8800 GT SLI, as that can be bought for not much more; and against the 3870 in CrossFire, to see if native support gives anything.
 
Wow, this thread reminds me a lot of the 2900 XT one. Sad to see the performance is lacking... hopefully NV will release a new monster anyway.
 
So many of you are telling me the "cut scene" runs fast while the actual gameplay is slow. WTF, does that even make sense? I'm so confused.

That CoD4 cutscene is literally 30 seconds long, with you staring at the guy in front of you while he smokes a cigar. That does not show how well these cards do in actual gameplay at all. Meanwhile, our run-through is an entire level of running and gunning.
 
I'm not gonna get into the methodology debate, but rather the choice of benchmarked products:

Why was the 3870 X2 tested against the 8800 GTX only? It should have been tested against the 8800 GT or GTS 512, as they offer much higher value than the GTX; then against 2x 8800 GT SLI, as that can be bought for not much more; and against the 3870 in CrossFire, to see if native support gives anything.

The 8800 GTX is price comparable.
 
Isn't that why people tweak their hardware (CPU, GPU, and memory) and try to squeeze out the last frame per second, to have their rig perform to the max?

I am sure that AA is useful and recommended at times, but at the resolutions you test GPUs at these days, AA is more of a hindrance than a benefit when it comes to first-person shooters.

How did the cards perform WITHOUT AA? Substantially faster, or not enough to justify leaving the AA off?
I am sure there are some people out there who would like to know.

Also, about those "sh!tty" canned tests. Do you suspect they are set up so that they favor one card over another?
As long as the same ones are used for benchmarks, do they not give a relative display of how the respective cards perform in those tests?
Sure, it's nowhere near an accurate picture of gameplay performance, but at least you should get a glimpse of the card's performance compared to others.

We evaluate video cards in terms of the best experience they can produce. We look at it from an immersive experience angle. This is totally subjective.

If no AA is your thing, then I fully understand. Maybe we turn off AA, turn off AF, turn the resolution down to 640x480 and see which one does best?

One thing I do know about those shitty canned benchmarks is that most of the time they in no way represent the gameplay you will experience.

I think we just showed you what the card did compared to the other when it comes to really playing a game. If you need a canned benchmark to explain that to you, then HardOCP is not the site for you. Run away now and never come back; you will likely be happier if you do.
 
How do we know from Kyle's benchmark that he wasn't doing heavier firefighting while testing the 3870 X2 compared to testing the 8800 GTX? That would make a world of difference. I think these sites should start posting video reviews when it comes to benchmarking.
 
Great review. I'm glad HardOCP is doing the testing that actually matters. It doesn't matter what numbers you get with a canned benchmark; what matters is the numbers you get in actual gameplay. No one can play a canned benchmark, even if it WERE entirely accurate to the performance you "should" get.

It's funny to see people flaming [H] because they are fans of ATI and it's tough to see them fail repeatedly (I should know, I was a fan of ATI until the G92s came out), but they should be glad. [H] just saved people the grief of buying the HD 3870 X2 thinking it was going to wipe the floor with the GTX.

I agree that there should be an 8800 GT SLI and 8800 GTS comparison though; the 8800 GTX only proves that it's still a good card, but the 8800 GT SLI and 8800 GTS are what's important to people who are looking to buy a new graphics card.
 
Video reviews would be an interesting addition to the standard text reviews. :)
 
As Kyle has mentioned, I'd like to reiterate: timedemos/cutscenes/flybys do NOT equal real-world gameplay performance and behavior. We launch the game and we play the game, just as you do when you play the game. We record framerates each second, find the highest settings that are playable, and report all of this information. It is as real as it gets; we use these video cards for what they are meant for: playing games.

You will find that timedemos/cutscenes/flybys give different results compared to actually playing the game. As I also mentioned in the article, I believe future drivers can improve R680 performance; there is more potential there, IMO, to be delivered. It is new. Give it time. I am confident performance will increase further, and I certainly want it to.
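For anyone curious what "record framerates each second" boils down to, here's a minimal sketch in Python of summarizing a FRAPS-style per-second log for one manual run-through. The file name and one-sample-per-line format are assumptions for illustration, not [H]'s actual tooling:

```python
# Minimal sketch: summarize a per-second FPS log for one manual
# run-through. Assumes a plain text file with one FPS sample per line
# (hypothetical format; not [H]'s actual tooling).

def summarize_run(path):
    with open(path) as f:
        fps = [float(line) for line in f if line.strip()]
    return {
        "seconds": len(fps),         # length of the run-through
        "avg": sum(fps) / len(fps),  # average framerate over the run
        "min": min(fps),             # worst-case dip (playability matters here)
        "max": max(fps),
    }

stats = summarize_run("bog_runthrough_fps.txt")  # hypothetical log file
print(f"{stats['seconds']} s run: avg {stats['avg']:.1f} fps, "
      f"min {stats['min']:.0f}, max {stats['max']:.0f}")
```

The minimum matters as much as the average under this method: "highest playable settings" is judged by whether the dips stay tolerable, not just by the mean.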
 
I was reading Anand's review. Apparently people only see the graphs and not what was used to test the GPU. Who the hell buys a GPU to watch cutscenes, run the built-in benchmarks, or watch flybys? Of course AMD/NVIDIA will optimize for those types of things, as it's easier. BioShock was at least done manually, but just a quick run through a small section isn't good enough. The other "games" they used didn't even list how they tested them:

Bioshock: "Our Bioshock test involves a quick run through a section of the Medical Pavillion. Since the game has no built-in benchmark, our runthrough is manual and we rely on FRAPS to measure the average frame rate. All of the enemies are already dead in the area so we don't run the risk of running into one of them during our test."

CoD4: "lacks any sort of in-game benchmark so we benchmark the cut scene at the beginning of the first mission"

Crysis: "Crysis ships with a built-in GPU benchmark"

Enemy Territory: "We used our island benchmark for ET: Quake Wars as always"

UT3: "We used the built-in vCTF flyby in Unreal Tournament 3"

Witcher: "ran FRAPS during the game's first major cutscene at the start of play"
 
Strange that the [H] review is so different from the other reviews that have come out. Tom's, Anand, and Hardware Canucks are the only others I've read so far, but they all show the 3870 X2 winning pretty much across the board, even in Crysis. [H] says they got new drivers from ATI but didn't use them for the review? It seems you might want to redo the review with the new drivers you received, as your data does not match up with anyone else's.
 
As Kyle has mentioned, I'd like to reiterate: timedemos/cutscenes/flybys do NOT equal real-world gameplay performance and behavior. We launch the game and we play the game, just as you do when you play the game. We record framerates each second, find the highest settings that are playable, and report all of this information. It is as real as it gets; we use these video cards for what they are meant for: playing games.

You will find that timedemos/cutscenes/flybys give different results compared to actually playing the game. As I also mentioned in the article, I believe future drivers can improve R680 performance; there is more potential there, IMO, to be delivered. It is new. Give it time. I am confident performance will increase further, and I certainly want it to.

I agree. AMD said that the 8.45 v1.3 drivers will improve the performance of the card, especially in Crysis.
 
LOL, wow, kapow! I've never really liked the way [H] did their benchmarking either, but I understand the logic behind it, kinda. Take any review with a grain of salt; there's always some kind of variable. I hate Crysis. It wasn't the awesome game everyone thought it would be; it was just another game with awesome requirements. I used to be a gamer at heart, but got all caught up in an upgrading frenzy. For the real-life gamers, I believe [H] does great reviews for their benefit, as they do the real-world tests along with the best-possible-settings kind of thing. I've become a real hardware whore and will switch between cards just to play with them (no, I am not filthy rich, far from it). So canned benchmarks and demos are the way to go for that, as they are "universal," like people that love showing off their 3DMark scores. AMD/ATI is trying hard, but it just doesn't seem like it's enough. When will quad-core GPUs come out? Bwahaha! Now that would be exciting! That's if software devs use it. And where the heck has the PhysX stuff gone? Like someone said in another room already: die shrinks and more cores, nothing really revolutionary. Where are all the people that think outside the box? I'm really disappointed in the graphics card manufacturers at this point. Hopefully Matrox, Intel, or some other company weasels in and grabs our attention, because these super-expensive duels between ATI and NVIDIA just aren't that exciting anymore.
 
Great review. I'm glad HardOCP is doing the testing that actually matters. It doesn't matter what numbers you get with a canned benchmark; what matters is the numbers you get in actual gameplay. No one can play a canned benchmark, even if it WERE entirely accurate to the performance you "should" get.

I agree that there should be an 8800 GT SLI, 8800 GTS, and maybe even an 8800 GTS SLI comparison though.


We have to make a call and work inside the time constraints we have. I think an 8800 GT SLI matchup would be good, as you can do that for about the same price as a 3870 X2, but we just don't have the time. If we took an afternoon and ran a bunch of canned benchmarks, we could give you as much meaningless data as other sites did, but that is not our style. So we have to give our readers the best we can with the time limitations we have to work with. We had to redo ALL our 3870 X2 testing with the 2nd "new" driver they sent us, THEN we had to go back and do checks with the 3rd "new" driver AMD sent us. If all we did was run a 30-second timedemo, that might not be a big deal, but the way we do it, it means some major work on Brent's part that has to be done precisely.
 
Bioshock: "Our Bioshock test involves a quick run through a section of the Medical Pavillion. Since the game has no built-in benchmark, our runthrough is manual and we rely on FRAPS to measure the average frame rate. All of the enemies are already dead in the area so we don't run the risk of running into one of them during our test."

WTF, may as well load up an empty map or just stare at a wall.
 
As Kyle has mentioned, I'd like to reiterate: timedemos/cutscenes/flybys do NOT equal real-world gameplay performance and behavior. We launch the game and we play the game, just as you do when you play the game. We record framerates each second, find the highest settings that are playable, and report all of this information. It is as real as it gets; we use these video cards for what they are meant for: playing games.

You will find that timedemos/cutscenes/flybys give different results compared to actually playing the game. As I also mentioned in the article, I believe future drivers can improve R680 performance; there is more potential there, IMO, to be delivered. It is new. Give it time. I am confident performance will increase further, and I certainly want it to.


Anand did a FRAPS run of in-game play in BioShock and some other games, and the 3870 X2 still trounced everything. Something is definitely not right here... you guys might want to retest with those newer drivers you received.
 
Strange that the [H] review is so different from the other reviews that have come out. Tom's, Anand, and Hardware Canucks are the only others I've read so far, but they all show the 3870 X2 winning pretty much across the board, even in Crysis. [H] says they got new drivers from ATI but didn't use them for the review? It seems you might want to redo the review with the new drivers you received, as your data does not match up with anyone else's.


NO, IT IS NOT STRANGE AT ALL. We don't test using real-world gameplay for "funsies." We test with it because, these days, canned benchmarks less and less often represent actual gameplay.

Go buy a card based on those reviews. It is a good product, and unless you have a 30" monitor, you don't have enough pixels to ever see the card's limitations. Then you can say they were right and we were wrong and feel good about it. :)
 
So did HardOCP use the very newest AMD drivers sent to reviewers for Crysis in the published benchmarks, or not? So far I'm getting that the answer is no?
 
Anand did a FRAPS run of in-game play in BioShock and some other games, and the 3870 X2 still trounced everything. Something is definitely not right here... you guys might want to retest with those newer drivers you received.

RTFA. WE DID TEST WITH THE NEWER DRIVERS.
 
I was reading Anand's review. Apparently people only see the graphs and not what was used to test the GPU. Who the hell buys a GPU to watch cutscenes, run the built-in benchmarks, or watch flybys? Of course AMD/NVIDIA will optimize for those types of things, as it's easier. BioShock was at least done manually, but just a quick run through a small section isn't good enough. The other "games" they used didn't even list how they tested them:

Bioshock: "Our Bioshock test involves a quick run through a section of the Medical Pavillion. Since the game has no built-in benchmark, our runthrough is manual and we rely on FRAPS to measure the average frame rate. All of the enemies are already dead in the area so we don't run the risk of running into one of them during our test."

CoD4: "lacks any sort of in-game benchmark so we benchmark the cut scene at the beginning of the first mission"

Crysis: "Crysis ships with a built-in GPU benchmark"

Enemy Territory: "We used our island benchmark for ET: Quake Wars as always"

UT3: "We used the built-in vCTF flyby in Unreal Tournament 3"

Witcher: "ran FRAPS during the game's first major cutscene at the start of play"

Matthew just told me what that cutscene in COD 4 actually involves: the character staring at a guy smoking a cigar for 30 seconds. That isn't exactly going to show you how a video card performs in COD 4.

So, you guys will have to decide for yourselves which kind of data you really want to know about when looking at video card performance.

Our COD 4 test involves the entire level of The Bog, from start to finish, in the full version of the game, with the latest patches.

/shrug, which one is more important to you?
 
WTF, may as well load up an empty map or just stare at a wall.

Exactly... reminds me of all the sites that used to use Far Cry, but the demo included no bad guys, firefights, or anything... it is simply the easy way out, and it has been the accepted way of doing things, so they get away with it.
 
I'm with [H] on this one! Surprising, though; it's not a clear-cut winner in real-world tests but pwns in synthetics and demos. My guess is when the 9800 GX2 hits, NV will be in the clear again. I'm still holding onto my 8800 GTS 512.
 
Yeah, they did BioShock with dead enemies, so basically an empty map with nothing going on. That's some outstanding gameplay.

But it still begs the question: in the absence of being GPU limited, why on earth would AMD do markedly better in such a scene relative to NVIDIA?
 
Our COD 4 test involves the entire level of The Bog, from start to finish, in the full version of the game, with the latest patches.

I think a lot of the concern with CoD4 is due to the smoke effects. Some posts on other forums are asking things like: how can they ensure the run-through is exactly the same? Doesn't looking at different areas during the benchmark skew the results? What if that smoke bomb went off in the X2 run-through but not in the GTX run-through? Etc., etc.

Not my words, but after poking around, that's some of the things people are swinging from the rafters about.
 
So did HardOCP use the very newest AMD drivers sent to reviewers for Crysis in the published benchmarks, or not? So far I'm getting that the answer is no?

Should you take the time to read the article, or even this thread, you would know. But here, I will quote it for you especially. :)

From the conclusion page:

During our evaluation processes AMD made some breakthroughs in Crysis with the 3870 X2 driver. We are using one of those driver builds they provided for us in our testing, which smoothed out performance compared to the driver build we had started with. A couple of days ago AMD came to us with yet another new driver build that further improves 3870 X2 performance, however that driver did not make it into our evaluation. AMD had noted Crysis improvements up to ~60% on the built-in GPU test, but going back and spot checking this in real world gameplay, we saw at best a 2 fps improvement at times. No other games we were using to evaluate were noted as having improvements, so we moved on with the previous driver as noted.
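To put those two numbers side by side, here's back-of-the-envelope arithmetic in Python. The ~25 fps gameplay baseline is purely an assumed figure for illustration; the article only gives the ~60% canned-test claim and the ~2 fps spot check:

```python
# Hypothetical arithmetic: why "~60% on the canned test" and "~2 fps in
# real gameplay" are very different claims. The 25 fps baseline is an
# assumption for illustration only, not a figure from the article.
baseline_fps = 25.0        # assumed real-gameplay baseline
gameplay_gain_fps = 2.0    # the spot-checked improvement the article cites
gameplay_gain_pct = 100 * gameplay_gain_fps / baseline_fps

print("canned GPU test gain: ~60%")
print(f"gameplay gain: {gameplay_gain_fps:.0f} fps "
      f"(~{gameplay_gain_pct:.0f}% on an assumed {baseline_fps:.0f} fps baseline)")
```

On that assumed baseline, the real-world gain works out to roughly 8%, an order of magnitude short of the canned-test figure, which is the whole point of the quoted passage.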
 
We launch the game and we play the game, just as you do when you play the game. We record framerates each second, find the highest settings that are playable, and report all of this information. It is as real as it gets; we use these video cards for what they are meant for: playing games.

How do you guarantee that the same things happen in the game from one benchmark to the next? Do you have self-made checkpoints in the game? A specific path? While the results are true to live gameplay, are the results true to each card? How much do they differ from one benchmark to the next?
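One way to answer that for yourself is a repeatability check: run the same manual path several times per card and compare the run-to-run spread against the gap between cards. A sketch in Python with made-up per-run averages (hypothetical data, not [H]'s numbers):

```python
# Repeatability sketch with invented per-run averages. If the run-to-run
# spread on one card is small next to the gap between cards, the manual
# method is measuring the cards, not the runs.
from statistics import mean, stdev

runs = {
    "3870 X2":  [47.1, 46.4, 47.8, 46.9],  # avg fps per manual run (invented)
    "8800 GTX": [52.3, 53.0, 51.8, 52.6],
}

for card, samples in runs.items():
    print(f"{card}: mean {mean(samples):.1f} fps, run-to-run stdev {stdev(samples):.2f}")

gap = mean(runs["8800 GTX"]) - mean(runs["3870 X2"])
print(f"gap between cards: {gap:.1f} fps")
```

With a spread well under 1 fps and a gap of several fps, as in these invented numbers, the manual runs would be distinguishing the cards rather than the noise.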
 
While it seems like a lot of people do not want to believe Kyle's results, I think they make perfect sense. I have always looked at Anand's and other websites' reviews (including HardOCP's) and was always kinda surprised at the different results each posted. Every time I compared the FPS Anand would get with what I would get on the same card, I would never get the performance they showed at the same clocks and specs. I guess what I am trying to say is: thank you, Kyle and the HardOCP team. I know exactly what results and performance I would get if I were to buy this card. I wish other sites would do their testing this way.
 
But it still begs the question: in the absence of being GPU limited, why on earth would AMD do markedly better in such a scene relative to NVIDIA?

That is what I wondered. As inaccurate as those benchmarks may be, the cards are still judged by the same measure.
Now, whether they are accurate TO REAL-LIFE GAMING, that is another story.
 
One good thing about canned benches is that they are reproducible across different rigs.

If I have RIG A and you have RIG B, we can both run the same canned bench at the same settings and get a general idea which machine is faster.

I highly doubt a Radeon X1800 is going to beat an 8800 GT in a CANNED BENCH no matter what...

I don't think you can go by JUST canned tests... but I also don't think you can say they are of no value.
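That "relative measure" idea can be made concrete by normalizing each card's canned-bench score to a common baseline card, so two rigs can compare ratios instead of raw numbers. A sketch in Python with invented scores:

```python
# Sketch of cross-rig comparison via a baseline card. All scores below
# are invented for illustration; the idea is that the ratios travel
# between rigs better than the raw fps numbers do.
scores = {              # canned-bench average fps on one rig (made up)
    "8800 GT":      61.0,
    "Radeon X1800": 24.0,
    "3870 X2":      70.0,
}
baseline = "8800 GT"

for card, fps in scores.items():
    print(f"{card}: {fps / scores[baseline]:.2f}x the {baseline}")
```

The catch, as the rest of the thread argues, is that the ratios only hold for the canned workload; they may not transfer to real gameplay.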
 