3870X2 (R680) Video @CES doing COD4 2560x1600 AA on

The 3870 X2 looks like the shape of a penis. WTF is that about? An extra-long cooler with a fan at the very tip, FTL.

Well, build a vagina-shaped case, insert penis GPU, enjoy. :D
 
Did anyone else notice how badly the game was playing, though? It was totally choppy. They should have used a lower resolution or turned off the AA and had it run smoothly, rather than running it so high and having it look crappy. I bet it was at like 20 fps.
 
Did anyone else notice how badly the game was playing, though? It was totally choppy. They should have used a lower resolution or turned off the AA and had it run smoothly, rather than running it so high and having it look crappy. I bet it was at like 20 fps.

Well, the problem with that is that ATI were showing it running at that res with FSAA, and people were also playing it on that machine; the crappy framerate must have been an artifact of the video stream. It would make no sense for ATI to show the game playing poorly, let alone let people play the game running poorly on their new flagship card. Also, I'm sure one of the sites that pictured it talked about playing the game on that machine, and I don't recall them saying it was sluggish or whatever.

Actually, in this article a guy mentions that he was playing it on that card.

http://www.mercextra.com/blogs/taka...sday-experiencemeeting-with-amds-phil-hester/
 
Did anyone else notice how badly the game was playing, though? It was totally choppy. They should have used a lower resolution or turned off the AA and had it run smoothly, rather than running it so high and having it look crappy. I bet it was at like 20 fps.

Well, as a comparison, how does it run on mainstream cards (e.g. 8800 GTX or 8800 GTS) at that resolution?
 
To be honest, that video was so laggy, probably like 20-40 frames. Not sure if you guys can spot that or not.
 
Well, as a comparison, how does it run on mainstream cards (e.g. 8800 GTX or 8800 GTS) at that resolution?

I can run COD4 at 2560x1600 maxed out with 4xAA on my Ultra fine, though I knock it down to 2xAA for multiplayer, because the difference between 2x and 4x doesn't justify the framerate drop.
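To put rough numbers on why 2xAA vs. 4xAA matters at 2560x1600, here is a minimal back-of-the-envelope sketch in Python (illustrative arithmetic only, not how any real driver accounts for memory; the 4-byte color and depth/stencil formats are assumptions):

[CODE]
# Rough framebuffer-size estimate for a resolution + MSAA level.
# Illustrative only: real GPUs add compression, padding, and extra
# render targets on top of this.

def framebuffer_mb(width, height, samples, bytes_per_pixel=4):
    """Color plus depth/stencil storage for one MSAA render target."""
    color = width * height * bytes_per_pixel * samples
    depth = width * height * 4 * samples  # assumed 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 ** 2)

for (w, h), aa in [((1280, 1024), 1), ((2560, 1600), 2), ((2560, 1600), 4)]:
    print(f"{w}x{h} at {aa}x: ~{framebuffer_mb(w, h, aa):.0f} MB")
# 1280x1024 no AA: ~10 MB; 2560x1600 2xAA: ~62 MB; 2560x1600 4xAA: ~125 MB
[/CODE]

Doubling the AA samples doubles that storage, plus the bandwidth to fill it, before the card has done any actual shading, which is why dropping from 4x to 2x buys back a noticeable chunk of framerate.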
 
I disagree. The 3870 X2 looks like the same old shit, a PCB with a cooler. NVIDIA is trying to do something different... that in itself is cool.

Yeah, goddamn those formulaic microprocessor engineers. What the fuck are they thinking, using silicon semiconductors to make millions of transistors on a small chip and then sticking it in a circuit on a PCB?!? Where is the innovation? Where is the quantum-nano-biological chip that Popular Science promised me last month?

Are you seriously criticizing them for building a graphics card the same way all graphics cards have ever been built?
 
BitBoys really knew how to innovate. ATI/NVIDIA can't touch them.

EDIT: Except for buying them out I guess. :)


 
Well, as a comparison, how does it run on mainstream cards (e.g. 8800 GTX or 8800 GTS) at that resolution?

My system is in my signature. I have COD4 installed, playing at 2560x1600 with 2xAA/16xAF and all options maxed, and it plays smooth as butter. Actually, the game is not that nice looking; I think Half-Life 2 at the same resolution looks a little more detailed.

So my single 8800GTX + E6850 + 4GB DDR2 runs COD4 very nicely.
 
Actually, the game is not that nice looking; I think Half-Life 2 at the same resolution looks a little more detailed.

You've got problems if you think HL2 has better detail than CoD4. I've gone through HL2 twice prior to playing CoD4, and CoD4 looks far better in detail than HL2 ever did. A matter of opinion, obviously.
 
You've got problems if you think HL2 has better detail than CoD4. I've gone through HL2 twice prior to playing CoD4, and CoD4 looks far better in detail than HL2 ever did. A matter of opinion, obviously.


I have only played the beginning of COD4 so far. The boat scene was nothing special, but in Half-Life 2 the opening courtyard area at the beginning of the game is jaw-droppingly cool.
 
I have only played the beginning of COD4 so far. The boat scene was nothing special, but in Half-Life 2 the opening courtyard area at the beginning of the game is jaw-droppingly cool.

Well, the boat scene was disappointing. I played COD4 on my PS3 and said WTF at first, but later in the streets it looks much better.
 
I have only played the beginning of COD4 so far. The boat scene was nothing special, but in Half-Life 2 the opening courtyard area at the beginning of the game is jaw-droppingly cool.

Wait until you play the sniper level. Not just for the way the game looks, but for just how awesome it is to play that level. :D
 
Well, the problem with that is that ATI were showing it running at that res with FSAA, and people were also playing it on that machine; the crappy framerate must have been an artifact of the video stream. It would make no sense for ATI to show the game playing poorly, let alone let people play the game running poorly on their new flagship card. Also, I'm sure one of the sites that pictured it talked about playing the game on that machine, and I don't recall them saying it was sluggish or whatever.

Actually, in this article a guy mentions that he was playing it on that card.

http://www.mercextra.com/blogs/taka...sday-experiencemeeting-with-amds-phil-hester/

Umm, how does only the game run crappy and not the rest of the video stream? They should have just turned down the settings a bit or turned off AA if they wanted to show off the card.

It's like playing Crysis on a GTS 512 at 1920x1200, getting 5 fps, and then using that to show off how powerful the card is... :eek:
 
Umm, how does only the game run crappy and not the rest of the video stream? They should have just turned down the settings a bit or turned off AA if they wanted to show off the card.

It's like playing Crysis on a GTS 512 at 1920x1200, getting 5 fps, and then using that to show off how powerful the card is... :eek:

I agree, the video looks fine to me, except for that crappy computer running COD4 like piss. My 8800GTX runs COD4 at its native res of 2560x1600 and is much smoother than that looked.
 
I have only played the beginning of COD4 so far, the boat scene was nothing special, but in HalfLife2 the opening court yard area in beginning of game is jaw dropping cool

?? The rain effects and the "wet" feel to it were awesome... HL2 is still a damn good-looking game, but COD4 takes the cake.
 
Umm, how does only the game run crappy and not the rest of the video stream? They should have just turned down the settings a bit or turned off AA if they wanted to show off the card.

No idea, but don't you think that if the game was running choppy, it would have been mentioned in the article or by the loads of online sites that have seen the CES ATI booth?
 
No idea, but don't you think that if the game was running choppy, it would have been mentioned in the article or by the loads of online sites that have seen the CES ATI booth?

Maybe because they don't want to talk down the card and then not get invited back?
 
You've got problems if you think HL2 has better detail than CoD4. I've gone through HL2 twice prior to playing CoD4, and CoD4 looks far better in detail than HL2 ever did. A matter of opinion, obviously.

Some scenes of HL2: Episode Two do look better than COD4. The most annoying thing about the COD4 engine is that the characters have this weird shield-like glow around them.
 
Trying to do something different, aka bolting two cards together, chopping off one of the PCIe connectors, and laughably marketing it as a "single card"... oh wait, they did that with the 7950 GX2 AND the 7900 GX2 as well. Yup, really very different...


And when's the last time ATI or NVIDIA released a card with two GPUs on one PCB? It's been tried by Sapphire and the like, but this is the first time in years that one of the vendors has actually done it.

WOW EPIC FAIL. LET'S HIT THE RECAP BUTTON BUDDY!

LOLOLOLOL LERN2REED.

I said I was talking about the AESTHETICS, not the LAYOUT OF THE PCBs, HOW MANY TIMES DO I HAVE TO SAY THIS? Someone said it looked good, and I simply said I disagreed; the PCB and cooler look like every other board ever made. NOTICE THE ABSENCE OF ANY REFERENCE TO THE DESIGN/LAYOUT/TECHNICAL FEAT of having 2 GPUs on the same PCB.

What does having the GPUs on one PCB get you? Does that really matter?

The only way I could see it making a difference is for aftermarket cooling solutions. Other than that it's like spinners on a car.

"OHH LOOK THEY SPIN"

"..So? What do they do?"

"NOTHING BUT THEY SPENT A BUNCH OF MONEY ON THEM SO THEY'RE OBVIOUSLY COOL. LOLOL LOOK AT THEM GO!"

Ya.. ok.


Admin Edit: (9) No SHOUTING (Posting in all capital letters) It’s hard to read and just plain rude and annoying.
 
Ok, Captain Capslock.

What does having a "cool"-looking HS matter, when it works and works well? Besides, when it is installed and the side is on the case, does it really matter what it looks like?
 
Ok, Captain Capslock.

What does having a "cool"-looking HS matter, when it works and works well? Besides, when it is installed and the side is on the case, does it really matter what it looks like?

Of course, Captain Obvious, nothing gets by you.

Obviously, we're "discussing things" such as "new technology" which have come up as "topics" and we are free to "express opinions."

Now it's obvious no one will see any card very much with the side on, as you have so deftly pointed out. The point I made was that the aesthetics are part of the purchasing experience, whether you like it or not, and we're discussing the upcoming technology. So I said I think NVIDIA is aesthetically doing something different, and I get flamed with this "LOLOL W/E 7950... DUAL PCB LOLOL" bull.

So I clarified that I'm not talking about the PCB layout (which you can't see in NVIDIA's case anyway); I'm talking about the wrap-around shroud on the card. That is something that hasn't really been done before, as far as the appearance goes. I needed to clarify my point for the 13-year-olds on the forums who can't read.

I never put emphasis on the looks as important; I simply followed up on someone saying the (sch)long AMD card looked awesome... I "disagreed," as I am entitled to do.

And you're right, if it performs well, then great. But chances are it will be limited by the same BS that Crossfire normally is (i.e. poor support).

Get it yet?
 
Well, this thread sure became flamebait for no reason at all.

As I mentioned at the beginning of the thread, CoD4 is a very poor choice to show off this card. If it's supposed to be the most powerful card out there (their words), it should be running the most demanding game out there, i.e. Crysis, not CoD4. Even F.E.A.R. is more demanding than CoD4, so my point remains, especially when a single card that's over a year old, like the 8800 GTX, can already run this game well enough at the same resolution and with AA.
 
Well, this thread sure became flamebait for no reason at all.

As I mentioned at the beginning of the thread, CoD4 is a very poor choice to show off this card. If it's supposed to be the most powerful card out there (their words), it should be running the most demanding game out there, i.e. Crysis, not CoD4. Even F.E.A.R. is more demanding than CoD4, so my point remains, especially when a single card that's over a year old, like the 8800 GTX, can already run this game well enough at the same resolution and with AA.

I don't think it was a bad choice, since CoD4 was considered one of the better games of the year and they were playing to that crowd.
 
I don't think it was a bad choice, since CoD4 was considered one of the better games of the year and they were playing to that crowd.

You are still missing my point. They should be proving that their card runs the most demanding game out there better than the competition does. It's not about running the most popular game, because if it runs the most demanding one (Crysis) acceptably, it will surely run the most popular and less demanding one (CoD4 or any other game) much better. The popularity of a given game is not important for testing a new graphics card, unless it's coupled with how demanding the game is on the hardware. That's not the case with CoD4. It is popular, but nowhere near as demanding as Crysis.
 
Of course, Captain Obvious, nothing gets by you.

Obviously, we're "discussing things" such as "new technology" which have come up as "topics" and we are free to "express opinions."

Now it's obvious no one will see any card very much with the side on, as you have so deftly pointed out. The point I made was that the aesthetics are part of the purchasing experience, whether you like it or not, and we're discussing the upcoming technology. So I said I think NVIDIA is aesthetically doing something different, and I get flamed with this "LOLOL W/E 7950... DUAL PCB LOLOL" bull.

So I clarified that I'm not talking about the PCB layout (which you can't see in NVIDIA's case anyway); I'm talking about the wrap-around shroud on the card. That is something that hasn't really been done before, as far as the appearance goes. I needed to clarify my point for the 13-year-olds on the forums who can't read.

I never put emphasis on the looks as important; I simply followed up on someone saying the (sch)long AMD card looked awesome... I "disagreed," as I am entitled to do.

And you're right, if it performs well, then great. But chances are it will be limited by the same BS that Crossfire normally is (i.e. poor support).

Get it yet?

How about you shut up and stop making childish posts that ruin a perfectly fine thread?
 
Meh, you know how it is at these shows. They'll push the settings as hard as they can, even having the game literally run at 20-30 FPS, just to say "LOOK, WE HAVE THIS GAME RUNNING AT (insert insane resolution) (#AA/#AF)" and get ooohs and aahs from people. I'll wait for some real benchmarks from real testers, like the guys we have here at [H]. I definitely don't judge a card off of its 3DMark score. That's the easiest way for them to tattoo "Sucker" onto your head.
 
Meh, you know how it is at these shows. They'll push the settings as hard as they can, even having the game literally run at 20-30 FPS, just to say "LOOK, WE HAVE THIS GAME RUNNING AT (insert insane resolution) (#AA/#AF)" and get ooohs and aahs from people. I'll wait for some real benchmarks from real testers, like the guys we have here at [H]. I definitely don't judge a card off of its 3DMark score. That's the easiest way for them to tattoo "Sucker" onto your head.

LOL
 
Do you guys not realize that this is GREAT news!!!

I mean, think about the fierce competition that's going to follow the release of the 3870 X2. Not only that, the possibility of future price wars will be upon us again.

Yay for AMD for giving their best!
 
Do you guys not realize that this is GREAT news!!!

I mean, think about the fierce competition that's going to follow the release of the 3870 X2. Not only that, the possibility of future price wars will be upon us again.

Yay for AMD for giving their best!

I have not seen any shred of evidence to suggest AMD has something up its sleeve with this card. Until they can show us some real benchmarks, I won't believe a thing. If this card can play Crysis at Very High with an average of 40-50 fps, then maybe we'll have something to drool about, but until then my money is going to NVIDIA.
 
I don't know if it's great news or not. I'm waiting for it to land in the hands of the guys I trust: AnandTech, [H], etc.

While the first benchmark is at 2560 x 1600, the Q6600 is running at stock. That got an "eh" out of me.

Then, with the second benchmark at 1280 x 1024 and the Q6600 all of a sudden running at 3.33 GHz, it became obvious to me that whoever ran the test was trying to eke out every point possible. They didn't use the best hardware out there, but that was their goal. Note that no AA was used.

As for the COD4 video, we all know why they chose that game: it's the hottest FPS out for the PC atm. It's not the best choice to show off a "high end" card, especially the way that marketing dude was trying to hype it up. Choosing COD4 will leave people asking a lot of questions.

" Isn't this a DX 10 card as well? Why show off a DX 9 title? "
" What OS was the game running on? "
" Why is the game running so choppy? "
" What AA/AF was used during the running of that game? "\
" What was the systems specs?"

They might not have gone with Crysis since that game is under NVIDIA's iron grip; that's the only excuse I would accept. The card is also most likely running on premature drivers. I'm not really impressed, to be honest. I am fair, though, and this is the first showing of the card. Over time, as more information is released, I'll have a better idea of how it really performs. Most of us will.
 
You are still missing my point. They should be proving that their card runs the most demanding game out there better than the competition does. It's not about running the most popular game, because if it runs the most demanding one (Crysis) acceptably, it will surely run the most popular and less demanding one (CoD4 or any other game) much better. The popularity of a given game is not important for testing a new graphics card, unless it's coupled with how demanding the game is on the hardware. That's not the case with CoD4. It is popular, but nowhere near as demanding as Crysis.

Don't forget that up until very recently, Crysis did not lend itself to multi-GPU environments very well. It seems things are a bit better in that regard now. But if you want to show how good your card is, and it's using a multi-GPU configuration at heart, then showing it off in a game that does not seem to have the most optimal multi-GPU performance may not be the best way to go...
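To make that concrete, here is a toy model in Python of alternate-frame rendering (AFR) scaling, the scheme CrossFire and SLI typically use for dual-GPU setups (made-up numbers; real-world scaling depends on the game, drivers, and CPU limits):

[CODE]
# Toy AFR scaling model: how much a second GPU helps depends entirely
# on how well the game/driver combo scales. Numbers are hypothetical.

def afr_fps(single_gpu_fps, num_gpus, efficiency):
    """Effective fps when each extra GPU contributes only a fraction
    (efficiency, 0.0-1.0) of a full GPU's worth of frames."""
    return single_gpu_fps * (1 + (num_gpus - 1) * efficiency)

base = 25.0  # hypothetical single-GPU fps in a demanding title
for eff in (0.0, 0.5, 0.9):
    print(f"2 GPUs at {eff:.0%} scaling: {afr_fps(base, 2, eff):.1f} fps")
# 0% scaling: 25.0 fps; 50% scaling: 37.5 fps; 90% scaling: 47.5 fps
[/CODE]

With 0% scaling, the X2 would demo no better than a single 3870, which is exactly the point: you don't showcase a dual-GPU card in a title with weak multi-GPU support.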
 
Don't forget that up until very recently, Crysis did not lend itself to multi-GPU environments very well. It seems things are a bit better in that regard now. But if you want to show how good your card is, and it's using a multi-GPU configuration at heart, then showing it off in a game that does not seem to have the most optimal multi-GPU performance may not be the best way to go...

Recently for whom? End users?
Coordination between software houses and hardware manufacturers (at least architecturally) is common practice, in terms of game optimizations and driver support. Just like Crytek had an 8800 to test Crysis on before it was even released to the public, the same goes for game optimizations to support a certain card setup. They are usually not ready for public release, but they're good enough to show off the new hardware and/or game.
Also remember that CoD4 didn't have a very good start in terms of Crossfire support, so it's the exact same situation and is no excuse not to use the most demanding game out there: Crysis.

So I ask two questions:

1) Why would I buy a new graphics card to play CoD4, if the one I already own runs it at max settings at my monitor's native resolution?

2) Can I do the same with Crysis?

Some people may not be interested in Crysis, but no one can argue that Crysis is the most beautiful and demanding game out there. It's the game that will put new hardware to the test, since every other game is just breakfast for cards like the 8800 GT/GTX.
 
WOW EPIC FAIL. LET'S HIT THE RECAP BUTTON BUDDY!

LOLOLOLOL LERN2REED.

I said I was talking about the AESTHETICS, not the LAYOUT OF THE PCBs, HOW MANY TIMES DO I HAVE TO SAY THIS? Someone said it looked good, and I simply said I disagreed; the PCB and cooler look like every other board ever made. NOTICE THE ABSENCE OF ANY REFERENCE TO THE DESIGN/LAYOUT/TECHNICAL FEAT of having 2 GPUs on the same PCB.

What does having the GPUs on one PCB get you? Does that really matter?

The only way I could see it making a difference is for aftermarket cooling solutions. Other than that it's like spinners on a car.

"OHH LOOK THEY SPIN"

"..So? What do they do?"

"NOTHING BUT THEY SPENT A BUNCH OF MONEY ON THEM SO THEY'RE OBVIOUSLY COOL. LOLOL LOOK AT THEM GO!"

Ya.. ok.



"epic fail" ..welcome to the internet circa 2000. Rest of your post is like an angsty child posting in caps to get attention. Quite sad, not even worth reading.
 
Maybe because they don't want to talk down the card and then not get invited back?

That's like saying that if a site gives it a bad review, they won't get any more cards from ATI. The 2900 series got plenty of negative reviews, with sites still receiving the 38XX versions of the card.
 