9800 GX2 specs and photo

He asked in which game(s) this happens.
Sorry, English is my fourth language :p.

These percentages were actually the average difference across multiple games. In those FiringSquad tests, the 8800 GT SLI did have a small winning margin in Half-Life 2 (lost by 0.5% at 1600x1200, won by 1% at 1920x1200), won Crysis by a 25% margin at both resolutions, and won BioShock by a 5% margin at both resolutions. In all other cases there was over a 30% difference.

Bigger wins (+50%):
Lost Planet
Company of Heroes
Oblivion
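For what it's worth, the averaging described above is just the mean of the per-game, per-resolution margins. A minimal sketch using the figures quoted in the post (treating "lost by 0.5%" as negative; the game/resolution labels mirror the text, not the raw FiringSquad tables):

```python
# 8800 GT SLI advantage (%) per game/resolution, taken from the post above.
# Positive means the SLI setup won, negative means it lost.
margins = {
    "Half-Life 2 @ 1600x1200": -0.5,
    "Half-Life 2 @ 1920x1200": 1.0,
    "Crysis @ 1600x1200": 25.0,
    "Crysis @ 1920x1200": 25.0,
    "BioShock @ 1600x1200": 5.0,
    "BioShock @ 1920x1200": 5.0,
}

# Simple arithmetic mean of the margins.
average = sum(margins.values()) / len(margins)
print(f"average margin: {average:.1f}%")  # -> average margin: 10.1%
```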
-----
 
I love it; today we also got the Inq coming out and saying the stock clocks are 600/1500/2000. Then we got these people saying the clocks were leaked from MSI and that they are 660/1650/2400.

I understand it could be an OC card's numbers, but even then, I think someone is jerking our chain on this.
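One quick way to judge whether 660/1650/2400 could plausibly be a factory-OC variant of 600/1500/2000 is to look at the percentage deltas. A sketch, assuming the triples are the rumored core/shader/memory clocks in MHz from the posts above:

```python
# Rumored clock sets (core/shader/memory, MHz) from the thread.
stock = {"core": 600, "shader": 1500, "memory": 2000}  # Inq figures
msi   = {"core": 660, "shader": 1650, "memory": 2400}  # alleged MSI leak

# Percentage increase of the alleged MSI clocks over the rumored stock clocks.
for part in stock:
    delta = (msi[part] / stock[part] - 1) * 100
    print(f"{part}: +{delta:.0f}%")
# -> core: +10%, shader: +10%, memory: +20%
```

A 10% core/shader bump would be a fairly typical factory OC; the 20% memory jump is the part that makes the MSI numbers smell off.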
 
Some sites say the 18th, and some are saying the end of the month for release too...

Yeah, nVidia needs a PR department...

We need to know the scoop!! :)
 
Come Tuesday we will know if it's worth the wait... that is, if they decide to show it to us.
 
I'm still cornfuzzled as to why the factory coolers on almost all cards still don't exhaust heat out the back of the case instead of dumping it wherever :confused: Other than that, I await CeBIT as well for specs, pricing, etc.
 
Yeah, so the 30% means little considering it's an SLI card, and the performance increase will vary with the game and drivers. Maybe there will be an exciting card 4-6 months before the Crysis 2 launch.
 
So we have a card that's almost identically spec'd to an 8800 GTS (x2), and whose performance depends largely on drivers? Those had better be some damn good drivers =o. Anyway, I'm not doing anything until I see some benchmarks; somehow I don't feel that bad that my step-up is going to run out :p.
 
I still have some hope for an aftermarket watercooling solution for the GX2...

The odds aren't good, but you never know?

I too wonder why nVidia doesn't go out of their way to make sure they don't dump the hot air inside your case...
 
And that little review sure pisses me off. What a waste; if that's really going to be the product, I think I'll just pick up a 3rd 8800 GTX for 299, go with Triple-SLI, and wait for the next generation.
 
Yeah, the GX2 really seems like it's two GTSs slapped together... and the clock speeds are nothing amazing. I'll just wait until they run Crysis on this card, and then I'll decide whether or not it's worth it.
 
I just find it amazing that Nvidia makes this huge technological breakthrough with the 8 series, and it seems like their next step in technology with the 9 series is less than a baby step by comparison... The jump from the 7900 GTX to the 8800 GTX was astronomical. Now we're getting a 9 series card that's only 30% more powerful than the previous generation? Something about that doesn't sit right with me. I feel like this is what the 8850 GX2 should be, and maybe there is some new technology around the corner that Nvidia has in its pocket. I hope that's true...
 
Thanks for the link. ;)

If heat is an issue... At least it's G92 heat. It could be worse...

That last Dude in the thread reported:
"it is ment to perform 1.3x better then a single 8800GTX, i believe the report hasnt changed".

That would be incorrect. It is reported to be at least 30% faster than an 8800 Ultra, just for the record! :)
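Just to spell out why the baseline matters: "1.3x" and "30% faster" are the same multiplier, so the only real question is what it's measured against. A rough sketch, assuming for illustration that an 8800 Ultra is about 10% faster than an 8800 GTX (that ratio is my assumption, not a benchmarked figure):

```python
# Same 1.30 multiplier, different baselines.
ultra_over_gtx = 1.10  # assumed Ultra-vs-GTX ratio (illustrative only)
gx2_over_ultra = 1.30  # the "at least 30% faster than an Ultra" claim

# If the 30% is over the Ultra, the gap to a plain GTX is larger.
gx2_over_gtx = gx2_over_ultra * ultra_over_gtx
print(f"implied speedup over a plain 8800 GTX: {gx2_over_gtx:.2f}x")
# -> implied speedup over a plain 8800 GTX: 1.43x
```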
 
Hehe yeah, I think he meant Ultra. Maybe? :)
 
For those of you questioning the final clocks, this guy I know sent this along today, though I have no clue how he got it:

http://i27.tinypic.com/y0aaf.jpg

As you can see, 600/1500/2000 are the actual speeds, although only the core and memory clocks are shown in the picture.
 
I would love to see Nvidia hit some new level of performance with their SLI drivers...

As far as SLI goes, I wonder if the way the two cards are connected on the GX2 would be a micro-jiffy faster than SLI'ing through a mobo like most do?

I bet nVidia is trying to walk the line between how much faster they would like the stock clocks to be and making sure not to put too much heat on the chips.

I wish the companies would go high end on the stock cooling. Then they could clock their cards even faster at initial release, and we would be sports about it and throw a couple of extra bucks in on the sale price. As long as we didn't have to take all the stock stuff off like 90% of the time...

We wouldn't want a lesser card because somebody went cheap on the cooling...
 
That screenshot above came from the site I write for. I know for a fact that nobody authorized disclosure beyond our private files.... I am currently trying to figure out how to get it removed from its hosted location.
 
In case the pictures aren't doing it for you, Hexus has a video.
That video is like sweet seductive manure.

"It's fast" doesn't do it for me. Seeing real world benchmarks will. Someone needs to develop some sort of machine that can travel through time.
 