Nvidia Hints at Coming GTX 280 GX2

If they release such a card, it would obviously be a low-volume product. Having the world's fastest videocard is good for PR and helps them sell more low-end and mid-range cards. That would be the only reason to release such a card.
 
Thousand dollar video cards are fairly common.

Will the average gamer get one? Probably not.

Will I get one? Nope, I can't afford something like that, let alone the power bill.
 
I think the profit margins on the 8800 Ultra were much, much higher than they would be on the alleged GTX 280 GX2.

Nvidia priced the GTX 280 at $649.99. That's what it wanted to sell it at... but ATI has forced them to drop it down.

That would mean a GTX 280 GX2 would cost $1200 a pop. $800 may be too low an estimate.
 
I think back when the 8800 GTX/Ultra released, games were ahead of GPUs, so people wanted every fps they could get and were willing to pay more. These days, most GPUs have the power to run most games at a decent frame rate, so paying more doesn't make much sense. If a great game comes along that pushes GPUs to the max, people will certainly pay more to have the best fps.
 
If people clamor for higher clocks, it will exceed the 300W limit of the current PCI-E 2.0 spec, hence the talk of three power connectors.


Since the die size will be smaller, it will consume less power, so there is some room to increase clocks. Possibly not as high as some are speculating, but there is room, and they can keep it within the GTX 260 SLI power envelope, which is very similar to the X2's power envelope.
 
Since the die size will be smaller, it will consume less power, so there is some room to increase clocks. Possibly not as high as some are speculating, but there is room, and they can keep it within the GTX 260 SLI power envelope, which is very similar to the X2's power envelope.


The problem is that people are overstating the die-size decrease. 55nm is a half-node, hence an optical shrink; its primary purpose is to increase the number of dies per wafer. Yes, power consumption will go down a bit, and clocks can likely increase, but not by 30%.

This isn't 65nm to 45nm.
 
True, but nV could be doing other things as well to keep the power level lower; we just don't know (well, we might in a couple of weeks). They might not need a separate NVIO either. Actually, comparing previous-gen SLI to dual-card power usage, the dual card tends to draw less power. So it's possible for them to get a decent boost in clocks, and they really only need to increase shader clocks.
 
I think during the time when the 8800GTX/Ultra released, games were ahead of GPU that guys/gals wants every fps they can get. So they were willing to pay more. These days, most GPU have the power to run most games at a decent frame rate, so paying more doesn't make much sense. If there is a great game that pushes the GPU to the max today, people will certain pay more to have the best fps.

You make valid points. Few games even push the 4850 right now.
 
I see no reason Nvidia couldn't release a GX2 290 (a 55nm SLI sandwich). They could release one around $800, watercooled. Assuming a 750MHz clock (EVGA already has a watercooled 65nm card on the market at about 700MHz, so assuming a mere 50MHz from the die shrink is reasonable), it would easily be 75% faster than a 4870X2 (92% faster assuming perfect scaling). From a sheer performance-crown standpoint, there really isn't anything to stop Nvidia from doing it.

In what games do you see this easy 75% gain over the 4870X2? Crysis? IMO, from what I've seen, in most other games the 4870X2 is usually on par with GTX 280 SLI in terms of performance.
 
I don't think I would buy an X2 card from Nvidia; all I've ever heard about the 9800 GX2 is that it was terrible.
 
Checking the rig sigs here and other places, I would say $1000 video cards account for 0.01% of cards sold.

I didn't say people buy a lot of them. I am saying they exist, and they aren't surprising anymore, especially considering the cost of workstation-grade cards.

I'd also like to know how you think checking the signatures of users on one forum lets you ballpark a sales figure like that.
 
You should go take some math classes if you really think a GTX 280 GX2 would be 75% faster than a 4870X2. :rolleyes:

Starting point: the 4870X2 is 30% faster than the GTX 280.

Assuming you're shader limited:

The GTX 280 is a 600MHz card stock. There exists an EVGA watercooled card at ~700MHz (692? 697? Something like that; I don't care to look it up.)

Assume a mere 50MHz from the die shrink. The 9800GTX to 9800GTX+ jump was something more like 60-75MHz, IIRC.

Let's start doing our math.

Our basic unit will be the GTX 280 = 1. As previously stated, the 4870X2 = 1.3. Accounting for the clock-speed increase of a watercooled die shrink, 1 × 750/600, a single watercooled GTX 290 = 1.25. Putting two of them in SLI yields 1.25 × 2 = 2.5; I will add in an SLI scaling factor later. Taking the ratio of the new watercooled GX2 290 to a 4870X2, 2.5/1.3 = 1.92, and applying a 9% loss from SLI scaling (in newer games we're seeing numbers near this), you've got 1.92 × 0.91 = 1.75.

So please tell me, where did my math fail? I tried to stay conservative on most of my numbers.
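The arithmetic above can be sketched as a quick script. Every input is an assumption from the thread (the 30% 4870X2 lead, the 750MHz watercooled clock, the ~9% SLI scaling loss), not a benchmark result:

```python
# Back-of-the-envelope estimate from the post above. All inputs are the
# poster's assumptions, not measured data.
def gx2_vs_4870x2(base_clock=600.0, oc_clock=750.0,
                  x2_advantage=1.30, sli_efficiency=0.91):
    """Speedup of a hypothetical watercooled GX2 over a 4870X2,
    with a stock GTX 280 normalized to 1.0."""
    single = oc_clock / base_clock        # one shrunk, watercooled GPU: 1.25
    sli = 2 * single * sli_efficiency     # two GPUs minus the SLI scaling loss
    return sli / x2_advantage             # relative to the 4870X2's 1.3

print(round(gx2_vs_4870x2(), 2))  # 1.75, i.e. the claimed ~75% lead
```

Changing any one assumption (say, perfect SLI scaling) reproduces the other figures quoted in the thread, e.g. the 92% number.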
 
If they release such a card, it would obviously be a low-volume product. Having the world's fastest videocard is good for PR and helps them sell more low-end and mid-range cards. That would be the only reason to release such a card.

QFT. The only reason to release such a card would be to shut ATI up. It would simply be a move to say: hey, look at us, we pwn you.
 
Starting point: the 4870X2 is 30% faster than the GTX 280.

Assuming you're shader limited:

The GTX 280 is a 600MHz card stock. There exists an EVGA watercooled card at ~700MHz (692? 697? Something like that; I don't care to look it up.)

Assume a mere 50MHz from the die shrink. The 9800GTX to 9800GTX+ jump was something more like 60-75MHz, IIRC.

Let's start doing our math.

Our basic unit will be the GTX 280 = 1. As previously stated, the 4870X2 = 1.3. Accounting for the clock-speed increase of a watercooled die shrink, 1 × 750/600, a single watercooled GTX 290 = 1.25. Putting two of them in SLI yields 1.25 × 2 = 2.5; I will add in an SLI scaling factor later. Taking the ratio of the new watercooled GX2 290 to a 4870X2, 2.5/1.3 = 1.92, and applying a 9% loss from SLI scaling (in newer games we're seeing numbers near this), you've got 1.92 × 0.91 = 1.75.

So please tell me, where did my math fail? I tried to stay conservative on most of my numbers.

Edit, nm. My math was wrong too. :D
 
In what games do you see this easy 75% gain over 4870x2, Crysis? imo from what I've seen in most other games 4870x2 is usually on par with the 280 sli in terms of performance.

IIRC, from the [H]ard review, in Crysis a single 280 was near a 4870X2, while in AoC the 4870X2 was 30% faster than a SINGLE 280. Those are really the only games that challenge these cards; looking at numbers over 100 fps doesn't mean much.
 
I for one will never buy another dual-PCB Nvidia card again. I had a 7950GX2, then bought a 9800GX2 thinking Nvidia wouldn't pull the same thing again; surely this time it would be better.

Nope!
 
IIRC, from the [H]ard review, in Crysis a single 280 was near a 4870X2, while in AoC the 4870X2 was 30% faster than a SINGLE 280. Those are really the only games that challenge these cards; looking at numbers over 100 fps doesn't mean much.

Crysis is really the only game where the Geforce GTX 280 really shines compared to the 4870 X2. In Age of Conan the 4870 X2 manhandles the Geforce GTX 280. I actually play Age of Conan so the 4870 X2 is more compelling for me at present.
 
It may be the fastest card on earth, the most expensive card ever, the hottest card ever, and the most power-devouring card ever. And the only person who will own one is Heatlesssun.

If its what I want and what I can afford at the time maybe.:)

A lot of Heatlesssun bashing going on here! I happen to like him... why the hate?!

Thanks, there seems to be a lot of hate going on towards people who spend money on their rigs.

I don't think it's hate... more like jealousy. I know I am jealous and freely admit it.

I appreciated the honesty!;)

I've got to give it up to AMD; they really executed well this cycle. At the end of the day, however, when talking about 4870X2 CF performance vs. 3x SLI GTX 280, the only noticeable difference most of the time is price. I do wish I had not bought this 790 motherboard, and I would have dumped it, except that it's working so well for me that I'll worry about it when Nehalem has had a little field time. I'm probably not going to jump on Nehalem until spring next year unless it's just all that and a can of Schlitz.
 
No, reread my post. :rolleyes: It's based on the idea Vengeance threw out of a 750MHz core-clocked 55nm GT200 that somehow won't exceed actual physical and industry limits.

But hey, keep up the speculation without thinking about how they actually design cards, and why standards like TDP exist in the industry.

Power is a big issue, on both ends. Dissipation would probably have to be handled with watercooling and a large radiator.

Input power could be solved a variety of ways. One would be an external power connector / auxiliary power supply. I don't like that solution, but it would be a solution.

I don't think you're going to be able to do it inside the 300W envelope; you're probably going to be around 550W.
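A rough sanity check on those power numbers: the GTX 280's 236W TDP is the published 65nm figure, while the 15% shrink saving is purely an assumption for illustration. Even with that saving, a dual-GPU card blows past the PCI-E 2.0 ceiling:

```python
# Rough power-envelope check for a hypothetical dual-GT200 card.
GTX280_TDP = 236        # watts, published TDP of the 65nm GTX 280
SHRINK_SAVING = 0.15    # assumed fractional saving from the 55nm shrink
PCIE2_LIMIT = 300       # watts, PCI-E 2.0 per-card ceiling

gx2_estimate = 2 * GTX280_TDP * (1 - SHRINK_SAVING)
print(round(gx2_estimate))          # 401, i.e. ~401 W before board overhead
print(gx2_estimate > PCIE2_LIMIT)   # True: exceeds the spec limit
```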
 
Crysis is really the only game where the Geforce GTX 280 really shines compared to the 4870 X2. In Age of Conan the 4870 X2 manhandles the Geforce GTX 280. I actually play Age of Conan so the 4870 X2 is more compelling for me at present.

I agree, Crysis isn't typical of the speeds, which is why I used the performance difference from AoC to base my numbers on, not Crysis.

But this is all wild speculation anyway. "Let's throw some numbers together and see what falls out!" And what falls out is a card that would require watercooling, violate the 300W PCI-E envelope, and be faster than hell. :cool:
 
Even if this card does exist, I can imagine the cost, the power it's going to consume, and the size it's probably going to be. You thought a GTX 280 and a 4870X2 consumed enough power already.

So each GPU has a 512-bit memory bus, I assume, which makes a combined 1024-bit memory bus? I'd like to see that.
 
I think the profit margins on the 8800 Ultra were much, much higher than they would be on the alleged GTX 280 GX2.

Nvidia priced the GTX 280 at $649.99. That's what it wanted to sell it at... but ATI has forced them to drop it down.

That would mean a GTX 280 GX2 would cost $1200 a pop. $800 may be too low an estimate.


That's a ridiculous claim. Saying it's going to be $1200... the GTX 280 was $649.99 AT LAUNCH. By that same logic the 4870X2 should be $620, since the 4870 was around $310 at launch. Doesn't make much sense if you ask me...
 
I'll bet the 4870X2 isn't stressed even by Crysis until AA is cranked. Even assuming ultra-shitty CrossFireX scaling that nets you barely any gain whatsoever, I'll bet once you crank the AA up to 8x, one 4870X2 will be on par with GTX 280 SLI.

Until I can see 60 fps out of Crysis on the 4870 X2 at 2560x1600 with 4xAA and 16xAF at a minimum, I'm going to say that Crysis stresses the 4870 X2 plenty. I've got dual Geforce GTX 280s in SLI and I can't do that. The Geforce GTX 280 greatly outclasses the 4870 X2 in that one game, so if a Geforce GTX 280 SLI setup can't do it, there is no way a single 4870 X2 can.
 
Until I can see 60 fps out of Crysis on the 4870 X2 at 2560x1600 with 4xAA and 16xAF at a minimum, I'm going to say that Crysis stresses the 4870 X2 plenty. I've got dual Geforce GTX 280s in SLI and I can't do that. The Geforce GTX 280 greatly outclasses the 4870 X2 in that one game, so if a Geforce GTX 280 SLI setup can't do it, there is no way a single 4870 X2 can.

My fault, I went back and reread the [H] review once again and noticed the part where AA crushed both cards at 1920x1200. Sorry, sleep deprivation here :(

What really confuses me is that at 1920 with AA set to 0, the CrossFireX setup averages 19.2 fps, but once AA is cranked all the way up to 8x, the average goes up to 25.3?!?!
 
The 8800 Ultra cost that much, and people still accepted it.

There was also very little on offer from ATI at that point. I expect the 4870X2 to drop below $475 before this GX2 is released, if it's released at all, and if the speculative price of $800 is right, it would seem smarter to go 4870X2 CrossFire over the GX2, as a lot of people already have an Intel chipset that supports CrossFire, whereas very few use SLI chipsets.
 
If nVidia makes this card with a watercooler and an insane price to get the performance crown, AMD's answer would be simple: a watercooled HD 4890 X2. Not that they would do it, but they could if they wanted to. So saying that nVidia could create a watercooled card to beat the HD 4870 X2 is a moot point.
 
If its what I want and what I can afford at the time maybe.:)



Thanks, there seems to be a lot of hate going on towards people who spend money on their rigs.



I appreciated the honesty!;)

I've got to give it up to AMD; they really executed well this cycle. At the end of the day, however, when talking about 4870X2 CF performance vs. 3x SLI GTX 280, the only noticeable difference most of the time is price. I do wish I had not bought this 790 motherboard, and I would have dumped it, except that it's working so well for me that I'll worry about it when Nehalem has had a little field time. I'm probably not going to jump on Nehalem until spring next year unless it's just all that and a can of Schlitz.


No, I'm pretty sure people aren't jealous of you, and they don't hate you because you spend money on a computer; it's because you're annoying and not as smart as you think you are. I think the best word to describe you would be "charlatan".
 
If nVidia makes this card with a watercooler and an insane price to get the performance crown, AMD's answer would be simple: a watercooled HD 4890 X2. Not that they would do it, but they could if they wanted to. So saying that nVidia could create a watercooled card to beat the HD 4870 X2 is a moot point.

Uh, I'm pretty sure an overclocked 4870/4870X2 isn't going to be a permanent solution to "hold the performance crown".
 
Until I can see 60 fps out of Crysis on the 4870 X2 at 2560x1600 with 4xAA and 16xAF at a minimum, I'm going to say that Crysis stresses the 4870 X2 plenty. I've got dual Geforce GTX 280s in SLI and I can't do that. The Geforce GTX 280 greatly outclasses the 4870 X2 in that one game, so if a Geforce GTX 280 SLI setup can't do it, there is no way a single 4870 X2 can.

If Crysis wasn't so biased towards nVidia cards, I wonder how the X2 would fare against the nVidia offerings in this instance.
 
If Crysis wasn't so biased towards nVidia cards, I wonder how the X2 would fare against the nVidia offerings in this instance.

I'm guessing by that you mean: if Crysis weren't optimized for NVIDIA or ATI cards, you'd like to see how they'd compare? Even if that were the case, I think that the game would still run better on one architecture vs. the other. That's the way things were back before NVIDIA's TWIMTBP program and whatever collaboration ATI has offered for game developers.
 
Yeah, which is why I said: if it wasn't so biased towards card A, I'd like to see how card B would compete.
 
Yeah, which is why I said: if it wasn't so biased towards card A, I'd like to see how card B would compete.

I meant to beat your response, but I added a thought to my statement after you already posted clarification.

In any case, I still think Crysis would run better on one card vs. the other. That means that just because Crysis ran better on the NVIDIA or the ATI offering doesn't mean those results would translate to other games. Crysis is very different from, say, Call of Duty 4 or Unreal Tournament III.
 
No, I'm pretty sure people aren't jealous of you, and they don't hate you because you spend money on a computer; it's because you're annoying and not as smart as you think you are. I think the best word to describe you would be "charlatan".

Name calling, are we? Tsk, tsk, tsk.;) A charlatan? Wow, where did you come up with that one?:confused:

I don't claim, and never have claimed, to be some genius. I know something about computers; I've been a business software developer for 15 years, but I know a lot of people here know a lot more than I do. Hell, if I were all that I wouldn't be here!:p

If you're going down the road that others here recently have (BigCactus, POPEGOLD) and start calling people names and smearing them personally, you probably won't be around here much longer.

I think personally attacking people you don't know and have never met reeks of high school and doesn't belong here. And that's not just my opinion; it's the site owners' as well. http://www.hardforum.com/showthread.php?p=1026133236#post1026133236 - Rule #1. Please read it.;)
 
After owning a 9800GX2, never again will I go with a dual-PCB card.

Also, I really hope they fix all the heat/RMA issues that have plagued the GTX 280 series. I foresee lots of problems if there really is a GTX 280 GX2 in the pipeline.
 
Sounds like they're saving money by doubling up old cards to cut research time.

Or they could do what ATI did: lie low, try to stay alive, and come back with a redesigned card. Nvidia has enough resources and a big enough fan base to stay afloat while it comes up with a new design. Price is the killer part, though, lol.
 
My fault, I went back and reread the [H] review once again and noticed the part where AA crushed both cards at 1920x1200. Sorry, sleep deprivation here :(

What really confuses me is that at 1920 with AA set to 0, the CrossFireX setup averages 19.2 fps, but once AA is cranked all the way up to 8x, the average goes up to 25.3?!?!

It's probably more CPU limited at 1920x1200, and when AA and AF are cranked the GPU does more of the work.
 
If nVidia makes this card with a watercooler and an insane price to get the performance crown, AMD's answer would be simple: a watercooled HD 4890 X2. Not that they would do it, but they could if they wanted to. So saying that nVidia could create a watercooled card to beat the HD 4870 X2 is a moot point.

Neg, ATI will answer with a dual-slot 4870X4 Colossus!!!!! ASUS Trinity, anyone?
 