Nvidia Hints at coming GTX280 GX2

GunzofNavarone

www.guru3d.com is reporting that a GTX 280 GX2 is in the pipe.

At the current die size, what will this card cost? How can it possibly be less than 800 bucks?

Does the rumor of this card signal that this is the only answer NV has to the 4870X2 for the foreseeable future?

Single or dual PCB?

Heat and power draw?

Performance?
 
All I read was the CEO struggling to remember the first Dual GPU card by his company, and then going on to essentially say the card that just took the performance crown isn't all that (without specifically referring to it). Even with a die shrink, they are going from a platform that overheats with an already massive heatsink... I see some rushed failures on the horizon.:confused:
 
Expect this to be the fastest thing out there, if it comes, but to cost an arm and a leg.
 
I say 4 power connectors and it being like 16" long :D

I reckon that if the 65nm GTX 280 has 2 power connectors, even the 55nm GTX 280 will have 2, so this GX2 will have 4 lolz
 
If this ever comes to fruition there is no way in hell it won't be EPIC FAIL
 
Why would it be epic fail? Sure, it'd be crazy hot and ridiculously expensive, but surely someone will pay for it :)
 
Well, someone has to get an i7 octocore and pair it with tri-SLI GTX 280 GX2s so he can log on here to HardForum and boast about it in every post.
 
Well, I hope they are not going to make an X2 version of the 200 series, unless they made some groundbreaking changes in such a short time, which I doubt they did. But let's see...
 
More than likely this will come out when the 55nm die shrink is in place.
 
Well, someone has to get an i7 octocore and pair it with tri-SLI GTX 280 GX2s so he can log on here to HardForum and boast about it in every post.

If you could afford such a system, you would probably do the same thing
 
How many power supplies have four 8-pin PCI-E power connectors? That's some serious current draw.
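Just to put rough numbers on that (going by the PCI-E spec limits of 150W per 8-pin connector plus 75W from the slot; the four-connector board is obviously pure speculation):

Code:
# Back-of-the-envelope for a hypothetical 4x 8-pin card.
# 150 W per 8-pin and 75 W from the slot are the spec limits;
# nothing here is a real spec for this rumored card.
SLOT_W = 75
EIGHT_PIN_W = 150

connectors = 4
board_limit_w = SLOT_W + connectors * EIGHT_PIN_W
amps_at_12v = board_limit_w / 12.0  # nearly all of it comes off the 12 V rail

print(f"Max board power: {board_limit_w} W (~{amps_at_12v:.0f} A on the 12 V rail)")
# -> Max board power: 675 W (~56 A on the 12 V rail)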

It may be the fastest card on earth, the most expensive card ever, the hottest card ever, the most power-devouring card ever. And the only person who will own one is Heartlesssun.
 
They could do what they did with the 9800GX2: cut back the memory bandwidth, let the die shrink help with thermals, keep the ridiculous heatsink on it, and possibly introduce the first third PCIe power connector on a consumer-level card.

It's doable but not worth Nvidia's time. It will still be hot, it will still be expensive, and it will consume A LOT of power. I just don't see NV doing this to compete that high up; it seems ATI pulled an Nvidia this gen and pushed Nvidia into a corner, much like what Nvidia did to ATI last gen.
 
So you guys don't care to see a product or performance numbers? You're just going to bash it?

I don't think they will put two 236W TDP 65nm (or 55nm) chips onto a sandwich design and sell it to you guys. I think it's going to feature downclocked processors (relative to the single-GPU board) which are probably going to have to be binned. You're probably looking at a max TDP of around 150W per chip; that's the most I think their sandwich design can handle without some kind of redesign.
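Rough sanity check on that 150W-per-chip guess (the 236W figure is the published GTX 280 TDP; the board cap and overhead numbers below are just my assumptions):

Code:
# Sketch of the power budget for a hypothetical sandwich-style GX2.
GTX280_TDP_W = 236      # published single-GPU TDP
BOARD_CAP_W = 300       # practical ceiling with one 6-pin + one 8-pin connector
OTHER_W = 40            # rough guess for memory, VRM losses, fan, NVIO

per_chip_budget = (BOARD_CAP_W - OTHER_W) / 2
print(f"Two full-fat GTX 280 chips: ~{2 * GTX280_TDP_W} W")
print(f"Per-chip budget under a {BOARD_CAP_W} W board: ~{per_chip_budget:.0f} W")
# -> 472 W for two full chips vs ~130 W per chip of headroom,
#    which is why heavy downclocking/binning looks unavoidable.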

Let's try not to bash a product we haven't seen and can't in any way prove is even real yet. Let's hold off until we actually see the card and it actually falls below market expectations.

Edit: we are on page 2, the OP's link is just a link to the front page, and there are no details in the post of any possible specs for this card, yet we still bash it like it's going to be slower than an 8500GT.
 
A lot of Heatlesssun bashing going on here! I happen to like him... why the hate?!
 
So you guys don't care to see a product or performance numbers? You're just going to bash it?

I don't think they will put two 236W TDP 65nm (or 55nm) chips onto a sandwich design and sell it to you guys. I think it's going to feature downclocked processors (relative to the single-GPU board) which are probably going to have to be binned. You're probably looking at a max TDP of around 150W per chip; that's the most I think their sandwich design can handle without some kind of redesign.

Let's try not to bash a product we haven't seen and can't in any way prove is even real yet. Let's hold off until we actually see the card and it actually falls below market expectations.

Edit: we are on page 2, the OP's link is just a link to the front page, and there are no details in the post of any possible specs for this card, yet we still bash it like it's going to be slower than an 8500GT.

I don't think anyone is bashing it. It's more like seeing if the logistics of it would work.
 
So you guys don't care to see a product or performance numbers? You're just going to bash it?

I don't think they will put two 236W TDP 65nm (or 55nm) chips onto a sandwich design and sell it to you guys. I think it's going to feature downclocked processors (relative to the single-GPU board) which are probably going to have to be binned. You're probably looking at a max TDP of around 150W per chip; that's the most I think their sandwich design can handle without some kind of redesign.

Let's try not to bash a product we haven't seen and can't in any way prove is even real yet. Let's hold off until we actually see the card and it actually falls below market expectations.

Edit: we are on page 2, the OP's link is just a link to the front page, and there are no details in the post of any possible specs for this card, yet we still bash it like it's going to be slower than an 8500GT.


QFT. I trust nVidia will release a good card if they end up releasing this. If the thing isn't appreciably better than a single GTX 280 then they will not even release it. They know better... It's not like it's going to fall back to the performance of a GeForce 2 series...
 
I had a 9800GX2. The Sssssstttuuuutttttttteeeeeerrrrrrrrrrrr killed it. Dual card stutter sux.
 
I see no reason that Nvidia couldn't release a GX2 290 (55nm SLI sandwich). You could release one around $800 watercooled. Assuming a 750MHz clock (EVGA already has a water-cooled 65nm card on the market at about 700MHz, so a mere 50MHz more from the die shrink seems reasonable), it would easily be 75% faster than a 4870X2 (92% faster assuming perfect scaling). Purely from a performance-crown standpoint, there really isn't anything to stop Nvidia from doing it.
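For what it's worth, here's roughly how that kind of estimate gets put together; every input below is a guess (single-GPU gap, clock gain, multi-GPU scaling), so the output swings wildly depending on what you assume:

Code:
# Sketch of the back-of-the-envelope behind a "GX2 vs 4870X2" claim.
# All inputs are placeholder assumptions, not benchmark results.
single_gpu_ratio = 1.15       # assumed GTX 280 vs single 4870 at stock clocks
clock_gain = 750 / 602        # assumed 750 MHz core vs the 602 MHz stock clock
sli_scaling = 1.8             # assumed SLI scaling (2.0 would be perfect)
cf_scaling = 1.8              # assumed CrossFire scaling on the 4870X2

gx2_vs_x2 = single_gpu_ratio * clock_gain * (sli_scaling / cf_scaling)
print(f"Estimated GX2 advantage: {100 * (gx2_vs_x2 - 1):.0f}%")
# With these particular guesses it comes out around 43%; tweak any input
# and the number moves a lot, which is why 75%/92% is easy to dispute.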

Yes, it would be expensive. It would also absolutely destroy a 4870X2.

Three of them in SLI would easily pwn Crysis at max settings at 25x16 too! Although, it might only take one to make it playable at max settings at 25x16.

Could it be air-cooled? I don't know. Probably not as a 2-slot card; maybe if they went 55nm at 600MHz. It should be fine as a 3-4 slot card.

Epic Fail? No. The 9800GX2 is a very good card. It does what it was designed to do very well, and that's bring SLI to a non-Nvidia chipset. It didn't have the bus to SLI them well at ultra settings, but a 290GX2 would.
 
I know price/performance is thrown out the window on UBER cards, but even if the GTX 280 GX2 is only 10-15% faster, there is no way that card could retail for under $799.

And why would NV board makers even sell a card that won't move many units and is very, very expensive?
 
I see no reason that Nvidia couldn't release a GX2 290 (55nm SLI sandwich). You could release one around $800 watercooled. Assuming a 750MHz clock (EVGA already has a water-cooled 65nm card on the market at about 700MHz, so a mere 50MHz more from the die shrink seems reasonable), it would easily be 75% faster than a 4870X2 (92% faster assuming perfect scaling). Purely from a performance-crown standpoint, there really isn't anything to stop Nvidia from doing it.

Yes, it would be expensive. It would also absolutely destroy a 4870X2.

Three of them in SLI would easily pwn Crysis at max settings at 25x16 too! Although, it might only take one to make it playable at max settings at 25x16.

Could it be air-cooled? I don't know. Probably not as a 2-slot card; maybe if they went 55nm at 600MHz. It should be fine as a 3-4 slot card.

Epic Fail? No. The 9800GX2 is a very good card. It does what it was designed to do very well, and that's bring SLI to a non-Nvidia chipset. It didn't have the bus to SLI them well at ultra settings, but a 290GX2 would.

That's just wishful dreaming.

It's physically impossible because of the PCI-E spec limits on power consumption, and a 750MHz core only makes that worse.

And going from 65nm to 55nm (which is an optical shrink, not a true full shrink like 65nm to 45nm) will not cut the TDP of two of those cores as much as you think. That combined TDP would overwhelm all but top-of-the-line triple rads if watercooled.
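Quick sketch of why the optical shrink doesn't buy much (the scaling here is a textbook approximation, not anything Nvidia has published):

Code:
# Dynamic power ~ C * V^2 * f. An optical shrink mostly reduces die area
# (capacitance); voltage and leakage barely move, so the TDP drop is modest.
# All figures below except the 236 W GTX 280 TDP are assumptions.
GTX280_TDP_W = 236
area_scale = (55 / 65) ** 2     # ~0.72x die area from the shrink
assumed_power_cut = 0.15        # optimistic ~15% real-world power reduction

shrunk_tdp = GTX280_TDP_W * (1 - assumed_power_cut)
print(f"Die area: ~{area_scale:.2f}x, TDP maybe ~{shrunk_tdp:.0f} W per chip")
print(f"Two chips: ~{2 * shrunk_tdp:.0f} W, still far past a 300 W board")
# -> ~201 W per chip, ~401 W for two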

If released, this would be a gimmick card that would cost a lot more than it's worth, and it would only lose them money.
 
From the CC (conference call)....

"The best approach is to do both. If we could offer a single chip solution at 399, it certainly doesn’t preclude us from building a two-chip solution at something higher. So I think that having the right price, right product at each price point and the best-performing product at each price point is the most important thing."
 
They could do what they did with the 9800GX2: cut back the memory bandwidth, let the die shrink help with thermals, keep the ridiculous heatsink on it, and possibly introduce the first third PCIe power connector on a consumer-level card.

It's doable but not worth Nvidia's time. It will still be hot, it will still be expensive, and it will consume A LOT of power. I just don't see NV doing this to compete that high up; it seems ATI pulled an Nvidia this gen and pushed Nvidia into a corner, much like what Nvidia did to ATI last gen.


Why would they need 3 power connectors? I converted a 6-pin to an 8-pin and it works fine on my GTX; I didn't need a new power supply. GTX 260 SLI hangs with the X2 very well in most cases, so why not expect a 55nm version of that, clocked higher? Its power consumption would be right around the X2's as well.
 
All I read was the CEO struggling to remember the first Dual GPU card by his company, and then going on to essentially say the card that just took the performance crown isn't all that (without specifically referring to it). Even with a die shrink, they are going from a platform that overheats with an already massive heatsink... I see some rushed failures on the horizon.:confused:

Why do people keep saying the GTX 280 overheats?! Do you actually have one to know first hand or are you just going by bullshit internet rumors and a small percentage of people complaining about defective cards?

My GTX 280, which I have had since release, idles at 40C, which is less than the 6800GT it replaced (that idled at 60C). As a matter of fact, the GTX 280 at idle consumes less power than the 4870, and only 30 more watts at load.
 
If released, this would be a gimmick card that would cost a lot more than it's worth, and it would only lose them money.

Again, you're basing that on a card you have never seen, read the specs of, or seen performance numbers for. Just wild assumptions.
 
Why would they need 3 power connectors? I converted a 6-pin to an 8-pin and it works fine on my GTX; I didn't need a new power supply. GTX 260 SLI hangs with the X2 very well in most cases, so why not expect a 55nm version of that, clocked higher? Its power consumption would be right around the X2's as well.

If people clamor for higher clocks, it will exceed the 300W limit on current PCI-E 2.0 cards, hence my saying 3 power connectors.
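That 300W number is just the spec limits added up; the connector combos below are scenarios, not a leaked board layout:

Code:
# PCI-E 2.0 power ceilings by connector configuration (spec limits).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

configs = {
    "6-pin + 8-pin (GTX 280 today)":    SLOT_W + SIX_PIN_W + EIGHT_PIN_W,
    "2x 8-pin":                         SLOT_W + 2 * EIGHT_PIN_W,
    "2x 8-pin + 6-pin (hypothetical)":  SLOT_W + 2 * EIGHT_PIN_W + SIX_PIN_W,
}
for name, watts in configs.items():
    print(f"{name}: {watts} W ceiling")
# -> 300 W, 375 W, 450 W; anything over 300 W needs the extra connector.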
 
Again, you're basing that on a card you have never seen, read the specs of, or seen performance numbers for. Just wild assumptions.

No, reread my post. :rolleyes: I'm basing it on the idea vengence threw out of a 750MHz-clocked 55nm GT200 that somehow won't exceed actual physical and industry limits.

But hey, keep up the speculation without thinking about how they actually design cards, and why standards like TDP exist in the industry.
 
Why do people keep saying the GTX 280 overheats?! Do you actually have one to know first hand or are you just going by bullshit internet rumors and a small percentage of people complaining about defective cards?

Because some of them do. The first one I had got up to 106C when running FurMark, and 90C when playing HL2 Episode 2. I RMAed it and got a new one that maxes out at 80C in FurMark... big improvement. The funny thing is you would think that because my first card got up to 106C the idle temp was high too, but it wasn't.
 
Just because a card runs cool doesn't mean it isn't putting out a ton of heat. It just means the heat dissipation of the cooler is good. But the transistors still feel the heat/volts. For example, an overclocked quad-core Extreme can put out a TON of heat, but with watercooling the cores might hit 50C at most. That doesn't mean it isn't putting out 150+ watts of heat and that the volts aren't high; it just means the water-cooling loop is keeping the temperature down.
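Put another way, with a dead-simple thermal model (the resistance numbers are made up purely for illustration):

Code:
# A better cooler doesn't reduce the heat a chip produces, it just moves
# it away faster: temp ~ ambient + power * thermal resistance.
# Illustrative guesses only, not measured specs for any card or cooler.
power_w = 150      # heat dumped by the chip regardless of cooling
ambient_c = 30     # case air temperature

for cooler, r_theta in [("stock air", 0.45), ("big air", 0.30), ("water loop", 0.15)]:
    temp_c = ambient_c + power_w * r_theta
    print(f"{cooler}: ~{temp_c:.0f} C at {power_w} W")
# Same 150 W of heat goes into the case/loop every time; only the die temp changes.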
 
I see no reason that Nvidia couldn't release a GX2 290 (55nm SLI sandwich). You could release one around $800 watercooled. Assuming a 750MHz clock (EVGA already has a water-cooled 65nm card on the market at about 700MHz, so a mere 50MHz more from the die shrink seems reasonable), it would easily be 75% faster than a 4870X2 (92% faster assuming perfect scaling). Purely from a performance-crown standpoint, there really isn't anything to stop Nvidia from doing it.

Yes, it would be expensive. It would also absolutely destroy a 4870X2.

Three of them in SLI would easily pwn Crysis at max settings at 25x16 too! Although, it might only take one to make it playable at max settings at 25x16.

Could it be air-cooled? I don't know. Probably not as a 2-slot card; maybe if they went 55nm at 600MHz. It should be fine as a 3-4 slot card.

Epic Fail? No. The 9800GX2 is a very good card. It does what it was designed to do very well, and that's bring SLI to a non-Nvidia chipset. It didn't have the bus to SLI them well at ultra settings, but a 290GX2 would.

You should go take some math classes if you really think that a GX2 GTX 280 would be 75% faster than a 4870X2. :rolleyes:
 