Nvidia Hints at coming GTX280 GX2

If nVidia makes this card with a water cooler and an insane price to get the performance crown, AMD's answer would be simple: a watercooled HD4890 X2. Not that they would do it, but they could if they wanted to. So saying that nVidia could create a watercooled card to win over the HD4870 X2 would be a moot point.

Umm... no. A watercooled 280 trashes a watercooled 4870. There is no reason to think a watercooled 280 GX2 would not be faster than a watercooled 4870x2. In most cases SLI'd 280s beat a single 4870x2, at least from the benchies I've seen. And I say most cases because we all know the SLI scaling in AoC isn't great.
 
I am willing to bet there is never a GTX 280 GX2.... it's a profit killer.

It's been plastered all over this thread: the only reason for Nvidia to make such a card would be as a performance crown winner, not as a profit maker. It's just like Honda and the S2000. They have the S2000 so people think Hondas are sporty; they don't sell enough of them to really make a difference on their bottom line. According to wiki, they sold 4,302 S2000s and 392,231 Accords in '07. Nvidia would make this card to help them sell their midrange cards.
 
Name calling, are we? Tsk, tsk, tsk.;) A charlatan? Wow, where did you come up with that one?:confused:

I don't claim and never have claimed to be some genius. I know something about computers, I've been a business software developer for 15 years, but I know a lot of people here know a lot more than I do. Hell, if I was all that I wouldn't be here!:p

If you are going down the road that others here have recently gone down (BigCactus, POPEGOLD) and start calling people names and smearing them personally, you probably won't be around here much longer.

I think that personally attacking people you don't know and have never met reeks of high school and doesn't belong here. And that's not just my opinion; it's that of the owners of this site as well. http://www.hardforum.com/showthread.php?p=1026133236#post1026133236 - Rule #1. Please read it.;)

Christ, cry more. I didn't call you a charlatan, I said it describes you. Go ahead and tell on me for it too, and cry till I get banned. It reeks of elementary school.
 
Umm... no. A watercooled 280 trashes a watercooled 4870. There is no reason to think a watercooled 280 GX2 would not be faster than a watercooled 4870x2. In most cases SLI'd 280s beat a single 4870x2, at least from the benchies I've seen. And I say most cases because we all know the SLI scaling in AoC isn't great.

I'd actually have to agree with you for the most part, except that if they were to make the GX2, they'd have to slow the clocks down to help manage temperatures, I'm sure. Even so, it would still be a considerable upgrade over a single GTX280.
 
I meant to beat your response, but I added a thought to my statement after you had already posted your clarification.

In any case, I still think Crysis would run better on one card vs. the other, but just because Crysis runs better on the NVIDIA or the ATI offering doesn't mean those results would translate to other games. Crysis is very different from, say, Call of Duty 4 or Unreal Tournament III.

Kinda agreeing there, aren't CoD4 and UTIII both TWIMTBP too?
 
Kinda agreeing there, aren't CoD4 and UTIII both TWIMTBP too?

They are, but those are just two titles I thought of randomly. I thought of HL2, which has always played better on ATI hardware, but that game runs fairly well on practically anything made in the last four years, so I didn't mention it.
 
I'd actually have to agree with you for the most part, except that if they were to make the GX2, they'd have to slow the clocks down to help manage temperatures, I'm sure. Even so, it would still be a considerable upgrade over a single GTX280.

Any GX2 290 would have to be watercooled. Seeing how you're talking a minimum of a 500 watt power envelope, I don't see it happening on air. And hell, as long as you're water cooling, might as well OC it, right? Maybe they should just put on the box: minimum 1m^2 radiator :D
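For what it's worth, here's a rough back-of-envelope on that 500 watt figure. This is a sketch only: the 236 W number is the published GTX 280 board power, while the watts-per-radiator-section figure is just a common rule of thumb I'm assuming, not a spec.

```python
# Rough sanity check on the "minimum 500 watt" claim for a dual-GT200 card.
# Assumptions: ~236 W board power per stock GTX 280, and a rule-of-thumb
# 120-150 W of heat handled per 120 mm radiator section at moderate fan speed.

GTX280_TDP_W = 236   # published board power of a single GTX 280
GPUS = 2

board_power = GTX280_TDP_W * GPUS
print(f"Two full-clock GT200s: ~{board_power} W")        # ~472 W, so 500 W is the right ballpark

for watts_per_section in (120, 150):
    sections = -(-board_power // watts_per_section)       # ceiling division
    print(f"At {watts_per_section} W per 120 mm section: {sections} radiator sections")
```

So even under friendly assumptions you're looking at a quad-120 radiator class loop, which is roughly where the 1m^2 joke comes from.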
 
Any GX2 290 would have to be watercooled. Seeing how you're talking a minimum of a 500 watt power envelope, I don't see it happening on air. And hell, as long as you're water cooling, might as well OC it, right? Maybe they should just put on the box: minimum 1m^2 radiator :D

lol. Well I just hope a die shrink brings us some cards faster than the HD4870 at the same price. Granted, a GTX 260 can be bought for a few extra bucks, but I want to see the GTX 280 $50-100 cheaper so that I can buy one.
 
No, Nvidia is just going to shrink the GTX 280 to 55nm and squeeze as much performance as they can out of those shaders. Even if they can get 20-25% more shader clock, that would result in a near-linear performance boost, and if the rumored specs are true, with the shader clock at 1648 for the 55nm version, then that is going to be a big performance boost. The best thing Nvidia can do is cut into the performance lead of the HD 4870 X2. Making a dual-chip card is probably nonsense, as it would not help them get where they want to be; it would cost them more to make, and I am sure very few people are going to buy a 700-800 dollar card, or maybe even more.
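To put a number on that, here's a quick sketch assuming the stock GTX 280 shader clock of 1296 MHz and taking the rumored 1648 MHz at face value; the linear-scaling part is an optimistic assumption, not a measurement.

```python
# Quick arithmetic on the rumored 55nm shader clock bump.
# Assumptions: stock GTX 280 shader clock of 1296 MHz, rumored 55nm part at
# 1648 MHz, and performance scaling roughly linearly with shader clock in
# shader-bound games (the optimistic case).

stock_shader_mhz = 1296
rumored_shader_mhz = 1648

gain = rumored_shader_mhz / stock_shader_mhz - 1
print(f"Shader clock increase: {gain:.0%}")   # ~27%

# A purely shader-limited game would see roughly that same ~27% boost; real
# games land lower because they are also ROP-, bandwidth-, or CPU-limited.
```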

If you guys have read the recent interview with Nvidia's CEO posted at techpowerup.com, Nvidia admitted that they failed to recognize AMD's price/performance ratio, so Nvidia is going to target a fast card at an affordable price. I really don't see Nvidia building a dual-chip card that would have to carry a higher price tag than the HD 4870 X2 to turn any profit.

Remember, AMD's single-PCB approach is far more profitable than a dual PCB; AMD could probably sell the HD 4870 X2 for 449 and still make a profit off it. I really don't see Nvidia making a dual-GPU card, and if they do, it will be really pointless if it costs more than 550. Sorry, but I would rather have a GTX+ with a higher shader clock than a dual-GPU card sandwiched together. And the interview clearly states that Nvidia recently ramped up their 55nm transition.
 
Why do people keep saying the GTX 280 overheats?! Do you actually have one to know first hand or are you just going by bullshit internet rumors and a small percentage of people complaining about defective cards?

My GTX 280, which I have had since release, idles at 40C, which is less than the 6800GT it replaced, which idled at 60C. In fact, the GTX 280 at idle consumes less power than the 4870, and only 30 more watts at load.

I have read at least a dozen posts from people getting cards that run hot and have to be RMA'ed, either after they died or shortly before. I guess you have to own the card to give a status report on your individual experience, so people like you can say "it must be user error AR AR ARRRR". I am getting sick of people asking "have you had personal experience?" It's just a worthless defense where no defense is necessary. This isn't an argument, as much as you wish it was. I was stating what many others have experienced. What the hell is wrong with Hardforum?
 
I think it's very sexy to think about, but very scary to pay for. Maybe if it fell off the back of a truck.....:)
 
So, ATI using 2 GPU's per card to beat Nvidia is OK, but Nvidia doing the same to beat ATI is not OK?

More like how are they gonna come up with an X2 version of the GTX280 without setting the test boxes aflame and requiring a 1k+ watt PSU?

What Nvidia NEEDS to do is slash prices again. Yeaaah, get that 280 down to around $350 so I can grab a 260 for around $220 or so. :D
 
If they really did cut prices with the release of a new GX2 I would totally consider a GTX260 or 280 depending.
 
No, Nvidia is just going to shrink the GTX 280 to 55nm and squeeze as much performance as they can out of those shaders. Even if they can get 20-25% more shader clock, that would result in a near-linear performance boost, and if the rumored specs are true, with the shader clock at 1648 for the 55nm version, then that is going to be a big performance boost. The best thing Nvidia can do is cut into the performance lead of the HD 4870 X2. Making a dual-chip card is probably nonsense, as it would not help them get where they want to be; it would cost them more to make, and I am sure very few people are going to buy a 700-800 dollar card, or maybe even more.

If you guys have read the recent interview with Nvidia's CEO posted at techpowerup.com, Nvidia admitted that they failed to recognize AMD's price/performance ratio, so Nvidia is going to target a fast card at an affordable price. I really don't see Nvidia building a dual-chip card that would have to carry a higher price tag than the HD 4870 X2 to turn any profit.

Remember, AMD's single-PCB approach is far more profitable than a dual PCB; AMD could probably sell the HD 4870 X2 for 449 and still make a profit off it. I really don't see Nvidia making a dual-GPU card, and if they do, it will be really pointless if it costs more than 550. Sorry, but I would rather have a GTX+ with a higher shader clock than a dual-GPU card sandwiched together. And the interview clearly states that Nvidia recently ramped up their 55nm transition.

I hope no one actually believes a slight optical shrink will give Nvidia the crown back....history shows it won't.
 
In order not to catch on fire, the 55nm 280 GX2 cores (air cooled) would have to run at 400MHz!
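For perspective, here's a hedged back-of-envelope using the usual dynamic-power approximation P ≈ C·V²·f; the 236 W figure is the published GTX 280 board power, and the 10% voltage drop at the lower clock is purely a hypothetical of mine.

```python
# Rough estimate of what downclocking two GT200s would do to power draw,
# using the standard dynamic-power approximation P ~ C * V^2 * f.
# Assumptions: ~236 W board power per GTX 280 at its 602 MHz core clock, and a
# hypothetical 10% voltage reduction at the lower clock. Static/memory power is
# ignored, so treat the numbers as ballpark only.

GTX280_TDP_W = 236
STOCK_CORE_MHZ = 602

def scaled_power(target_mhz, voltage_scale=1.0):
    """Scale power linearly with clock and quadratically with voltage."""
    return GTX280_TDP_W * (target_mhz / STOCK_CORE_MHZ) * voltage_scale ** 2

for v_scale in (1.0, 0.9):   # same voltage vs. a 10% undervolt
    per_gpu = scaled_power(400, v_scale)
    print(f"400 MHz, V scale {v_scale}: ~{per_gpu:.0f} W per GPU, ~{2 * per_gpu:.0f} W total")
```

Even at 400 MHz you'd still be looking at roughly 250-315 W for the pair under these assumptions, which is why an air-cooled GX2 sounds so dubious.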
 
So, ATI using 2 GPU's per card to beat Nvidia is OK, but Nvidia doing the same to beat ATI is not OK?

I would think it would be absolutely awesome if Nvidia came out with this product.

But a lot of people need to understand this will not be the same performance as GTX280s SLI'd; it will have to involve a core that's been butchered in some fashion to allow the thermals and actual power draw to be non-insane.

If you look back at the 9800GX2, they managed to combine two of what was essentially an 8800GTX, but it was only when they came out with the 9800GTX that it became possible to slap two of those on the same board.
 
Where on the site does it say that? I would think it would be impossible; the fan can't keep up with one 280, let alone 2 on the same HSF.
 
More like how are they gonna come up with an X2 version of the GTX280 without setting the test boxes aflame and requiring a 1k+ watt PSU?

What Nvidia NEEDS to do is slash prices again. Yeaaah, get that 280 down to around $350 so I can grab a 260 for around $220 or so. :D
There is a GTX260 on Newegg for like $240 AR; Palit brand.

As for the power requirement, they are probably going with a 55nm chip. They will probably also clock it down some to save power.

I would think it would be absolutely awesome if Nvidia came out with this product.

But a lot of people need to understand this will not be the same performance as GTX280s SLI'd; it will have to involve a core that's been butchered in some fashion to allow the thermals and actual power draw to be non-insane.

If you look back at the 9800GX2, they managed to combine two of what was essentially an 8800GTX, but it was only when they came out with the 9800GTX that it became possible to slap two of those on the same board.
The same goes for the 4870x2 though, and nobody cares. Yeah, both chips are on the same board, but you still suffer from the typical multi-GPU issues.

The bottom line is, the 4870x2 is not as good as everyone says it is. Certain games, corridor FPSes for example, are tuned for high frame rates. The 4870x2 has a ton of throughput, so it does excel in those games. But who cares when you're already getting 100fps?

I'm not going to bring up 'the game' as the game that Nvidia wins on, but the point is, when it comes to actual card horsepower, the GTX280 is better. I'm really curious as to how Far Cry 2 will perform on it; I bet Nvidia wins that battle. And even in the games that the 4870x2 is 'faster' in, i.e. 200fps instead of 150 :rolleyes:, it's not mind-blowingly faster than the 280. You pay more for the 4870x2; shouldn't it be faster? I mean, it is 2 GPUs vs. one, after all. The 4870x2 also has that new AA technology, but again, not everybody uses that. It's the same argument that ATI is better because of DX10.1.
 
The same goes for the 4870x2 though, and nobody cares. Yeah, both chips are on the same board, but you still suffer from the typical multi-GPU issues.

The bottom line is, the 4870x2 is not as good as everyone says it is. Certain games, corridor FPSes for example, are tuned for high frame rates. The 4870x2 has a ton of throughput, so it does excel in those games. But who cares when you're already getting 100fps?

I'm not going to bring up 'the game' as the game that Nvidia wins on, but the point is, when it comes to actual card horsepower, the GTX280 is better. I'm really curious as to how Far Cry 2 will perform on it; I bet Nvidia wins that battle. And even in the games that the 4870x2 is 'faster' in, i.e. 200fps instead of 150 :rolleyes:, it's not mind-blowingly faster than the 280. You pay more for the 4870x2; shouldn't it be faster? I mean, it is 2 GPUs vs. one, after all. The 4870x2 also has that new AA technology, but again, not everybody uses that. It's the same argument that ATI is better because of DX10.1.

Actually this is false; it beats it in everything other than "that game", which no one cares about anymore. It is proven (even by developers) that it is horribly optimized. It is still a great-looking game, though, and I give them props for giving us those visuals.

But they did not do the same for the 4870x2; in fact it's the same exact core (not butchered in any way), clocked the same, with MORE GDDR5. I would definitely call the X2 a fairly monumental feat in that they gave the consumer everything Crossfired 4870s have, with more memory, and not exactly breaking the bank (its MSRP and street price are well below the 280's debut price).

I'm not sure what review site you are reading, but this card does dominate in any game matched up against the 280. Look up [H]'s comparison in AoC with a beta driver from ATI; it demolishes the 280, and that is not a corridor FPS either.

Also, your comparison of the new AA methods to 10.1: yes, 10.1 doesn't mean jack shit right now, but it's something ATI banks on; whether it gets supported or not, who cares. But the AA? If you buy this card and don't use the new AA methods you are wasting hardware, since the AA is where the hardware truly excels.

2 GPUs or not, it's still a single card and happens to be the fastest. I didn't mean to de-rail this thread into defending cards, but that's just a lot of misinformation. I truly think that if Nvidia can slap 2 GTX260 cores onto one card they will be the new king, but not 280s.
 
To do that they need to clean up their faulty chip dilemma.
I hope my 8800 GT isn't affected ...
 
Actually this is false; it beats it in everything other than "that game", which no one cares about anymore.
Once you break 100fps it simply doesn't matter anymore. Since there are really only 2 games that are demanding on this hardware, we will have to wait. And Crysis isn't useless, as there are a number of games under development using the Crytek engine. The other is AoC; we probably aren't going to see that many games on the same engine as that.

It is proven (even by developers) that it is horribly optimized. It is still a great-looking game, though, and I give them props for giving us those visuals.
Link please, because I don't recall the devs ever saying it was poorly optimised

But they did not do the same for the 4870x2; in fact it's the same exact core (not butchered in any way), clocked the same, with MORE GDDR5. I would definitely call the X2 a fairly monumental feat in that they gave the consumer everything Crossfired 4870s have, with more memory, and not exactly breaking the bank (its MSRP and street price are well below the 280's debut price).


Debut price on the GTX 280 is a moot point. Street price vs. street price: last I checked, the street price on a 280 is near 400 and the street price on the X2 is 550. Last time I checked, $550 would break most people's banks.


I'm not sure what review site you are reading, but this card does dominate in any game matched up against the 280. Look up [H]'s comparison in AoC with a beta driver from ATI; it demolishes the 280, and that is not a corridor FPS either.
30% isn't domination. It's 30%.


Also, your comparison of the new AA methods to 10.1: yes, 10.1 doesn't mean jack shit right now, but it's something ATI banks on; whether it gets supported or not, who cares. But the AA? If you buy this card and don't use the new AA methods you are wasting hardware, since the AA is where the hardware truly excels.
The card excels at insane shader power, which isn't the same thing as AA. You only get "free AA" until your shader power gets tapped.


2 GPUs or not, it's still a single card and happens to be the fastest. I didn't mean to de-rail this thread into defending cards, but that's just a lot of misinformation.

The 4870x2 isn't just 2 GPUs slapped onto a card; there is an added bus called a "sideport" which allows the GPUs to talk directly to each other. This is the reason the 4870x2 is so good.


I truly think that if Nvidia can slap 2 GTX260 cores onto one card they will be the new king, but not 280s.

There are serious problems preventing them from doing such a thing. Mostly the power/thermal envelope. Read the above posts for more info on that.
 
Once you break 100fps it simply doesn't matter anymore. Since there are really only 2 games that are demanding on this hardware, we will have to wait. And Crysis isn't useless, as there are a number of games under development using the Crytek engine. The other is AoC; we probably aren't going to see that many games on the same engine as that.


Link please, because I don't recall the devs ever saying it was poorly optimised




Debut price on the GTX 280 is a moot point. Street price vs. street price: last I checked, the street price on a 280 is near 400 and the street price on the X2 is 550. Last time I checked, $550 would break most people's banks.



30% isn't domination. It's 30%.



The card excels at insane shader power, which isn't the same thing as AA. You only get "free AA" until your shader power gets tapped.




The 4870x2 isn't just 2 GPUs slapped onto a card; there is an added bus called a "sideport" which allows the GPUs to talk directly to each other. This is the reason the 4870x2 is so good.




There are serious problems preventing them from doing such a thing. Mostly the power/thermal envelope. Read the above posts for more info on that.

Add it all up, boil it all down: ATI is the new king. 'Nuff said.
 
Why would it be epic fail? Sure, it'd be crazy hot and ridiculously expensive, but surely someone will pay for it :)

People will pay, but at what price? If they charge $750 and up, I wonder how many people would ever buy it. Also, the heat and the power draw would probably be enormous. I can easily power a 4870x2 with a Corsair 620W PSU, but I am certain that would not be the case with such a beast.

If they release it and it's affordable, keeps heat down to acceptable levels, and lowers the power draw, then it's an epic win. If not, I agree, epic fail. Just because a card wins all the bench tests doesn't necessarily make it a winner in my book.
 
You can't dissipate 350W of heat with anything less than a 90°C delta in the space allocated to a dual-slot cooler. It's just not possible. Two GT200s isn't happening; when they get it to 55nm, maybe.
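To make that concrete, here's a small sketch of the ΔT = P × Rθ relationship; the 0.2-0.3 °C/W range for a good dual-slot air cooler is a rough assumption on my part, not a datasheet number.

```python
# Temperature rise above ambient needed to move a given heat load through a
# cooler: delta_T = power * thermal_resistance.
# Assumption: a good dual-slot GPU air cooler sits somewhere around
# 0.2-0.3 degC/W die-to-ambient (ballpark, not a measured spec).

def delta_t(power_w, r_theta_c_per_w):
    """Temperature rise above ambient for a given heat load and cooler."""
    return power_w * r_theta_c_per_w

POWER_W = 350  # the dual-GPU heat load discussed above
for r_theta in (0.20, 0.25, 0.30):
    print(f"Rtheta {r_theta:.2f} degC/W -> ~{delta_t(POWER_W, r_theta):.0f} degC above ambient")
```

Under those assumptions, 350 W lands you roughly 70-105°C above ambient, which is why the 90°C-delta point rings true for an air-cooled GX2.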
 
Christ, cry more. I didn't call you a charlatan, I said it describes you. Go ahead and tell on me for it too, and cry till I get banned. It reeks of elementary school.

So if "charlatan" describes me, then please at least provide some argument for that comment. This type of comment, without any proof or context, is clearly a personal attack and a violation of the rules of this forum. I don't see how following the rules reeks of elementary school. Please explain that.

Look at your user name. Your comment seems to simply be a flame against an nVidia user.

I just don't understand why you would make an unprovoked comment about me that you can't prove, unless you're just trying to flame me. If I am in error then I apologize. :)
 
I don't see a 9800GX2-style card being very feasible at present. The power requirements and thermal envelope, even at 55nm, might just be too much. Additionally, I hadn't really given this much thought, as there isn't much information available about 3-Way SLI and Geforce GTX 280s, but it would seem (at least according to Driver Heaven) that 3-Way SLI'ed Geforce GTX 280s beat out two 4870 X2s in CrossfireX. In that article the Geforce GTX 280 3-Way SLI configuration beats the 4870 X2 CrossfireX setup every time at 2560x1600. If that's true, then it would seem that NVIDIA is still really on top in terms of absolute performance. You can use 3 Geforce GTX 280s together, but you can only use 2 4870 X2s together in CrossfireX, since CrossfireX maxes out at 4 GPUs.

That paints a bit different picture of the top end, so NVIDIA may not feel the need to release a GX2-style card at present. I'm not sure how accurate these results are; all the other 3-Way Geforce GTX 280 results I've seen don't show SLI scaling in such a positive light, especially when more than 2 GPUs are present, though the driver used in this review is newer than what all the other sites have used. These results might stem from the tests being performed on the Skulltrail setup, which uses an Intel chipset (a server chipset at that), and they may also come down to new driver optimizations that improve 3-Way SLI scaling. If that's true, the 179.83 driver is a massive leap forward in terms of improving that scaling.

Also, you can argue that from a bang-for-your-buck perspective, Geforce GTX 280s in 3-Way SLI are insane, but these types of setups are a dog and pony show anyway. The average gamer won't have SLI'ed Geforce GTX 280s or 4870 X2s in CrossfireX either. Even with those points aside, one can always argue that for the regular gaming masses who can't afford a D5400XS motherboard and two LGA771 CPUs, going with the required NVIDIA 680i SLI/780i SLI or 790i Ultra SLI chipset based boards isn't worth the gains over a CrossfireX setup. Sticking with Intel chipsets is certainly worth a reduction in performance, given the fact that 4870 X2s in CrossfireX are massively powerful anyway.

This is also an interesting article, as NVIDIA told us that the Intel D5400XS motherboard was incapable of supporting 3-Way SLI because it used the nForce 100 MCPs and not the nForce 200 MCPs, the latter of which was a physical requirement for 3-Way SLI to work.
 
If you could afford such a system, you would probably do the same thing

Um hello? This is hardforum. Anyone here who could afford that system would make it lol. I'd make 2 just so my main rig would have a buddy. :D
 