Nvidia to launch DirectX 10 chip in mid-November

chris.c said:
No, no, PRIME1 has me convinced, they made the GX2 because they have 1337 skills. LOL.
I'm not sure why the GX2 upsets you so much. Do you have stock in ATI? Did the GX2 shrink your epenis? Do you suffer from dual gpu envy?

If ATI could somehow fit a dongle on to one card they would. Their chips are just too big and hot.

There will be a G80 GX2, no problem. (The current GX2 is based off of their mobile chips).
 
Krenum said:
hmm.. putting the card out before Windows Vista is retail seems kind of pointless to me, but as they stated it's more of a symbolic move... I plan on getting one but, like one poster said before, not until I have a copy of Vista in my hands.

More pointless to delay the release of a card for seemingly no reason. If it's ready, then bring it on. The card may have new features, but as always the newer hardware runs the older games faster, allowing for more eye candy. These cards are going to tear through DX9 games, leaving us all standing around saying "WTF was that green blur?"

For all those who have high-res setups and want to play games like Oblivion at max settings, we still require more power, and these cards should deliver. If you're waiting purely for a DX10 part, then of course wait until Vista is out; we'll more than likely be heading for the refresh range at that point, which, as always, will be far superior.
 
PRIME1 said:
There will be a G80 GX2, no problem. (The current GX2 is based off of their mobile chips).

Planet Earth calls you back from planet Nvidia. How can you be so sure about these things? Where does all this knowledge come from? Do you work @ Nvidia? Just because you are one of the most vocal members around doesn't make you right all the time.

G80 GX2, no problemo you say? You just went on bashing current ATI cards for being nuclear reactors when it comes to heat. Take a good look at G80: it needs 2 PCI-E connectors and has almost tripled the transistor count compared to G7x cards. G80 GX2, no problemo :rolleyes:
 
chris.c said:
I think the fact that my post upset you (judging by your personal attacks on me) caused you to miss my point. I'll try to help you understand my thinking.

The 7800GTX is released, which later gets beaten by the x1800 XT. When the 7800GTX 512 beats the ATi, the x1900XTX is released which beats everything. When nvidia releases the 7900GTX which fails to best the x1900XTX, they decide to bolt two of them together for the sake of being #1 (in frames per second anyway) even though they don't profit from that card as much as they would on a single GPU card that could beat a 1900 series ATi (you really can't argue with that). Now, next generation, I feel that if nvidia has a strong single GPU solution that is equal to or greater than the competition, they will have no reason to produce a GX2 type card... in turn backing up my claim that the GX2 was a panic move by nvidia when their back was against the wall and they had no steam left in the single GPU market.

Call me a f@nboy if you want, but chances are I have owned more nvidia cards than you have owned graphics cards, period.

I'm sorry, personal attacks? Where did I "attack you"?
I was simply pointing out that both companies had what you called "panic moves". It's just not a good expression for it. They both want to be "on top" in this market to make more and more money. ATI did it with R580 and NVIDIA did it with the GX2.
I wasn't upset by your post. On the contrary, I was just trying to show you that fanboyism (in this case towards ATI) doesn't let you see the whole picture.
And what exactly does having owned more NVIDIA cards than I have to do with anything? It doesn't make you less of an ATI fanboy, if that's what you were trying to say.

Anyway, this is way off topic, so I'm done with it.

Back on topic, I really want to see some benchmarks of G80. If the rumored specs are true, this is really a killer card.
 
sam0t said:
Planet Earth calls you back from planet Nvidia. How can you be so sure about these things? Where does all this knowledge come from? Do you work @ Nvidia? Just because you are one of the most vocal members around doesn't make you right all the time.

G80 GX2, no problemo you say? You just went on bashing current ATI cards for being nuclear reactors when it comes to heat. Take a good look at G80: it needs 2 PCI-E connectors and has almost tripled the transistor count compared to G7x cards. G80 GX2, no problemo :rolleyes:

It's not triple. The only real confirmation from an NVIDIA rep was that G80 had half a billion transistors. And that's not even 2 x 279 million of the G71. I highly doubt the 700 million transistors rumor.

As for the G80 GX2, it will probably happen in a refresh of 2007. Though I agree it's not likely, the G80 GX2 might be a reality. I'm sure NVIDIA is holding a few cards in their hands and will give us a few surprises after the G80 debuts.
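The arithmetic behind that transistor claim is easy to check. A quick sketch, using only the figures quoted in this thread (279 million for G71, "half a billion" for G80):

```python
# Sanity check of the transistor counts quoted in the thread.
g71 = 279_000_000            # G71 (7900-series) transistor count
g80_confirmed = 500_000_000  # "half a billion" per the NVIDIA rep

ratio = g80_confirmed / g71
print(f"G80 / G71 ratio: {ratio:.2f}x")  # ~1.79x
print(g80_confirmed < 2 * g71)           # True: 500M is less than 2 x 279M = 558M
```

So at the confirmed figure, G80 isn't even double G71's transistor count, let alone "almost tripled".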
 
boomheadshot45 said:
good. $500 for the 8800 GT? That better play every game @ 1152x864 with 4xAA and max settings.

LOL
You must be joking :)
A 7900 GT can handle that res with everything maxed (with the exception of a couple of games).
An 8800 GT will most likely run everything maxed @ 1600x1200. Otherwise, it's not an improvement at all.
 
Bloody funny thread. Everyone attacking each other over something they have yet to experience and, for that matter, know little about.

Card Wars: Attack of the Fanboys
 
My dad can beat up all your dads!!!!! oh em gee I win!!!! reeeeeeeeeeeee!!!!!!!!!!!!! :mad: :mad: :mad: :mad:
 
Silus said:
LOL
You must be joking :)
A 7900 GT can handle that res with everything maxed (with the exception of a couple of games).
An 8800 GT will most likely run everything maxed @ 1600x1200. Otherwise, it's not an improvement at all.

I heard it can only run 640x480.. :p
 
sam0t said:
Planet Earth calls you back from planet Nvidia. How can you be so sure about these things? Where does all this knowledge come from? Do you work @ Nvidia? Just because you are one of the most vocal members around doesn't make you right all the time.

G80 GX2, no problemo you say? You just went on bashing current ATI cards for being nuclear reactors when it comes to heat. Take a good look at G80: it needs 2 PCI-E connectors and has almost tripled the transistor count compared to G7x cards. G80 GX2, no problemo :rolleyes:
As I stated, the current GX2 is based off of their mobile chips. Are you saying there will be no G80 mobile chips? Not to mention that the GX2 is not the first dual-GPU card by NVIDIA; it was done with the 6 series as well. Just because one company cannot do it, why does that make it difficult for another? NVIDIA has been making dual-GPU cards for years now; they are getting pretty good at it.
 
I'm sorry, personal attacks? Where did I "attack you"?
I was simply pointing out that both companies had what you called "panic moves". It's just not a good expression for it. They both want to be "on top" in this market to make more and more money. ATI did it with R580 and NVIDIA did it with the GX2.
I wasn't upset by your post. On the contrary, I was just trying to show you that fanboyism (in this case towards ATI) doesn't let you see the whole picture.
And what exactly does having owned more NVIDIA cards than I have to do with anything? It doesn't make you less of an ATI fanboy, if that's what you were trying to say.

Again, you missed my point. I don't think I can be any more remedial with you, so I won't bother.

If it's ready, then bring it on. The card may have new features, but as always the newer hardware runs the older games faster, allowing for more eye candy. These cards are going to tear through DX9 games, leaving us all standing around saying "WTF was that green blur?"

Exactly. Just think of it like a new DX9 card that's coming out. It will likely be the same leap in performance that the 6800 to 7800 was, and that alone will warrant its purchase (at least for me).
 
PRIME1 said:
As I stated, the current GX2 is based off of their mobile chips. Are you saying there will be no G80 mobile chips?

I think that you are colossally missing the point, PRIME1. No one is saying there won't be mobile G80s or that Nvidia COULDN'T make a dual-chip G80 if they wanted. What we are saying (and correct me if I'm wrong, other posters) is that they may not see the need for it if they aren't getting kicked around a bit by Team Red. They've usually made these cards in the past to have the fastest "single card" on the market, when it was up for debate who had the best board.
 
chris.c said:
Again, you missed my point. I don't think I can be any more remedial with you, so I won't bother.

Great! Less fanboyism is always a good thing.
 
Silus said:
Great! Less fanboyism is always a good thing.

I am biased towards whatever I feel is the best all-around card at any given time, not a specific brand. I have owned 4 cards this generation, 2 ATI and 2 nvidia, so I have a right to argue my opinions based on firsthand experience, unlike some people who go on and on about products they have never even used.
 
Dan_D said:
So could I. My point is that everyone is assuming that a GX2 model of the G80 will be produced. The fact of the matter is they may not, and no one knows for sure when the successor of these cards will come out.

I think it will depend on how ATi's R600 stacks up. If the R600 can't match the G80, then they may do what they did with the 6800 Ultra and simply rely on it for a year's time until their next architecture is ready to release.

Everyone is also worried about the successor of G80 it seems. The fact is that we don't actually know what it will be like or when it will debut. Generally the best thing to do is buy the earliest generation of a new architecture and don't worry about the refresh. Once the refresh occurs it probably won't be worth the upgrade. Then you can wait a year or more, have a pretty fast and current card, and then when the next architectural evolution comes out, grab that. This is the best way to go IMO.

Though I usually upgrade every time something new comes out, many people don't, and I think video cards are something you should adopt fairly early if you look for longevity in a product.

I disagree. I think the refresh right after the debut of a new architecture is the right time to buy. Just look at the power requirements of that evil, power-hungry beast which is the G80. If they were to come out with a refresh based on the 65nm lithography process, then wouldn't that version have lower power requirements? Maybe then the card wouldn't need its own separate nuclear reactor to run it. :p
 
chris.c said:
I am biased towards whatever I feel is the best all-around card at any given time, not a specific brand. I have owned 4 cards this generation, 2 ATI and 2 nvidia, so I have a right to argue my opinions based on firsthand experience, unlike some people who go on and on about products they have never even used.

I didn't say you couldn't express your opinion. I basically said you're just showing your bias a little too much.
Sure, I would need to own the cards to have the experience myself, but I read a lot, especially in-depth reviews, to know that NVIDIA was not being, and I quote:

chris.c said:
killed in the top end market in pretty much every way possible

That's simply not true, and if you read some reviews showing the cards against each other, you would see that a 7900 GTX wins some games and the X1900 XTX wins a few others. Is that "getting killed in pretty much every way"? By your reasoning, the X1950 XTX is also a "panic move", since they were getting "killed in the top end market in pretty much every way possible".
You need to see the whole scope. Both companies are in it for the money, and releasing new products to counter the competition's products is not a "panic move"; it's just competition. And as I said before, no company wins: we, the consumers, win.
 
There's always a market (albeit a small one) for the best money can provide, and both ATi and nVidia are after a slice of that market.

I don't think the GX2 was a knee-jerk decision. First of all, I think the design and testing of the cards goes far deeper than a knee-jerk reaction; I think that to be successful it has to be planned into the product timeline well ahead of its release. I suspect Nvidia are quite good at predicting what is going to happen when it comes to performance wars with ATi and have things covered well in advance.

Also, the GX2 gave way to quad SLI gaming in home-built machines, a goal which I suspect has been a large part of the driving force behind producing the GX2.

The way people reference Nvidia needing to use dual-GPU video cards to beat ATI makes it appear to me as if they're under the impression that ATI were somehow "winning" and that the GX2 was a last-minute act of desperation to beat ATI.

I think that probably stems from f@|\|b0yism (lol) towards ATI and feeling irritated that if you want the most powerful video solution for home PCs, then you really have to go with the GX2 in Quad SLI.

I dunno. I mean, obviously all this is just my opinion, but I just don't think you can have a knee-jerk reaction and suddenly bring a whole new product line into play. I think there's far more planning that goes into it all, and that plan probably extends way into the future; I expect Nvidia already have plans for the G90 range even though the G80 range isn't even out.
 
Frosteh said:
I don't think the GX2 was a knee-jerk decision. First of all, I think the design and testing of the cards goes far deeper than a knee-jerk reaction; I think that to be successful it has to be planned into the product timeline well ahead of its release.
I think that nVidia made the GX2 because their market analysis folks said "Hey, there are people out there willing to drop even more cash on a faster-performing video solution. Look at <whichever companies made Quad-SLi before it became "mainstream">, they are selling it already. We should see if we cannot "standardize it".

Some more market analysis and strategic decisions later the GX2 came out. In fact most business choices are made because the ones making them believe that these choices will increase the company's profits at some point.
 
drizzt81 said:
I think that nVidia made the GX2 because their market analysis folks said "Hey, there are people out there willing to drop even more cash on a faster-performing video solution. Look at <whichever companies made Quad-SLi before it became "mainstream">, they are selling it already. We should see if we cannot "standardize it".

Some more market analysis and strategic decisions later the GX2 came out. In fact most business choices are made because the ones making them believe that these choices will increase the company's profits at some point.

I agree.
They no longer had the fastest single-GPU solution, so they did something about it.
Why not just slam more GPUs on it?
Many purists will say you need to compare a GX2 to CrossFire since it's 2 GPUs vs. 2 GPUs, but most consumers simply don't care about "packaging" and just want the fastest TOTAL solution, which would be Quad-SLI.
 
That's_Corporate said:
Many purists will say you need to compare a GX2 to CrossFire since it's 2 GPUs vs. 2 GPUs,

Well, this purist argues against 2 separate purchases vs. 1 purchase.
 
FYI, there is a lot of info about G80 at DailyTech, and it's being discussed in the "G80 specs revealed" thread.
 