Hot off the presses: GT200 (rumor, labelled for silly people)

GoldenTiger

Fully [H]
Joined
Dec 2, 2004
Messages
29,723
Info sent to me from "Those Who Know" (tm):

GDDR3 on a 512-bit bus @ 2100-2200 MHz (approx. 140 GB/s bandwidth)
GT200 core clock @ 625 MHz
240 SPs (10 clusters of 24)
65nm
120 TMUs
120 TFUs
1 to 1.1 billion transistors
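For what it's worth, the quoted bandwidth figure is consistent with the bus width and effective clock. A quick back-of-the-envelope check (illustrative Python, not from the source):

```python
# bandwidth (GB/s) = bus width in bytes * effective memory clock in GHz
def mem_bandwidth_gbs(bus_bits, effective_mhz):
    return (bus_bits / 8) * (effective_mhz / 1000)

# Rumored GT200 config: 512-bit bus at 2200 MHz effective
print(mem_bandwidth_gbs(512, 2200))  # -> 140.8, i.e. ~140 GB/s as quoted
```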

Unknown release; possibly early June, but it may be held back to later this year depending on ATI's competition.

Yes, I know, it's time to flame this as rumor... just like everyone at G92-time ;).
 
There would definitely be a Step-Up in my future if such a thing appeared in June to mid-July. I'm doubtful, though.
 
Using the current NVIDIA architecture as a guide, we can assume that would also mean 32 ROPs, based on the 512-bit bus. I wish they would get this onto a 55nm process if at all possible.
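To spell out that deduction: on G80-class parts, each 64-bit memory partition carries 4 ROPs, so the ROP count follows directly from the bus width. A rough sketch under that assumption:

```python
# Assumes the G80-era layout: one ROP partition (4 ROPs) per 64-bit memory channel.
def estimated_rops(bus_bits, bits_per_partition=64, rops_per_partition=4):
    return (bus_bits // bits_per_partition) * rops_per_partition

print(estimated_rops(384))  # -> 24, matches the 8800 GTX (G80)
print(estimated_rops(512))  # -> 32, the guess for a 512-bit GT200
```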
 
Yes, I know, it's time to flame this as rumor... just like everyone at G92-time ;).

If only half those specs end up to be the case, that would still be pretty sweet.

Plus, you did call it on the G92, and that kind of hardware for a "cheaper" refresh made less sense than this kind of spec on a new architecture after almost two years.

I do hope they offer more than 512MB of RAM on one of these things in a new card. You would think that by that round of new cards we should (at least) be back to >512MB. lol
 
So is the video card going to require its own support stand? :D

Though some of our CPU heatsinks could probably use it.

If this is true, it had better not have less than 1 gig on the card. We're headed that route anyway, with 1GB becoming mainstream. We may as well do it now.
 
So basically it would be slightly less than a 9800 GX2, but on a single GPU? Would be pretty good... though if those rumoured R700 specs hold true, then this card will be owned.
 
So basically it would be slightly less than a 9800 GX2, but on a single GPU? Would be pretty good... though if those rumoured R700 specs hold true, then this card will be owned.

If the bus is 512-bit and the memory is 1024MB, then this should dramatically surpass the 9800GX2 at higher resolutions (if this is a single-GPU config).
 
I'm pretty sure it's going to be 55nm =p

And I heard the release was July. But it's all just rumors lol.
 
If this is true, why is NVIDIA still stuck on GDDR3? FFS, move on already. ATI is planning to use GDDR5. But a 512-bit memory bus sounds good, so I'm not disappointed there.

Of course, these are rumours, so we don't know if they're true; time will tell.
 
If this is true, why is NVIDIA still stuck on GDDR3? FFS, move on already. ATI is planning to use GDDR5. But a 512-bit memory bus sounds good, so I'm not disappointed there.

Of course, these are rumours, so we don't know if they're true; time will tell.
GDDR5 won't be available for a while; that's why the R700 won't be using it until later batches.
 
If this is true, why is NVIDIA still stuck on GDDR3? FFS, move on already. ATI is planning to use GDDR5. But a 512-bit memory bus sounds good, so I'm not disappointed there.

Of course, these are rumours, so we don't know if they're true; time will tell.

Cost... that's the reason. GDDR3 is much cheaper and more widely available, and cheap RAM + an expensive chip = a decently priced card. With a larger bus, the clock-speed difference doesn't hurt as much.
 
If this is true, why is NVIDIA still stuck on GDDR3? FFS, move on already. ATI is planning to use GDDR5. But a 512-bit memory bus sounds good, so I'm not disappointed there.

Of course, these are rumours, so we don't know if they're true; time will tell.

There's more to a video card than the number in the name of its memory. GDDR5 means shit if you can stick a bigger bus on cheaper memory and end up with more bandwidth at a lower cost.
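The point generalizes: total bandwidth is just bus width times effective clock, so a wide GDDR3 bus can match a narrower, faster GDDR5 one. Illustrative numbers (the GDDR5 config here is hypothetical, not a rumored spec):

```python
def bandwidth_gbs(bus_bits, effective_mhz):
    return (bus_bits / 8) * (effective_mhz / 1000)

# Rumored GT200: cheap GDDR3 on a wide 512-bit bus
print(bandwidth_gbs(512, 2200))  # -> 140.8 GB/s
# Hypothetical GDDR5 on a 256-bit bus would need ~4400 MHz effective to match
print(bandwidth_gbs(256, 4400))  # -> 140.8 GB/s
```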
 
Realistic expectations, I'd say.

QFT, those specs are well within the realm of possibility and this is certainly very exciting news. We've been riding the G80/G92 bus for far too long! Hopefully ATI has a competitive product around that time as well to keep prices somewhat in check and avoid a paper launch. And in fact, if they do have a competitive product, it would be a tough choice. A big plus in ATI's favor for me is that I can run Crossfire on an Intel chipset. (ARE YOU LISTENING nVIDIA!!!)
 
If those specs are true, that would be an insane card. I can see Tri-SLI making some insane FPS in anything that scales even halfway decently.
 
If those specs are true, that would be an insane card. I can see Tri-SLI making some insane FPS in anything that scales even halfway decently.
I think the specs look very realistic, if not a bit tame. 512-bit is not that big of a deal, and 240 SPs is not even twice that of the current 8800 GTX and most G92 cards.
 
My guess: $600-700. Hope I'm wrong, but that's what each of my GTXs cost me.
 
240 SPs is less than what the 9800 GX2 has. Same goes for the TMUs.
The 9800 GX2 has to deal with SLI issues, though. This new card will be able to fully utilize everything in 100% of games. The GT200 will easily beat the 9800 GX2 if these specs are accurate.
 
Cannondale:
I know that :] I was just saying that those specs aren't insane. Considering how badly memory bandwidth and usable VRAM are bottlenecking the 9800 GX2 at the moment, and with the other SLI-related issues removed... this card could be sweet with those specs.
 
If these specs are true I'm definitely in for one, but I'll wait on its revision in 2011 Q1.
 
240 SPs is less than what the 9800 GX2 has. Same goes for the TMUs.

That's just about the dumbest comparison you could make with regard to the specs of this single GPU. You know full well the GX2 is a single-PCIe-slot SLI solution with two GPUs on two PCBs... why would you even bother to compare specs that way?

That's like comparing two 9600 GTs in SLI to one GTS because the setup has the same amount of SPs, yet it somehow shits all over the GTS in real-world performance...
 
Info sent to me from "Those Who Know" (tm):

GDDR3 on a 512-bit bus @ 2100-2200 MHz (approx. 140 GB/s bandwidth)
GT200 core clock @ 625 MHz
240 SPs (10 clusters of 24)
65nm
120 TMUs
120 TFUs
1 to 1.1 billion transistors

It looks like "Those Who Know" are pretty good at consolidating the most popular rumours and presenting the result as inside info. This stuff has been floating around a few other forums for a couple of days now.

Personally, I'm not really getting excited at the thought of a bigger, hotter GPU that runs COD4 at 120fps. At this stage hardware is a little ahead of software, so developers need to catch up. I want to see what G92b looks like... if they can up the clocks a bit and come in cooler and quieter, that would be something to get excited about. I would also bite on a 55nm-based GX2 card for ~$500.
 
It looks like "Those Who Know" are pretty good at consolidating the most popular rumours and presenting the result as inside info. This stuff has been floating around a few other forums for a couple of days now.

At least this time around he was smart enough to present his "inside information" as rumor and speculation.
 
Very exciting news indeed. I would love to see this come to fruition and see how this monster performs!
 
That's just about the dumbest comparison you could make with regard to the specs of this single GPU. You know full well the GX2 is a single-PCIe-slot SLI solution with two GPUs on two PCBs... why would you even bother to compare specs that way?

That's like comparing two 9600 GTs in SLI to one GTS because the setup has the same amount of SPs, yet it somehow shits all over the GTS in real-world performance...
I did reply to this post 18 minutes before you made it :].
 
So basically it would be slightly less than a 9800 GX2, but on a single GPU? Would be pretty good... though if those rumoured R700 specs hold true, then this card will be owned.

How so?
This card should be on par with or faster than a single GX2. The HD 4870 is rumored to be 50% faster than a single HD 3870, which the single GX2 already is. Only the rumored HD 4870 X2 could hold a candle to this "GT200" or whatever they will call it, and that's without Crossfire woes in mind.
 
Specs are realistic, but the price would be out of this world!

Well, we'll just see the high end return to the $600-650 mark, especially if AMD/ATI doesn't provide competition.
 
How so?
This card should be on par with or faster than a single GX2. The HD 4870 is rumored to be 50% faster than a single HD 3870, which the single GX2 already is. Only the rumored HD 4870 X2 could hold a candle to this "GT200" or whatever they will call it, and that's without Crossfire woes in mind.
Well, going by those rumoured specs for the HD 4870 X2 and HD 4870, it should be more than a 50% increase: roughly twice the shader performance and 2.7 times the texture-mapping performance of the HD 3870 counterparts, about 35% more ROP performance, and 95-100% more memory bandwidth.
 
Well, going by those rumoured specs for the HD 4870 X2 and HD 4870, it should be more than a 50% increase: roughly twice the shader performance and 2.7 times the texture-mapping performance of the HD 3870 counterparts, about 35% more ROP performance, and 95-100% more memory bandwidth.

The GX2 is often over twice as fast as a single HD 3870, so either this HD 4870 is a powerhouse or it will have a hard time beating the GT200, which, if these specs are true, will be at least as fast as a single GX2. And sometimes even faster than a GX2, since SLI woes are out.
 
Thing is, the HD 4870 isn't going to fight the GT200; the HD 4870 X2 will... and it will have 1.9 times the shader performance of the HD 3870 X2 and 2.55 times its texture-mapping performance, etc.
 
QFT, those specs are well within the realm of possibility and this is certainly very exciting news.
It's not news. It's an unsourced guess from a non-industry member of a random hardware forum. You shouldn't be any more excited about this than you would be if I told you that, trust me, ATi's next card is going to have its own CPU and PSU onboard.
 
Thing is, the HD 4870 isn't going to fight the GT200; the HD 4870 X2 will... and it will have 1.9 times the shader performance of the HD 3870 X2 and 2.55 times its texture-mapping performance, etc.

So you think AMD will, once again, compete in the high end with a dual-GPU card?
That may be, but the rumored specs of the HD 4870 do not point to that. They do intend to take on the high end with the HD 4870.
 