GT300 taped out

A more complete series of articles related to the GT300; since this is all speculation, take it with a kilo of salt. I'm agnostic about video cards: as long as the competition between ATI and Nvidia stays strong and Intel is kept out, consumers benefit in the end.

nVidia's GT300 specifications revealed - it's a cGPU!
http://www.brightsideofnews.com/new...00-specifications-revealed---its-a-cgpu!.aspx

GT300 to feature 512-bit interface - nVidia set to continue with complicated controllers?
http://www.brightsideofnews.com/new...to-continue-with-complicated-controllers.aspx

nVidia's GT300 is "smaller, faster than Larrabee"?

http://www.brightsideofnews.com/news/2009/5/12/nvidias-gt300-is-smaller2c-faster-than-larrabee.aspx

nVidia G(T)300 already taped out, A1 silicon in Santa Clara

http://www.brightsideofnews.com/new...dy-taped-out2c-a1-silicon-in-santa-clara.aspx

And this one, where Theo is basically saying Charlie Tuna is off the mark this time:

nVidia first to market with a DirectX 11 GPU?
http://www.brightsideofnews.com/news/2009/5/8/nvidia-first-to-market-with-a-directx-11-gpu.aspx
 
Uh.. nVidia being first to market with DX11? Huh? ATI already started marketing it months ago with their news releases..

Though I would read the articles.. but that website refuses to load for me..

Though I'd still say BS on the 512-bit memory.. nVidia must be on crack if they think they can make a profit off the 300 series using it.. 512-bit GDDR5 is the equivalent of 1024-bit GDDR3.. now the problem with that is, the fastest GDDR5 in production is only 5GHz.. why would anyone in their right mind waste all that money on a 512-bit memory controller for memory that isn't even close to its maximum potential? GDDR5 has already been pushed past 7GHz.. it's just not in mainstream production yet.. and the 5GHz chips only started production a couple of months ago.. so in the end the question is.. will the GT300 have the old 3.6-4GHz chips, or will it have the new 5GHz chips? And since AMD/ATI designed and pushed the development of GDDR5.. I'd take a good guess that they've already bought out the entire supply of 5GHz chips..

Until there are some actual hard facts, and not a bunch of bogus rumors floating around to make people think nVidia's actually pulled their heads out of their butts.. it's all just rumors.. so don't get your hopes too high when you find out that what's been plastered all over the news isn't even close to the final product... and yes.. I could be wrong.. but doing what everyone is saying nVidia's going to do is a suicide run to try and beat ATI.. aka welcome back the $600+ nVidia cards..
 
With eight 64-bit memory channels you really don't need second-generation GDDR5 chips. Even with "slow" GDDR5 like "QDR 900MHz" you would get 230.4GB/s, which should be enough for the next generation.
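
For what it's worth, here's a back-of-the-envelope sketch of that bandwidth math; the 512-bit bus is the rumored figure and the clocks are just examples, nothing confirmed:

# Rough GDDR5 bandwidth estimate from bus width and base clock
def gddr5_bandwidth_gbs(bus_width_bits, base_clock_mhz, transfers_per_clock=4):
    # GDDR5 is effectively quad-pumped: four data transfers per base clock
    bits_per_second = bus_width_bits * base_clock_mhz * 1e6 * transfers_per_clock
    return bits_per_second / 8 / 1e9  # bits -> bytes -> GB/s

print(gddr5_bandwidth_gbs(512, 900))   # 230.4 GB/s with "slow" 900MHz QDR chips
print(gddr5_bandwidth_gbs(512, 1250))  # 320.0 GB/s with 5GHz-effective chips

So even the slower chips on a 512-bit bus clear 230GB/s, which lines up with the "over 200GB/s" figure quoted further down.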
 
With eight 64-bit memory channels you really don't need second-generation GDDR5 chips. Even with "slow" GDDR5 like "QDR 900MHz" you would get 230.4GB/s, which should be enough for the next generation.

"enough" is very relative. bumping up clock speed could be "enough" to warrant a "next generation" label for some people. I think the more they put in this monster, the better. It's about time we had something revolutionary like the 8800.
 
Uh.. nVidia being first to market with DX11? Huh? ATI already started marketing it months ago with their news releases..

First to market = First in stores for sale.

Though I'd still say BS on the 512-bit memory.. nVidia must be on crack if they think they can make a profit off the 300 series using it.. 512-bit GDDR5 is the equivalent of 1024-bit GDDR3.. now the problem with that is, the fastest GDDR5 in production is only 5GHz.. why would anyone in their right mind waste all that money on a 512-bit memory controller for memory that isn't even close to its maximum potential? GDDR5 has already been pushed past 7GHz.. it's just not in mainstream production yet.. and the 5GHz chips only started production a couple of months ago.. so in the end the question is.. will the GT300 have the old 3.6-4GHz chips, or will it have the new 5GHz chips? And since AMD/ATI designed and pushed the development of GDDR5.. I'd take a good guess that they've already bought out the entire supply of 5GHz chips..

Just because you don't want it to happen doesn't mean it's not possible. Nvidia GPUs have always been bandwidth limited, meaning they could always use more, so this is just the logical progression. GDDR5 has been in production for more than a year now. And I believe there are at least three companies that manufacture GDDR5 (Samsung, Hynix, and Qimonda); I'm sure it's no stretch for nVidia to go to one of them and say "yeah, I need 10 million units of this particular chip". Not like ATI has a blockade on that shit.

Until there are some actual hard facts, and not a bunch of bogus rumors floating around to make people think nVidia's actually pulled their heads out of their butts.. it's all just rumors.. so don't get your hopes too high when you find out that what's been plastered all over the news isn't even close to the final product... and yes.. I could be wrong.. but doing what everyone is saying nVidia's going to do is a suicide run to try and beat ATI.. aka welcome back the $600+ nVidia cards..

A suicide run? What the hell are you smoking? It's called market competition. Yeah, it could be BS like all the G92 rumors that were floating around for months before its actual release, but is it really that much of a stretch for you to comprehend?
 
o_O The power of a CPU in a GPU???
These graphics cards are getting mighty strong.
 
"enough" is very relative. bumping up clock speed could be "enough" to warrant a "next generation" label for some people. I think the more they put in this monster, the better. It's about time we had something revolutionary like the 8800.
Let's see:
-DX11 ...check
-GDDR5 ...check
-Over 200GB/s memory bandwidth ...check
-First new GPU architecture since 2006 ...check
-Double-precision performance 8-10 times higher (depending on clocks) than GT200's ...check
-512 shader units, 89% more than GT200 (270 SP) ...check

GT300 would get that "Next Generation" label anyway. Second-gen GDDR5 would be overkill and would only be there to increase costs.
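
For anyone double-checking that 89% figure, it falls straight out of the shader counts, counting GT200's 30 dedicated double-precision units alongside its 240 SPs (see the explanation further down the thread):

gt200_units = 240 + 30   # 240 SP shaders + 30 dedicated DP shaders
gt300_units = 512        # rumored figure
print((gt300_units / gt200_units - 1) * 100)  # ~89.6% more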
 
o_O The power of a CPU in a GPU???
These graphics cards are getting mighty strong.
Are you talking about the capabilities of a CPU in a GPU? ..because the raw performance of GPUs has been much higher than CPUs' for a long time.
 
First to market = First in stores for sale.
Just because you don't want it to happen doesn't mean it's not possible. Nvidia GPUs have always been bandwidth limited, meaning they could always use more, so this is just the logical progression. GDDR5 has been in production for more than a year now. And I believe there are at least three companies that manufacture GDDR5 (Samsung, Hynix, and Qimonda); I'm sure it's no stretch for nVidia to go to one of them and say "yeah, I need 10 million units of this particular chip". Not like ATI has a blockade on that shit.
..and Micron announced that they will enter the graphics memory market. They are entering with GDDR3, but they have GDDR5 on the way.
 
Excellent news! Getting ready for my Windows 7 build and have about $4k stashed away for it now. Just need Westmere in time for Christmas!
 
I don't really care until I see [H]'s review, and can buy one on the spot.
 
Linked to the front page.

I saw the source from somewhere else but noticed you guys are already discussing it, so here we are. (See, I'm learning... lol)
 
I'm agnostic about video cards: as long as the competition between ATI and Nvidia stays strong and Intel is kept out, consumers benefit in the end.

How exactly do consumers benefit from keeping Intel out of the graphics market? I say bring it. If Intel puts nVidia/ATI to shame with a killer product, that will really put pressure on the market and that seems like it's good for everyone. I'm cheering for Intel in this round (not that I will necessarily buy a Larrabee, but I hope it is competitive at least).
 
I say Intel will fail with the first two releases, if not more, at least for us gamers, but might make a revolution in GPGPU!

However, that's what I say about Intel in graphics; there is no way they are gonna just be up there with ATI and nVidia from the start. If it happens I'm gonna run 10 miles every day for a year!

Anyway, on to the specs: good, good.. and WTF IS NVIDIA THINKING.

What am I thinking about? Well, shader power? Hell no. DX11? No. 512-bit GDDR5? EHH? Go with a narrower bus, 'cause I don't need that 256GB/s; I really don't need 130, to be honest!

Cut down the cost; ATI is going to kill you guys again because of their simple, scalable designs!
Not saying nVidia isn't able to make good cards, but strategy-wise, I think ATI has the edge!

And this is my full opinion in full honesty: ATI is making so much more money with a lower cost per card and the same performance. Just look at that, nVidia, really see what ATI does: smaller die, smaller bus, same performance.
 
I say Intel will fail with the first two releases...

Maybe, or maybe not, but that's not what you should be hoping for. The (essentially) two-vendor performance graphics market has been boring for a while. The most interesting thing to happen recently is the flame war (however one-sided) between nVidia and Intel, going as far as to get the Mythbusters involved.
 
Oh, and given that NV can't even get a die shrink right at present... and has canceled two of their shrinks. Yeah. Go ATI :p
 
Let's see:
-DX11 ...check
-GDDR5 ...check
-Over 200GB/s memory bandwidth ...check
-First new GPU architecture since 2006 ...check
-Double-precision performance 8-10 times higher (depending on clocks) than GT200's ...check
-512 shader units, 89% more than GT200 (270 SP) ...check

GT300 would get that "Next Generation" label anyway. Second-gen GDDR5 would be overkill and would only be there to increase costs.
Since when did the GT200 have 270 SPs? I thought you had mistyped, but you even based your 89% off the 270 SP figure. :confused:
 
Since when did the GT200 have 270 SPs? I thought you had mistyped, but you even based your 89% off the 270 SP figure. :confused:
GT200 has 240 single-precision shaders PLUS 30 shaders exclusively for double-precision work..since those normal shaders can't do it. This is a major weakness in GT200:
-Double-precision performance is poor
-These units can't be used for single-precision work
-One of the reasons why GT200 is so big

Now GT300 would have 512 MIMD cores; no need for a group of shaders that exclusively do double-precision work. This means a huge performance increase:
GTX 295 - 149 gigaFLOP/s
GT300 - 1000-1200 gigaFLOP/s (depending on clocks)

So one GT300 could be 6.5-8 times faster in double-precision calculation than 2x GT200. There's no double-precision work in games..basically it could mean that Nvidia's Tesla products could become pretty popular.
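
In case anyone wonders where a 1000-1200 gigaFLOP/s figure could come from, here's a rough sketch; the assumption that each of the 512 cores can issue one double-precision fused multiply-add (2 FLOPs) per clock, and the 1.0-1.2GHz shader clock range, are my guesses, not confirmed specs:

# Speculative double-precision FLOP/s estimate from the rumored GT300 figures
def dp_gflops(cores, shader_clock_ghz, flops_per_core_per_clock=2):
    # one DP fused multiply-add per core per clock = 2 FLOPs (assumption)
    return cores * shader_clock_ghz * flops_per_core_per_clock

print(dp_gflops(512, 1.0))  # ~1024 GFLOP/s at a 1.0GHz shader clock
print(dp_gflops(512, 1.2))  # ~1229 GFLOP/s at a 1.2GHz shader clock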
 
nV's last gasp? I wouldn't touch this card or anything else from nVidia.
 
This is relevant to my interests.

I hope I won't have to modify or adjust any of my rigs too much to accommodate this beast. I wonder if Nvidia can get it to need only two 6-pin PCI-E plugs.
 
Excellent news! Getting ready for my Windows 7 build and have about $4k stashed away for it now. Just need Westmere in time for Christmas!

Dude, $4k? Wow, if you do that right you'll have a rig of the month, or year, on your hands.
 
I just hope some new games are going to be released to test both green and red's DX11 cards.
 
GT200 has 240 single-precision shaders PLUS 30 shaders exclusively for double-precision work..since those normal shaders can't do it. This is a major weakness in GT200:
-Double-precision performance is poor
-These units can't be used for single-precision work
-One of the reasons why GT200 is so big

Now GT300 would have 512 MIMD cores; no need for a group of shaders that exclusively do double-precision work. This means a huge performance increase:
GTX 295 - 149 gigaFLOP/s
GT300 - 1000-1200 gigaFLOP/s (depending on clocks)

So one GT300 could be 6.5-8 times faster in double-precision calculation than 2x GT200. There's no double-precision work in games..basically it could mean that Nvidia's Tesla products could become pretty popular.
Wow, I have never heard that before. Why such a strange number like 30, though? It seems odd because these things are usually divisible by 8.
 
Wow, I have never heard that before. Why such a strange number like 30, though? It seems odd because these things are usually divisible by 8.

Maybe there are 32 and 2 are spares in case of failure.
It might be a convenient way to bring power consumption/heat down a tad as well.
 
Maybe there are 32 and 2 are spares in case of failure.
It might be a convenient way to bring power consumption/heat down a tad as well.

GT200 has 30 streaming multiprocessors: 8 single-precision ALUs and 1 double-precision ALU per multiprocessor. Hence the 240 SP and 30 DP.
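
That layout also roughly lines up with the ~149 gigaFLOP/s quoted earlier for the GTX 295, as a quick sanity check; I'm assuming each DP ALU does one fused multiply-add (2 FLOPs) per clock and a ~1242MHz shader clock:

# Sanity check on the GTX 295 double-precision figure quoted above
gpus = 2                 # GTX 295 carries two GT200 GPUs
sm_count = 30            # streaming multiprocessors per GPU
dp_alus_per_sm = 1
shader_clock_ghz = 1.242

dp_gflops = gpus * sm_count * dp_alus_per_sm * shader_clock_ghz * 2  # 2 FLOPs per FMA
print(dp_gflops)  # ~149 GFLOP/s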
 
Even if it's taped out, that doesn't necessarily mean it's close to being done...
 
Cut down the cost; ATI is going to kill you guys again because of their simple, scalable designs!
Not saying nVidia isn't able to make good cards, but strategy-wise, I think ATI has the edge!

ATI isn't going to be able to compete without their own big chip. Don't expect anything less than 2 billion transistors and a die area > 400mm^2 for something that can compete with GT300, if these rumors are true.

AtkinS said:
And this is my full opinion in full honesty: ATI is making so much more money with a lower cost per card and the same performance. Just look at that, nVidia, really see what ATI does: smaller die, smaller bus, same performance.

No, they are not. Sure, they have a cheaper chip, but they are not making "so much more money", and that much is obvious from what's happening in the GDDR5 world, which basically resulted in the $99 HD 4770 having an actual MSRP of $109.

Same performance? You mean close performance, and only in games. For anything else, NVIDIA GPUs win by a landslide. Just look at Folding@home numbers, where a single GTX 280 is twice as fast as a single HD 4870. NVIDIA doesn't market their cards for games only, and the architectural specs reflect that.
 
Whoa, if this is all true then this is quite a beast. Wondering about the power requirements also.
I still hope they make a big part/chip, though not as big as GT200; something between 400-500 mm^2 would be "enough" with a 512-bit bus and fast memory. Then if they soon get it revised or shrunk down to a smaller 32nm process with faster clocks, it would be really nice!
But then again, maybe next year new parts are coming, when a 28nm process becomes possible. It will be interesting to see how this compares to AMD's offering. I so like new and future tech. :D
 
Let's see:
-DX11 ...check
-GDDR5 ...check
-Over 200GB/s memory bandwidth ...check
-First new GPU architecture since 2006 ...check
-Double-precision performance 8-10 times higher (depending on clocks) than GT200's ...check
-512 shader units, 89% more than GT200 (270 SP) ...check

GT300 would get that "Next Generation" label anyway. Second-gen GDDR5 would be overkill and would only be there to increase costs.

Wow :eek:, gonna miss my current card already.
 