Why No 512-bit Chip for 9800GX2, GTX?

Caliche

I just need some insight. I thought the next-gen cards after the 8800 series were supposed to have a 512-bit chip accompanied by a billion transistors, an end-all killer of video cards. What the heck? The rumors and specs are flying, and there's no 512-bit chip, just a 256-bit one. Hmm. From what I was reading over the past year or two, there must have been a major technology hurdle in getting to 512-bit and a billion transistors. Anyway, do any of you have an educated opinion on this? It seems like Nvidia stepped backwards in a way. Or maybe that's why performance is so disappointing with the new cards, and it's not just drivers. Just a thought.
 
Most likely the cards would not benefit from having such a huge memory bus. The 256-bit bus doesn't really bottleneck the cards, and sticking with it helps keep production costs and heat output down.

Look at the 2900XT for example. It had the 512-bit bus, but the 3870 performed just as well, if not better on occasion. All the 512-bit bus did for the 2900XT was make it hotter and a power hog.

The GT200 will probably use a 512-bit bus if Nvidia sticks with GDDR3. If they use GDDR4/5, they could probably get away with 384-bit to keep costs down.
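That tradeoff is easy to sanity-check, since peak bandwidth is just bus width (in bytes) times effective data rate. A rough sketch in Python, with clock figures that are typical for the era rather than confirmed GT200 specs:

```python
# Peak memory bandwidth in GB/s = (bus width in bytes) * (effective data rate in GHz).
# The clock figures below are illustrative, not confirmed GT200 specs.

def bandwidth_gbps(bus_bits, effective_mhz):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return (bus_bits / 8) * effective_mhz / 1000

print(bandwidth_gbps(512, 2000))  # 512-bit GDDR3 @ 2.0 GHz effective -> 128.0 GB/s
print(bandwidth_gbps(384, 2800))  # 384-bit GDDR4 @ 2.8 GHz effective -> 134.4 GB/s
print(bandwidth_gbps(256, 4000))  # 256-bit GDDR5 @ 4.0 GHz effective -> 128.0 GB/s
```

Faster memory on a narrower bus lands in the same bandwidth ballpark, which is exactly why GDDR4/5 would let them drop to 384-bit or even 256-bit.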
 
Maybe Nvidia learned from the R600, which had so much bandwidth that was rarely used by any game. The GeForce 9 series is just a revision; maybe we will see 512-bit cards in GeForce 10.
 
I'm more disappointed that Nvidia didn't deliver the double-precision FPUs it promised last year. Maybe the hardware is there in the G9x series, but it's not enabled.

It wouldn't really do much for games, but it would have been an incredible boost for GPGPU applications. I guess that feature is now delayed until "GT200." :rolleyes:
 
I think the 9600GT is proof that the 256-bit bus is bottlenecking the higher-end cards. It gets very close to the 8800GT despite having half the shaders, because it has the same 256-bit bus to go with that halved shader count.

Further evidence is, of course, the 9800GTX with its uninspiring performance.
 
Yeah, I expected at least a 512-bit bus with the new Nvidia cards, but it seems there won't be any. :( The GTX is still faster than the 8800GTS 512, and I think that's because of memory bandwidth; it's bottlenecking the GTS.
 
NV chips need more memory bandwidth than ATi ones; this can be concluded by comparing R600 and RV670 against G80 and G92.

There are two schools of thought on why the 9800GX2 and GTX aren't anything higher than a 256-bit memory bus.

Some say the chip isn't architecturally capable of running more than 4 x 64-bit memory channels. There isn't a shred of evidence confirming this point of view other than to say "look at the cards out now using the G92 chip." It may be true, but it can't be confirmed with any evidence available to us on the internet.

The other school of thought is that they're not using a bigger memory bus in order to save money on overall production. You use a third fewer memory chips, you save on the number of PCB layers you need, and you save on overall power requirements.
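For what it's worth, the chip-count saving is simple to sanity-check. Assuming 32-bit per-chip interfaces, which was standard for GDDR3/GDDR4 of that generation, the bus width dictates the minimum number of memory chips on the board:

```python
# Bus width fixes the minimum memory chip count, assuming 32-bit chip interfaces
# (standard for GDDR3/GDDR4 of this generation).
CHIP_WIDTH_BITS = 32

for bus_bits in (256, 384, 512):
    chips = bus_bits // CHIP_WIDTH_BITS
    channels = bus_bits // 64  # 64-bit memory channels, per the first theory above
    print(f"{bus_bits}-bit bus: {chips} chips, {channels} x 64-bit channels")

# 256-bit bus: 8 chips, 4 x 64-bit channels
# 384-bit bus: 12 chips, 6 x 64-bit channels
# 512-bit bus: 16 chips, 8 x 64-bit channels
```

Going from 384-bit down to 256-bit drops you from 12 chips to 8, which is where the "third fewer memory chips" figure comes from, before you even count the extra PCB layers a wider bus needs for routing.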
 
The other school of thought is that they're not using a bigger memory bus in order to save money on overall production. You use a third fewer memory chips, you save on the number of PCB layers you need, and you save on overall power requirements.
Ding ding ding! My guess would be that with little competition from ATi, nVidia is putting out "decent" cards that have a great profit margin.
 
Maybe they will surprise us all with a 9800 Ultra with 1GB and a 512-bit bus. That thing would be fast! I was hoping the 9800GTX would fit that bill, but I guess not.
 
Aside from profit margins, I think Nvidia chose to stick with the 256-bit interface after learning from ATi's mistake. Going to a 512-bit interface was certainly a big reason why the 2900XT was a hungry, inefficient card. By the time ATi realized this, it was probably too late, as the GPU had already taped out.

And now that it's obvious the 9 series is nothing but a profit series, there's no reason to burden the 9800xxx with higher power consumption and higher production costs for only marginal performance gains.
 
Or just keep the 384-bit bus that the G80 GTX and Ultra use. No need for 512-bit at the moment, and 256-bit is too low.
 
Most likely the cards would not benefit from having such a huge memory bus. The 256-bit bus doesn't really bottleneck the cards, and sticking with it helps keep production costs and heat output down.

Look at the 2900XT for example. It had the 512-bit bus, but the 3870 performed just as well, if not better on occasion. All the 512-bit bus did for the 2900XT was make it hotter and a power hog.

The GT200 will probably use a 512-bit bus if Nvidia sticks with GDDR3. If they use GDDR4/5, they could probably get away with 384-bit to keep costs down.
The 256-bit bus does bottleneck the high-end cards. That's why you see the 8800GTX beating the 8800GTS 512MB in most high-resolution situations, even though it has fewer TMUs and is clocked lower.
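To put numbers on that, here are the two cards' peak bandwidths worked out from their published bus widths and GDDR3 memory clocks (a quick sketch; GDDR3 is double data rate, so the effective rate is twice the memory clock):

```python
# Peak bandwidth from published specs; GDDR3 transfers on both clock edges.
def bandwidth_gbps(bus_bits, mem_clock_mhz):
    return (bus_bits / 8) * (mem_clock_mhz * 2) / 1000

print(bandwidth_gbps(384, 900))  # 8800GTX (384-bit, 900 MHz):     86.4 GB/s
print(bandwidth_gbps(256, 970))  # 8800GTS 512 (256-bit, 970 MHz): ~62.1 GB/s
```

The G92 GTS gives up nearly 30% of the GTX's bandwidth despite its higher memory clock, which fits the pattern of it falling behind at high resolutions.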
 
Aside from profit margins, I think Nvidia chose to stick with the 256-bit interface after learning from ATi's mistake. Going to a 512-bit interface was certainly a big reason why the 2900XT was a hungry, inefficient card. By the time ATi realized this, it was probably too late, as the GPU had already taped out.

And now that it's obvious the 9 series is nothing but a profit series, there's no reason to burden the 9800xxx with higher power consumption and higher production costs for only marginal performance gains.

People don't avoid buying video cards because they use too much power (unless you're running an HTPC); people buy video cards based on performance. The 8800GTX performed better than the 2900XT in every situation that required AA and most situations without, except of course 3DMark06. The fact that the 2900XT burned 50-60 more watts than the 8800GTX was just sugar on top of that card's failure.

Power consumption isn't why NV isn't using a bigger bus with more memory; it's the cost-of-production / performance ratio. If power consumption were a factor, and they did want to tailor for it, they would use GDDR4.
 
They're looking to maximize their profit. I can't say I blame them, as people will buy just about anything even when it doesn't work properly.
 
Short answer: poor yields. That's why you still see 8800 Ultras going for $899 two years out.
 
512-bit might be overkill, but they should use 384-bit. You can see that 256-bit is a bottleneck: that's why, despite the G92 GTS and GT having higher clock speeds and 112/128 SPs, the GTX and Ultra are still on top at higher resolutions with lower clock speeds.
 
If they use GDDR5 with the GT200, then there's no reason it couldn't stay on a 256-bit memory bus. Samsung delivered 2.5GHz (5.0GHz effective DDR) test samples to Nvidia and AMD half a year ago. A 256-bit bus at 5.0GHz effective gives 160 GB/s (compared to the HD2900 XT with its 512-bit bus and 105.6 GB/s).
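The arithmetic in that comparison checks out; a quick verification:

```python
# bandwidth (GB/s) = (bus width in bits / 8) * effective data rate in GHz
print((256 / 8) * 5.00)  # 256-bit @ 5.0 GHz effective (GDDR5 samples) -> 160.0 GB/s
print((512 / 8) * 1.65)  # 512-bit @ 1.65 GHz effective (HD2900 XT)    -> ~105.6 GB/s
```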
 