gt300 spec leaked

wow. a heck of a gamble on nvidia's part. dropping down to 40nm manufacturing and pretty much reinventing the wheel according to this article. i guess their only design retread from this generation will be borrowing ati's narrow 256bit memory bus plus gddr5 for lower costs and high bandwidth. good thing they have an in with game developers to make sure their new tech works before releasing it into the open market.
 
I expect that will be a scorcher of a card. But I want to see what ATI gets out as well. Fun times!
 
Could this have the same impact in 2010 that the 8800GTX did when it was released?
 
wow. a heck of a gamble on nvidia's part. dropping down to 40nm manufacturing and pretty much reinventing the wheel according to this article. i guess their only design retread from this generation will be borrowing ati's narrow 256bit memory bus plus gddr5 for lower costs and high bandwidth. good thing they have an in with game developers to make sure their new tech works before releasing it into the open market.

for the flagship card, they might do a 512-bit bus and gddr5
makes sense, they're tripling the computing power, might as well do the same with the bandwidth
 
I love next gen tech as much as the next guy, but I just don't see a reason for this card quite yet. what exactly can I not play right now to need another $600 card purchase? I mean, I know I will get one as soon as it comes out, but I will still question myself and I hate doing that :) I hope some new games come out that are worth playing and will use this card to its potential
 
All the power may come in handy when dx11 games finally come out, future-proofing. But do enthusiasts need a reason to have more power? :cool:
 
I love next gen tech as much as the next guy, but I just don't see a reason for this card quite yet. what exactly can I not play right now to need another $600 card purchase? I mean, I know I will get one as soon as it comes out, but I will still question myself and I hate doing that :) I hope some new games come out that are worth playing and will use this card to its potential

well the card's not for you :p it's aimed at the guy with the 8800ultra saying "i don't see a reason for gtx 285 quite yet. what exactly can I not play right now to need another $600 card purchase?"
 
Man, 2560x1600 on a single card would be awesome :D Good find OP
 
the GTS250 is a 9800GTX rebadged

Actually it's a 9800GTX+ which uses less power and has a die shrink and a shorter PCB.
Otherwise, it's the same card.

8800GT = 9800GT = GTS150

9800GTX+ = GTS250

Yeah, NVIDIA needs to step up its game w/ new tech or just not rename the same exact cards.

But GTS250 sounds better than 9800GTX++, lol.
 
i hope both ati and nvidia release monsters that perform top notch and almost the same, that way we get lower prices and more powerful cards lol
 
I'm just looking for 60fps min @ 1920x1200 w/8xAA 16xAF
Crysis
Crysis: Warhead
Operation Flashpoint: Dragon Rising
World in Conflict
Company of Heroes
Rage
Battlefield: Bad Company 2
Formula 1 (under development by Codemasters)
Need For Speed: Shift
COD:MW2
DOOM4

All while using Nvidia's new 3D glasses system.
 
Interesting, though I'm curious about its clock speeds, memory bus, bandwidth, and what type of GDDR it's using.
 
Interesting, but far-fetched. In fact, just as far-fetched as all this "128-bit CPU" jazz, this "Larrabee will pwn all" jazz, and this "1 master chip and 4 slave chips" jazz. Are you serious? The next thing we're gonna see is GPUs with L1, L2, L3, and L4 caches, SSE5, and dual PhysX-dedicated processors at post-1GHz speeds. Duh.

We're prolly looking at 320-384 stream processors on a 448-bit bus with GDDR5 with a few new instructions being added. It would be dumb for Nvidia to go with 500+ stream processors running an entirely new architecture with a 512-bit GDDR5 memory subsystem. The enthusiast market doesn't have as much money to spend in the current economy, and to see this card produced would mean consumer prices back where the 8800 Ultra came from: $700+ territory. WHY would either company do that? They aren't producing as many of their current trump cards (4890, gtx 275) for this same reason, and these cards are resetting the bang vs buck level.
 
Think nvidia are about to open up an almighty can of whoopass here.

Probably been working on this a long time whilst they've been milking G92 :D
 
well the card's not for you :p it's aimed at the guy with the 8800ultra saying "i don't see a reason for gtx 285 quite yet. what exactly can I not play right now to need another $600 card purchase?"

I had an 8800 ultra too :) I think I have a problem..

I just hope there are games that actually utilize the card to its full potential, if it really does get manufactured as this beast
 
it's going to be funny seeing people with cpus that are already holding back their current video cards buying this.
 
I'm just looking for 60fps min @ 1920x1200 w/8xAA 16xAF
Crysis
Crysis: Warhead
Operation Flashpoint: Dragon Rising
World in Conflict
Company of Heroes
Rage
Battlefield: Bad Company 2
Formula 1 (under development by Codemasters)
Need For Speed: Shift
COD:MW2
DOOM4

All while using Nvidia's new 3D glasses system.

I think you're going to need 120FPS min. They use 120Hz screens for a reason: they need 120 redraws, 60 per lens. So sadly the glasses would be great, but for any crazy games it could end up pretty bad.
 
This might make me upgrade from my G92 architecture.... Oh wait, that money has to go to a house :(
 
This looks like a joke.. Doubt it is real; it would have to be a stupidly massive chip, with less than stellar clock rates and rather brutal harvesting needed.

Not to mention, BSN? BS News?
 
Holy shit, a true G80 successor, if true. Sign me up for two. If they aren't priced stupidly, ie more than 400 each.
 
Could this have the same impact in 2010 that the 8800GTX did?

I felt that impact in November 2006...still happy about it...still using it happily...no complaints...not happy with their current pricing, though, compared to how much I paid the week it came to market. XD

If the 300 series is a true successor to the G80 I am going for it...
 
I am one of the "My 8800 still runs everything" guys. Am I angered I paid $500 for the card back in 06? Nah.

But this GT300 shows leaps and bounds of GPU evolution here. Sweeney really was right.

When do we program Skynet?:D:D:D
 
it's going to be funny seeing people with cpus that are already holding back their current video cards buying this.

Looks like it could be very cool. I agree; with GPUs already being held back by CPUs today, will we really be able to use the processing power?
 
Holy shit, a true G80 successor, if true. Sign me up for two. If they aren't priced stupidly, ie more than 400 each.

it's nvidia you are talking about.. :p

when did they ever release a high-end card lower than 400 at launch :D
 
I got a feeling this is going to be what I buy for my comp. Wow, technology is sure getting slow. I used to buy a whole new comp every 4 years, but this time my q6600 still doesn't seem to need replacing.
 
I had already planned and budgeted to get two of whatever these will be when launched... this news is just enthralling. Though the cynic in me wants to point out past instances where vast leaps were promised only to yield marginal or even evolutionary increases, I think we all know that a fundamental industry-wide architectural change is inevitable. Maybe nVidia will simply be the first to bring it to market.
 
Sucks to be Nvidia right now. ATI just set all of their scientists, engineers, and janitors to 65 hour work weeks.
 
I had already planned and budgeted to get two of whatever these will be when launched... this news is just enthralling. Though the cynic in me wants to point out past instances where vast leaps were promised only to yield marginal or even evolutionary increases, I think we all know that a fundamental industry-wide architectural change is inevitable. Maybe nVidia will simply be the first to bring it to market.

Idk about that decision. The 5 series was the first to adopt DX9, and those cards were pieces of crap, especially when the 6 series was released 6 months later.
 
I had already planned and budgeted to get two of whatever these will be when launched... this news is just enthralling. Though the cynic in me wants to point out past instances where vast leaps were promised only to yield marginal or even evolutionary increases, I think we all know that a fundamental industry-wide architectural change is inevitable. Maybe nVidia will simply be the first to bring it to market.
2 of these would have you so cpu-bound that it's not even funny. You would be better served with a better cpu, more memory, and just 1 of these if the specs are true. 2gb of system memory and anything other than i7 would be silly on 2 of these beasts. Remember the Phenom 2 article on here that used 3 8800gtx cards in SLI, where the i7 just beat the shit out of Core 2 and Phenom 2 cpus? Well, this new card looks to be stronger than 3 8800gtx in SLI, so do the math.
 
2 of these would have you so cpu-bound that it's not even funny. You would be better served with a better cpu, more memory, and just 1 of these if the specs are true. 2gb of system memory and anything other than i7 would be silly on 2 of these beasts. Remember the Phenom 2 article on here that used 3 8800gtx cards in SLI, where the i7 just beat the shit out of Core 2 and Phenom 2 cpus? Well, this new card looks to be stronger than 3 8800gtx in SLI, so do the math.

Yeah, don't look at my current rig as being representative. I've got an i7 920 sitting in my closet waiting to be OC'ed. Just sold the guts of this rig to my gf; I just need to rip it all out and put it in her CM690. I figured the 920 OC'ed to the 4ghz neighborhood should be enough for whatever they're offering next. So with an empty case, a pile of money, and games screaming to be played, why not buy a couple of these? If I'm not mistaken, that's what [H] is all about.
 
Do you have a link to that article? I've got to show it to a friend of mine
 
it's going to be funny seeing people with cpus that are already holding back their current video cards buying this.

cpus do NOT hold back my gaming; it is all VideoCard at my res, 2560x1600. A $1500 eXtreme processor vs a $99 budget Intel C2D makes very little difference at this res. I have seen a million benchmarks at my high res and it is 95% VideoCard
 
Weren't there some tests that showed that the i7 CPUs gave a huge FPS boost over all the Core2 CPUs even at the highest resolutions? I'll see if I can find the link.
 