NVIDIA Roadmap Outline for 1H08

All I know is that another product cycle is coming around and it appears I still don't have a reason to upgrade from a card I bought in 2006.

All I know is that I bought a card with almost twice the performance for half the money of my old one. Good times for the mid-range, bad for the high end.

Anyway, if the cycle continues it's not good for anybody. Anybody else remember the stagnant state of development in the GeForce 3/4 and Intel Pentium 2/3 days? But hey, this is what the fanboys dream of: one king on the throne. Yeah, that's pure gold for the consumer :(
 
Egads, now they start changing the names. What a confusing mess of naming they have made.

On the actual content: no excitement there. I guess this is what happens when you release a standout product that goes largely unanswered by your competition; you sit on your ass.

Seriously, the 8800GTX was released in 2006 and still the only way to really beat it is SLI, whether with two cards in two slots or with one of those dual-cards-stuck-together GX2 abominations.
 
So, to sum things up a bit:

The 9800 GX2 looks to be SLI of 8800 GTS cards (2x 65nm GPUs, 2x 512MB = 1GB, 2x 128 stream units = 256). The quoted "at least" 30% over the 8800 Ultra could fit with this, though the number is a bit low, so maybe they'll be clocked a bit lower than the 8800 GTS.

Next in the speed bin is the 9800 GTX. They say it's slower than the GX2 (of course) and should be faster than the 8800GT/GTS. So we're either looking at a plain rename, 8800GTX -> 9800GTX, especially since this is the only one of the new cards that supports triple SLI, or at best a slightly overclocked 8800GTS. Let's face it: if the GX2 is 30% over the Ultra, and the current 8800GTS is pretty close to it, this one won't be a much-improved version (so you can forget 512-bit memory controllers; maybe a 100-200MHz memory speed bump at most!)

Next in line is the 9800GT. No info at all... Fishy, isn't it? Going by the two cards above, things look fishy enough that an 8800GT -> 9800GT renaming pops to mind. I mean, face reality: the 8800GT 512, 8800GTS 512, 8800GTX and 8800 Ultra are all within some 10-15% of each other. Once you add a 9800GTS and a 9800GT, do you really think anything more can be expected? Even if we're looking at a 40% performance spread across the range, it's hard to expect much from THIS nVidia.

And so we're left with the 8800GT/GTS either being tweaked and renamed to 9800GT/GTX (and quickly pulled from the shelves in their current form) or JUST being renamed...

And then we have the 9600GT, which is already heading to the lower end, possibly replacing the 8800GT 256MB. The new PCB design could just as well be explained by the news that nVidia is "suggesting" partners reduce PCB layers to make the cards cheaper and more competitive with the HD38xx cards.

I won't even bother with the 8800GS, since I think it's not even a shrink. I'd say they are selling off old G80 inventory, and possibly old stock of GPUs with minor defects that can still be used by disabling a few units and underclocking them (the memory configs point in that direction).

My bet would be:
9800 GX2 = 2x 8800GTS
9800 GTX = slightly tweaked 8800GTS 512MB (with a 1GB version possibly coming later)
9800 GT = more or less a renamed 8800GT 512MB, maybe a few MHz boost
9600 GT = slightly lower/higher clocked 8800GT 256MB, depending on HD3850 pricing/sales. EDIT: coupled with cheaper production boards, of course :p
8800 GT/GTS EOL in all currently available versions as soon as the 9800s are out
8800 GS EOL within 2 months, once they sell off those old G80 GPUs
8600 GTS and lower cards will hopefully stay the same cards they are :p
 
What a mess. Now I know I will wait until 8-core 8GHz CPUs and 8-core GPUs with 400GB solid-state drives become a reality.

No reason for a new build at all now or in the near future.:(
 
For those waiting for a new architecture: why do you think a new one would come?
It seems you are either new to this or never really bothered to read the reviews of new cards. New architectures appear when:

1) The previous one was a failure, and thus new cards need to be based on something better. This means a shorter time frame between new architectures.

2) The process goes as usual: the current architecture is successful and can serve as the base for, at most, the next couple of generations, while R&D efforts continue in the background on something new that should debut in 2-3 years.

You forgot about

#3) New DirectX version....
 
I agree. I think they are holding back until ATI gets their act together (if they do) and releases a competing GPU.
 
Well, in my previous post I mentioned that ATI got their stuff right.

The 3870 will probably be one of the most powerful cards in 2008.

Please read some lines carefully.

There's CrossFire. That's the one thing AMD/ATI is going for, not a single card; understand it, AMD has stated it several times.

With lower-clocked video cards they don't need to push the limits of the architecture or anything like it, and they don't need killer memory to get the bandwidth. The only downside is that some games don't support CF very well. I've been borrowing a 3870 today, and I can only say:
CF is amazing. It sometimes actually gives over a 100% performance increase, which in theory it shouldn't; it gave me as much as 103%, though that might just be my quick head calculations being wrong...

But a minimum of 85% as I've seen, and mostly a 95% performance increase, makes it more bang for the buck than any nVidia cards existing at this date.

ATI can do it with CF; Intel chipsets support it, and AMD chipsets support it.
I've not seen people buying nVidia chipsets a lot lately either.

I'm kinda excited about CrossfireX.
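Since scaling percentages like these are easy to mix up, here's a quick sketch of how such an increase is computed; the frame rates below are made-up placeholders for illustration, not real benchmark numbers.

```python
def scaling_increase(single_fps, dual_fps):
    """Percent performance increase of a dual-GPU setup over a single card."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical frame rates for illustration only (not measured data).
single = 40.0
dual = 78.0
print(f"CrossFire scaling: {scaling_increase(single, dual):.0f}% increase")

# Anything above 100% would mean the second card more than doubled the
# frame rate; in practice that usually points to measurement noise or a
# CPU/driver bottleneck in the single-card run, not "super-scaling".
```

By this formula a perfect doubling is exactly a 100% increase, so a 103% reading is almost certainly quick-math error or noise.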
 
If the 9800GTX is indeed faster than that 9800GX2 abomination, I'm upgrading from my 8800GTX. If it's 50% faster, I'll do it, because a 65nm processor will be much more OCable than my 8800GTX (which hardly is at all). I hope that will be the new beast for a while.

If they bring out a GTX in Feb or March, it will be AT LEAST 6 more months before their new G100 or whatever flagship card comes out. I can't wait that long.

C'mon nV! Make it FAST.
 
Is it 2006 again? This feels a lot like the 7900 series.

And if this holds true, ATI will release a new "x1950" that bests the "7900" series in both speed and image quality.

Then nvidia brings out the "arch" and we have a complete repeat of the last 2 years.

Yay. :rolleyes:

Color me unexcited.
 
I get sick of people twisting this whole thing around and trying to blame ATI for it, when it's Nvidia who is the one doing it, and they are the very ones you should blame.

They are such a wonderful company; they are milking everything they can from their customers.

The ATI cards are NOT that much slower. Sure, they aren't top speed like the new GTS or GTX, but they are definitely close enough to the GT in speed. ATI's CrossFire also works better than Nvidia's SLI, so I wouldn't be surprised to see their dual-GPU cards do better than Nvidia's dual-PCB ones. At least ATI is less expensive.
 
Exactly. For all those that wanted AMD/ATI to go down... the consequences are being seen at Intel and NVIDIA. Why release something 10x faster when the competition is releasing something 2x faster? Keep something in the back just in case, I guess.

Prices had better be equal to or lower than the current lineup's, or I don't see this doing much. I'm looking for a new card, but if the 8800 series drops in price, I'm picking up one of those. The 9800 series just isn't doing it for me. HOPEFULLY, there will be something there that makes me want one. I really would like to want one. Hell, if it was given to me, I wouldn't complain, but I don't know how much of a price premium I would pay to have one.

You're assuming that Nvidia even has anything much faster. Maybe they don't have anything, and that's why the delay.
 
They can't even keep the shelves stocked with 8800's and they're coming out with a new GPU in two months? Why bother? 2x the performance for 3x the price.
 
Once the GX2 is released, I'm sure GTs and GTSs will populate shelves. :p
 
well in my previous post i mentioned, ati got their stuff right.. <cut>

You're forgetting that dual-GPU configurations are a small minority of even the enthusiast market.

And even for the people that do have them, it doesn't make much sense to buy a second out-of-date card down the line to get the same performance, and possibly fewer features, as a single current-generation card. It doesn't even make sense from the vendor's standpoint, because they are trying to sell current cards, not keep an inventory of older cards in the hope that the fraction of users running dual GPUs want to upgrade an older-generation configuration.

Most people that go multi-GPU buy two of the fastest, or second-fastest, cards right when they hit the market, then wait for the high-end refresh and do the same again.
 
I am surprised that there is no mention of adding DirectX 10.1 support until at least midyear.

It looks like, for once (thanks to ATI saying FUTURE versions will CrossFire together), the 3870 I should be getting next week will keep me stable AND upgradable.

This from someone who keeps buying the last of a line of cards, starting with the 8MB 3dfx: the 12MB version was announced as ready to ship, over the radio, on my way HOME from the store
 
This from someone who keeps buying the last of cards, starting from the 8mb 3dfx when the 12 was announced ready to ship over the radio on my way HOME from the store

They were talking about 3Dfx cards on the radio?
 
I guess I'll skip this generation and stick with my ultras. Maybe I'll buy ATi's next offering to help boost competition.
 
Well, AMD has enough power in 2x 3870; they beat out the current 8800 Ultra. And I would not say CrossFire isn't smart; in fact, my friends on a low budget buy a CF board and one video card, then upgrade.

One friend bought his 2nd 3870 not long ago; he bought his first around release.

If you haven't seen the incredible performance CF brings, you guys should.

Well, for customers that don't know much about computers it's kind of hard; they've probably never heard of multi-GPU etc., which is the bad side of the whole multi-GPU-instead-of-one-powerful-card idea that AMD is running with at the moment.

http://www.legitreviews.com/article/605/1/
Performance in CF is lovely; game devs just need to support it, though.

Read about Hybrid CF at HardOCP, if you haven't.
 
If the 9800GTX is indeed faster than that 9800GX2 abomination.. <cut>

It won't be. The GX2 is replacing the Ultra and the GTX is replacing the GTX, so unless you see a GTX beating an Ultra somehow, no go. It could happen in a few "lousy SLI driver" scenarios, but those differences won't be large in the GTX's favor. And that's only if a single GPU on the GX2 has lower performance than the single "new" GTX chip.
 
Honestly, a 30% increase after a year and a half, almost two years? Come the fuck on...

Agreed. I bought an eVGA 8800 Ultra on November 13th hoping the full-blown "next gen" architecture would be out sometime before February 13th so I could do a Step-Up. But it doesn't even seem like this die-shrink 9800GX2 30% crap card will be out in time for that (given the February-March release window). What the hell is going on over there, Nvidia? I waited so patiently for my 8800 Ultra, and went to hell and back waiting a year and a half just to get terrible Crysis performance. And you expect that 30% will make a difference? That'll basically bump me up from 18 frames per second to a whopping 24 frames per second with tweaked ultra quality. Big whoop. And now I'm forced to play the eBay game with my new card until summer, when you get your act together? Bullshit tenfold.
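For what it's worth, the frame-rate math in that rant checks out; a quick sketch (the 18 fps baseline is the figure quoted above, nothing here is benchmark data):

```python
base_fps = 18.0   # quoted Crysis frame rate on tweaked ultra quality
uplift = 0.30     # the rumored "at least 30%" over the 8800 Ultra

boosted = base_fps * (1.0 + uplift)
print(f"{base_fps:.0f} fps -> {boosted:.1f} fps")  # a 30% bump lands around 23.4 fps

# Reaching a round 24 fps from 18 actually needs a 33% increase,
# so "a whopping 24 frames per second" is, if anything, generous.
```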
 
I don't expect a 9800GT or GTS at all. Why? The 9800GX2 is just two 8800GTS (G92-400) chips.
Maybe they'll rename the 8800GTS to 9800GTS.

I really don't get their product strategy:
- Nvidia develops a new chip (G92)
- Nvidia replaces existing 8800 cards with newer ones (8800GTS G92, ...) instead of introducing them all as a new line (8900, 9XXX)
- Nvidia introduces a new series with newer ones (9800GX2)

At first I thought the 8800GT & GTS were for the performance market and the enthusiast cards would be sold as the 9800 GX2, GTX and so on.

But then Nvidia announced a 9600GT, which is slower than an 8800GT.

What's up with Nvidia? That's totally confusing.

Maybe Nvidia has problems with the G100? Or the G92 didn't turn out as expected (temperature and power consumption).
 
I don't expect a 9800GT or GTS at all.. <cut>

The G92 is not a new chip; it's just a refined G80, nothing else.
That's why I'll probably skip the whole 9800 crap, if it's all true that the 9800 is just a high-binned G92 card. I want the G100 or whatever the real deal is based on.
I really hope this will be the end of the G92 crap and that in mid-2008 they put the real deal out.
I blame ATI for this; they are the reason the good stuff is not out yet.
 
Some people say the upcoming 9800GTX will be based on the G100, but I think that's really unlikely.
 
Some people say the upcoming 9800GTX will be based on the G100. But I think thats really unlikely.

Some do; others say it's just a high-binned G92, so I'm not keeping my hopes up too much until we know. And I blame the lack of ATI competition for the whole mess, until a worthy replacement for the 8800GTX/Ultra arrives.
 
Personally, I'm pretty damn pleased that my 8800GTX has spent just one year in my rig and is still king of the hill. I mean, really, do we need a next-gen part? For what, Crysis? Every other game right now is pretty much no sweat for my 8800GTX. I wish for better drivers and for NV to get DX10 up to snuff. My .02...

Agreed with 115% of what you have to say. The new 98xx may be a refresh of the current 8800 series, but Nvidia will keep tweaking the drivers for the 8800 series as well.
 
Let's all just thank AMD/ATI, and let's wait almost 2 years before we see a decent upgrade from the 8800 series.
 
If the 9800GTX is supposed to replace the 8800GTX, then I doubt it will be G100, as that should beat a 9800GX2, which is still based on the G92. I get the feeling that if there are 9800GTSs and GTs, they will be the current ones rebranded, while the 9800GTX will be a revision of the G92...

Hopefully none of that is true, and hopefully R700 kicks some butt so Nvidia actually, you know, does something
 
If the 9800GTX is supposed to replace the 8800GTX.. <cut>
I really hope this follows the 7-series scheme: the 9800GX2 is a short-lived sucker that ends the whole G92-chip mess, and within two or three months during spring the real G100 arrives as a 9850 Ultra, or better yet a 9900 Ultra, and kicks some serious DAAMIT butt. Because if the 9800GTX is not what "everybody" is waiting for, yet another G92-based card is not it.
 
Do they need all these cards?

"I could use my lunch money to get a faster card than I was planning. Should I spend $10 more to get the 9900 Pro Ultra oc gtx even though it's only 2% faster than an 8800GT ? Is it worth the stretch for 1 fps more in Crysis?"
 
I run 29fps in the Crysis demo. So I'll probably need something twice as fast to really care about upgrading. That's all on High, but without AA. Everything else runs fine on High, so I guess I'll just play everything else. In the meantime I'll wait for a card that's twice as fast, get it, then get Crysis.
 
Many people seem to think that the reason Nvidia is not releasing any faster performance-level cards this year, as per the typical 6-month tick-tock development cycle we have all grown so used to, is a lack of competition from AMD. Maybe that is true. But there are other possibilities.

A couple of possibilities that I find interesting are:

1. The End of Moore's Law for GPUs.

It is possible that the reason both Nvidia and AMD/ATI are no longer able to stick to such aggressive 6-month tick-tock cycles is simply that they can't. It may be that the engineering problems are becoming so difficult that a tick now takes 12 months and a tock 18 months. This stuff ain't exactly easy, folks. Scientific advances are not actually automatic, even if that is how they sometimes seem.

Of course, if this is the case, then I wonder why they cannot just admit it and ask us all to be patient, instead of releasing slightly tweaked cards with new generation numbers. I find this possibility disturbing, but also a bit unlikely, since they are already down to 65nm with the 8800 GTS 512, which has been intentionally crippled with its 256-bit memory bus. They don't really need a new architecture. All they really need is to die-shrink the 8800 Ultra to 65nm, clock it higher, and maybe even make it compatible with Tri-SLI. It would be real progress, it would satisfy most of the enthusiast market, and they could start selling it at $899 MSRP and drop it to $699 in a few months. Tick.

2. There is no enthusiast market aka money talks, bullshit walks.

Maybe both Nvidia and AMD/ATI have realized that their super-expensive high-end cards are only of interest to 0.0000000001% of the video card market, and that the so-called halo effect is of no real importance to the bottom line. If mainstream cards are 99.999% of the market, then why even bother with an enthusiast-level card at all? It is so much easier and so much less expensive to focus on what really matters to the average buyer: price. So perhaps both companies are spending their engineering dollars on figuring out ways to cut costs rather than on pushing graphics/rendering technology forward. We are all so worried about AMD. Maybe we shouldn't be. Maybe it is Nvidia that is playing catch-up with AMD, and not vice versa. This kind of dismissive attitude towards the uber-GPU card might also explain today's AMD. To many of us it seems like they are bumblers. Shouldn't we at least entertain the idea that both companies are just giving all of us 'enthusiast' gamers the one-finger salute?

There are two kinds of companies in this world. There is the company that wants to accomplish real things in the world, to change the world for its own sake. To move technology forward. To create new and amazing products. Any profits that are made from those products are merely secondary. Certainly very nice, but not what is driving the company. It is just a way to make a living while creating new tech.

And then there are the companies that we are all so familiar with, who *only* care about the bottom line. Only care about profits. Maybe when they started out there was a founder who actually had a vision of what he wanted the company to accomplish, but perhaps he is long gone and the company is only kept in motion to keep the paychecks and stock options coming in. Such companies only innovate when they are forced to by outside circumstances. They only do what is most profitable because they have no vision or passion. They are motivated by greed and nothing else.

So what kind of companies are Nvidia and AMD/ATI? I think we are finally seeing that now.
 
What is the definition of an enthusiast card? All I want is to play every single game on high at my monitor's native resolution. Am I mainstream or enthusiast?
 
What is the definition of an enthusiast card? All I want is to play every single game on high at my monitors native resolution. Am I mainstream or enthusiast?

You're a heavy enthusiast, unless you've got an 800x600 display :D OK, you could pass for mainstream with 1024x768 as well :p But it's hard to expect anyone to have a resolution that low, since I bet you're not gaming on a 15" LCD or a very, very, very old CRT.

EDIT: Of course, I'm joking, just to make that clear :) Because there's no mainstream card that can run EVERY game on high, as long as we have (unpatched?) Crysis out there..
 
I said I was giving up on nVidia to my friend, and he replied:


I have friends who used to write code for the graphics drivers for BFG graphics cards, and from what I hear, the technology is simply getting way too advanced for what programmers are able to do. So in order to incorporate all the new technology into their products, program writers have to learn all-new 3D base architecture, shader models, quantum effects, and so on. It makes no difference if Nvidia allows the use of triple Scalable Link Interface (SLI) if game developers, CAD users, and others are not able to use it at its maximum potential. I believe the strategy now is to slow down the hardware to meet the demands of the software. This is the same reason the 64-bit Windows operating system hasn't taken off: even though it is superior to 32-bit, the software (driver support, programs, and utilities) can't meet the requirements of a 64-bit OS.

I happen to believe this because he wouldn't lie about this stuff, and he was right about the short life expectancy of the 7900 series.

I do, however, think it is weird that nVidia would name their next series a 9-series if they don't bring anything new to the table. They could have easily made the 8800GT and 8800GTS (512MB) a 9-series, but they decided not to. I think there is more to this 9800GTX than people think, and judging by the 8800GT, which is a die shrink, I wouldn't be surprised at all if the 9800GTX is at least 40% faster than the Ultra. Maybe this site is completely wrong; you never know. And nVidia had nothing to say, which I love about them. But anyhoo, the end of the GPU draws closer and closer; it has almost reached its limits. Soon a smarter chip will be released, Intel's cGPU. Then nVidia will die, unless of course they already know how to build this technology. And if that happens it WILL suck for all consumers, because prices will be inflated beyond belief. Intel is already a master at this with their Extreme quad cores.
 
i said i was giving up on nvidia to my friend and he replied.. <cut>

So why couldn't Crysis make use of these technologies? I'm sure that IF THEY spent more time optimizing their code, it would work the way it's meant to on today's hardware.
 
I have friends... ...can't meet the requirements of a 64bit OS.
I agree with this paragraph. If you look at the raw performance of these new graphics cards, their output is amazing. Software development/evolution needs to parallel that of hardware, or you run into what we have now: games that are beautiful but take a ton of time to program, and the finished game, gameplay-wise, isn't even that great, due to the resources being sucked up by the engine/graphics department. What happens as games get more detailed and coding doesn't advance? I don't want to wait ten years for a next-generation game just because it has realistic graphics. It'd be nice to see hardware manufacturers give even more support to developers, to make it easier to code for their hardware. I know I've seen nVidia's developer kit(s), but is there more they can do? I'm not well versed in this area.

Anyway, I wouldn't mind nVidia taking a break and using the time to help software development along, but there's probably not much money in that compared to selling cards.
 
GPUs will die by 2010; I can smell it. They have slowed down dramatically and are not really improving much. They have shrunk and increased the clocks, which provides slightly better fps in games, but I don't think there is much more GPUs can do other than become bigger and stronger. Sooner or later, when Intel releases their cGPU, Larrabee or whatever it's called, nVidia will probably die or team up with AMD/ATI, because Intel will not need them anymore. Basically, Intel will become a powerhouse and prices will be inflated. This is just speculation, of course, based on what I have read from '06 to the present. GPUs are going to die because it is time for them to evolve into something smarter and cheaper (well, cheaper for the manufacturers). And think of it this way as well: consoles last 3-5 years, right? PCs last 6 months to 2 years for maximum gameplay, if that. I seriously think that only 60% of the 8800 Ultra is being used at the moment, and 40% of it is going to waste. (This probably sounds stupid, because my friend didn't write this, I did.) This is what I think will happen.
 
I dunno about the 3-5 years thing. If you get a console on day one it will last that long, but if you don't... And besides that, your PC can last 3 years; it will just display the same quality it did when you got it (what was once High would then be Low), which is no different from the consoles. Look at Crysis, even: the 6800 series is the minimum spec for it, and that came out roughly three years before Crysis. Hell, my friend is using a 6800 Ultra to run the game on Medium at 1024x768 at a decent fps. The only piece of hardware that doesn't really scale three years back for Crysis is the CPU, but that's really because of the introduction of dual cores, which was as much of a shift for PCs as the shift from the Xbox to the Xbox 360 was for the consoles. And of course, I'm talking about Crysis, which isn't your average game as far as system requirements go. And hell, my rig runs Crysis on High at ~25fps and much of it is three years old (less of it now than before, but my prior specs ran Crysis at ~23fps on High, and the only things in my computer that weren't three years old were my GPU and CPU). Of course, if things stay the way they are, PC gamers also save $10 (at least) on every game; that alone has already paid for a few of my upgrades.
 