GT300 Release?

This is what will likely happen... I'd stake my reputation as a nameless, wild internet forum speculator on it.

  • nV will release minuscule quantities of GT300 (I'm talking 1,000 - 3,000 units... globally) at the end of Nov./early Dec. This will allow nV to 1) satiate fanboys and those holding off on buying a 5870, 2) get reviewers to blather the "nV is back in the game" statement, and 3) let them claim to shareholders that they released in the '09 holiday season.

If they release anything, it will be press review samples only. Shareholders do not give a damn about when an actual product is launched, by the way. They care about quarterly revenue figures, not future earnings projections for unreleased products.

  • MSRP will be commensurate with performance, meaning if it performs at 125% of a 5870 it will cost $450. However, extremely low supply will drive the price well over $500 at retail. We're looking at a supply curve similar to the one we saw with the 7800 GTX 512. Remember the mad grappling over that one?
If there is even a retail presence, it will only be enough to cover the hardcore fan base and those with money to throw away. Will it be enough to make others wait? Who knows. It may have the opposite effect and convince others to buy the competition instead.
Also, ATI is most likely expecting this. If you look carefully you will see a wide gap in pricing between the two high-end offerings from ATI and their lower-end chips. This gap is purposely there so they can drop the price of the 5870/5850 when whatever Nvidia releases is actually available in sufficient quantities to pose a challenge. Remember, ATI's chips are far cheaper to manufacture because they are smaller. I am waiting for this to happen so I can jump in on the price war. But with Nvidia failing so miserably to execute their designs throughout this year and a good part of last, I don't see it happening. In the end, consumers lose. (Rough arithmetic on the quoted price/performance claim is sketched below.)
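A minimal sketch of that price/performance arithmetic, assuming the ~$380 HD 5870 street price cited later in the thread and reading "+125%" as GT300 performing at 125% of a 5870 (both are assumptions, not official numbers):

```python
# Back-of-the-envelope: what MSRP would be "commensurate with performance"?
# Assumed inputs, not official figures.
hd5870_price = 380.0          # USD street price cited elsewhere in this thread
relative_performance = 1.25   # GT300 rumored at ~125% of an HD 5870 (speculation)

implied_msrp = hd5870_price * relative_performance
print(f"Price commensurate with performance: ~${implied_msrp:.0f}")  # ~$475

# A paper launch with tiny supply pushes retail above MSRP; the markup is a pure guess.
scarcity_markup = 1.15
print(f"Likely street price with scarcity:   ~${implied_msrp * scarcity_markup:.0f}")  # ~$546
```

Which lands in the same ballpark as the "$450 MSRP, well over $500 retail" claim above.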

  • Inventory will increase drastically in late January or February -- not because nV is flooding the channel, but more likely because many people will have already bought ATI by that point (people need something under the tree from Santa, right?)

What I'm afraid of is that, with ATI having no competition, prices will probably stay high. All of that is assuming Nvidia actually releases anything before the end of the year. That's still up in the air.

It's not looking good, but there are still a couple of ways nV could come off OK:

  • GT300 and its mainstream derivatives massively outperform the R800s. I think GT300 is already beaten on features when you consider Eyefinity -- unless it can do the dishes and wash my car.
It had better be faster, because it is coming out months later; otherwise it is an utter failure. ATI's offerings have gotten away with being slightly slower because their price/performance ratio has been unmatched by anything Nvidia has had at each price bracket.

GT300 derivatives will not be out until at least the second half of next year. What we will get are DirectX 10.1, 40nm versions of the 200 series to cover the mainstream, which WILL come out before the end of the year.

  • nV can price GT300 at a better price/performance point.

It is not possible. Simple logistics make it so. TSMC's 40nm wafers cost the same for both Nvidia and ATI, and Nvidia's bigger chip means fewer dies per wafer. But they can sell GT300 at a loss and see what investors think about that. I hope they do, so that ATI lowers prices even more and we as consumers win. I wouldn't hold my breath waiting for that, though.
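To put rough numbers on the "fewer dies per wafer" point, here is a minimal sketch using the standard dies-per-wafer approximation on a 300 mm wafer. The die areas and wafer cost are illustrative assumptions (Cypress was reported at roughly 330 mm²; GT300's area was unknown at the time), not figures from TSMC or either vendor:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard approximation: gross dies minus an edge-loss correction."""
    radius = wafer_diameter_mm / 2.0
    gross = math.pi * radius * radius / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(gross - edge_loss)

wafer_cost = 5000.0  # USD per 40nm wafer -- purely illustrative, not a real TSMC quote

for name, area in [("Cypress (~330 mm^2, reported)", 330.0),
                   ("GT300   (~500 mm^2, guess)   ", 500.0)]:
    dpw = dies_per_wafer(area)
    print(f"{name}: {dpw:3d} dies/wafer, ~${wafer_cost / dpw:.0f} per die before yield")
```

Yield makes the gap worse in practice, since defects knock out a larger fraction of big dies, but even the raw per-die cost difference supports the point.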
 
The GT300 wood screws will help with the cost savings. They need every bit of help this round, as it's obvious they will have to sell at a loss to avoid the hit in market share, being this far behind.
 
Despite the thread crapping, here is some of the speculation around Fermi's (GF100) graphics-related bits that are still unknown:

128 TMUs and 48 ROPs

A considerable increase over GT200.
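For context on "a considerable increase over GT200", a quick fill-rate comparison sketch. GT200 (GTX 280) shipped with 80 TMUs, 32 ROPs and a 602 MHz core clock; the GT300 unit counts are the rumored ones above, and its clock here is a pure guess:

```python
def fill_rates(tmus: int, rops: int, core_mhz: float):
    """Theoretical texel and pixel fill rates (Gtexels/s, Gpixels/s)."""
    return tmus * core_mhz / 1000.0, rops * core_mhz / 1000.0

gt200 = fill_rates(tmus=80, rops=32, core_mhz=602)    # GTX 280, known specs
gt300 = fill_rates(tmus=128, rops=48, core_mhz=650)   # rumored units, assumed clock

print(f"GT200: {gt200[0]:.1f} Gtex/s, {gt200[1]:.1f} Gpix/s")
print(f"GT300: {gt300[0]:.1f} Gtex/s, {gt300[1]:.1f} Gpix/s (speculative)")
```

On those assumptions it works out to roughly 1.6-1.7x GT200's theoretical fill rates, but until real clocks and unit counts are confirmed it is just that: speculation.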
 
Is anyone else tired of these useless supposed "leaks" of info. It is time for Nvidia to put up or shut up. They have been failing miserably on execution and nothing they say has any weight 'till we see hardware and numbers. It's as if fanboys were doing the PR damage control for them. Ridiculous.
 
Is anyone else tired of these useless supposed "leaks" of info. It is time for Nvidia to put up or shut up. They have been failing miserably on execution and nothing they say has any weight 'till we see hardware and numbers. It's as if fanboys were doing the PR damage control for them. Ridiculous.
It's cheaper that way. :p
 
Is anyone else tired of these useless supposed "leaks" of info. It is time for Nvidia to put up or shut up. They have been failing miserably on execution and nothing they say has any weight 'till we see hardware and numbers. It's as if fanboys were doing the PR damage control for them. Ridiculous.

I don't see the problem with waiting here. Just because ATI came out with their card doesn't mean Nvidia has to rush something onto the market just to satisfy a small segment of users. Companies like Valve or Blizzard come out with games when they feel they're ready. I'd rather have a product where I KNOW I'm getting my money's worth rather than some shoddy POS. They don't need to explain anything to you or any of us. As long as they keep their heads down on the work, I'm fine with that.
 
You know, Nvidia still has an ace up their sleeve. There's one move they could pull, right now, that would keep a lot of their current Nvidia card owners from jumping to ATi (and possibly sell a bunch of current-gen Nvidia cards).

Enable SLI Mosaic Mode on GeForce cards (8, 9, and 200 series) in a driver update.

This is a special SLI mode available on Quadro cards that allows 3D applications to be spanned across multiple monitors, using the outputs on two graphics cards.

Basically, if they were to enable this in the GeForce drivers, you could toss in a second video card and have Nvidia's version of Eyefinity, while also keeping the ability to run PhysX. Considering how cheap Nvidia cards are getting, a second card for SLI might not carry anywhere near the price premium of a 5870, either.
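As a rough illustration of what spanning buys you (and what it costs in rendering load), here is a small sketch with assumed monitor counts and resolutions; none of this comes from Nvidia's Mosaic documentation:

```python
# Illustrative only: size of the spanned render surface a game would see
# under Mosaic-style spanning, versus a single monitor.
monitors = 3                   # assumed 3x1 landscape layout
width, height = 1920, 1080     # assumed per-monitor resolution

single_pixels = width * height
spanned_pixels = monitors * single_pixels

print(f"Single monitor : {width}x{height} = {single_pixels / 1e6:.1f} MPix")
print(f"Spanned surface: {monitors * width}x{height} = {spanned_pixels / 1e6:.1f} MPix "
      f"({spanned_pixels / single_pixels:.0f}x the pixels to render)")
```

That 3x pixel load is also part of why pairing the feature with a second card for SLI makes sense.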
 
One problem is that Nvidia's cards ARE NOT getting cheaper. Go check Newegg and see the prices of 260s, 275s, 285s and 295s; they haven't budged. The cheapest GTX 285 is $319.
Also, if they gave you that Eyefinity-like functionality, they would effectively be giving you a reason NOT to upgrade to their latest product. Which is definitely not what they want.
 
One problem is that Nvidia's cards ARE NOT getting cheaper. Go check Newegg and see the prices of 260s, 275s, 285s and 295s; they haven't budged.
You sure about that? I purchased my GTX260 for $280, you can get it now for as little as $155:
http://www.zipzoomfly.com/jsp/ProductDetail.jsp?ProductCode=10010370-OP&prodlist=celebros

If SLI Mosaic Mode were enabled, that would make it a $155 upgrade to get native multi-monitor gaming instead of a $380 upgrade for an HD5870. Also, last I checked, GTX260 SLI managed to get fairly close to an HD5870 in performance.

Also, if they gave you that Eyefinity-like functionality, they would effectively be giving you a reason NOT to upgrade to their latest product. Which is definitely not what they want.
The idea is that it would keep users from jumping to ATi in the interim, since new Nvidia cards won't be ready for a while yet. In the short term, enabling the feature would help them.

There's an added bonus, in that it also brings in sales from those who will upgrade to SLI for the new feature.

Sure, it gives you less reason to want a next-gen Nvidia card, but it also gives you less reason to want an HD5870 ;)
 
Is anyone else tired of these useless supposed "leaks" of info. It is time for Nvidia to put up or shut up. They have been failing miserably on execution and nothing they say has any weight 'till we see hardware and numbers. It's as if fanboys were doing the PR damage control for them. Ridiculous.

It would be better if the red fanboys stopped thread crapping on a thread about GT300 release and rumors...

But you wouldn't know anything about that, would you?
 
You sure about that? I purchased my GTX260 for $280, you can get it now for as little as $155:
http://www.zipzoomfly.com/jsp/ProductDetail.jsp?ProductCode=10010370-OP&prodlist=celebros

Umm, no shit, it better be cheaper now. What I meant to say (I assumed you'd be smart enough to deduce it, but I'll spell it out for you) is that prices haven't dropped since the release of the HD5xxx series.

If SLI Mosaic Mode were enabled, that would make it a $155 upgrade to get native multi-monitor gaming instead of a $380 upgrade for an HD5870. Also, last I checked, GTX260 SLI managed to get fairly close to an HD5870 in performance.

No, it would mean that you have paid a total of $435 for a year-plus-old solution with slower performance than a 5870. It would also draw more power and cause more heat and noise. But it is your prerogative how you spend your money. You could sell that 260 to offset the cost of the upgrade -- that is, unless you are a die-hard fan and must have the same brand.

The idea is that it would keep users from jumping to ATi in the interim, since new Nvidia cards won't be ready for a while yet. In the short term, enabling the feature would help them.

There's an added bonus, in that it also brings in sales from those who will upgrade to SLI for the new feature.

Sure, it gives you less reason to want a next-gen Nvidia card, but it also gives you less reason to want an HD5870 ;)

You are thinking like a child (for lack of a better analogy). Businesses do not prosper and grow based on spite or on the concept of scorched earth -- in this case, destroying their own market just to spite a competitor. But this is Nvidia we're talking about, so I wouldn't put it past them to actually do it. They already screwed their own customers who jumped ship by disabling their cards when used in conjunction with the competition.
 
No, it would mean that you have paid a total of $435 for a year-plus-old solution with slower performance than a 5870. It would also draw more power and cause more heat and noise. But it is your prerogative how you spend your money. You could sell that 260 to offset the cost of the upgrade -- that is, unless you are a die-hard fan and must have the same brand.
$435 ($280 for old GTX260 + $155 for new GTX260) is still a lot less than $680 ($280 for old GTX260 + $380 for an HD5870).

I wouldn't sell the GTX260, I would throw it in my second slot so I wouldn't lose PhysX. I already have it, might as well use it.

You are thinking like a child (for lack of a better analogy). Businesses do not prosper and grow based on spite or on the concept of scorched earth -- in this case, destroying their own market just to spite a competitor. But this is Nvidia we're talking about, so I wouldn't put it past them to actually do it. They already screwed their own customers who jumped ship by disabling their cards when used in conjunction with the competition.
Think before you speak. You're being short-sighted, as you obviously hadn't thought the situation through before you posted the above inflammatory remark.

The move I'm suggesting would keep a group of current Nvidia owners from upgrading to competitors' hardware (stifling ATi a bit and removing Eyefinity as a reason you MUST HAVE their hardware), while also giving Nvidia card sales a boost. It would be fully positive for Nvidia in the short term.

What you've forgotten to weigh in is exactly how far out Nvidia's new cards really are. The farther out they get pushed, the more sales they lose to ATi. There's a tipping point where the losses due to people jumping to ATi hardware overtake the potential losses from enabling this feature. At that point, enabling SLI Mosaic mode on these older cards becomes a profitable long-term move ("profitable" in the sense that they would lose less money than if they sat on their hands and did nothing for the months leading up to the launch of their new cards).
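A toy break-even model of that tipping-point argument; every number below is invented purely for illustration, since nobody outside Nvidia has the real figures:

```python
# Toy break-even sketch for "enable Mosaic on old cards vs. do nothing".
# All inputs are made up; only the comparison logic matters.
months_until_gt300       = 5        # how far out the new cards are (unknown at the time)
defectors_per_month      = 20_000   # owners who would otherwise jump to an HD 5800
lost_margin_per_defector = 120.0    # USD margin lost per defector
cannibalized_sales       = 30_000   # future GT300 buyers who would settle for old-card SLI
lost_margin_per_cannibal = 150.0    # USD margin lost per cannibalized GT300 sale

loss_if_idle    = months_until_gt300 * defectors_per_month * lost_margin_per_defector
loss_if_enabled = cannibalized_sales * lost_margin_per_cannibal

print(f"Do nothing:     lose ~${loss_if_idle / 1e6:.1f}M to defections")
print(f"Enable feature: lose ~${loss_if_enabled / 1e6:.1f}M to cannibalization")
print("Enabling wins" if loss_if_enabled < loss_if_idle else "Doing nothing wins",
      "under these made-up numbers.")
```

The later the new cards slip, the bigger `months_until_gt300` gets and the more the balance tips toward enabling the feature, which is exactly the point above.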

After that outcome is projected, the decision becomes "do nothing and continue to lose sales" or "enable said feature and lose fewer sales".
 
Not taking sides, but Lorien has a good point. Your idea is only good if it's based on the premise that Nvidia 'cares' about the customers who have already bought their card. Once a company sells a product, they don't care about that customer anymore; they just become a sales statistic. A good analogy would be a chick you wanna bang: you wine and dine her and forget about her the next morning.
They're more interested in selling their retail cards, because that stock is already counted as a liability on their books.

"Profitable" is the wrong word to use, for the reasons above. "Incentive to stop the bleeding" would be more appropriate.
 
"do nothing and continue to lose sales"
This is already happening. If prices were going to drop, they would have dropped just before the release of the 5870/5850 to steal their thunder and keep consumers from switching to ATI, as you say. It didn't happen. We'll see what they do after the release of the HD5770 and 5750; they might still have a window of time to do it.

...or "enable said feature and lose fewer sales".

Not going to happen for cards of this generation. It is better to sell nothing than to lose money selling products that don't make a profit. All they lose is market share, which can be recouped with aggressive pricing on the new products. I hope they do this, but it is unlikely.
It is more likely this will be enabled for the next generation, giving people more incentive to upgrade and letting Nvidia catch up to ATI in the checkbox-feature battle.
 
You're still glossing over the length of time between now and when the new cards are out. If we don't see new cards till Q2 2010, Nvidia is going to really be hurting.

If the rumors are true and they're discontinuing the GTX260, 275, and 285, then they aren't losing money on their sales; they're in the hole until they sell them off. It's in Nvidia's and retailers' best interests to sell those cards off.

If you can increase their perceived value by enabling a feature and sell them off at a higher price, all the better. Now, once again, this is only if the new cards are so far out that they would suffer more if they were to do nothing.


As for prices not falling on existing Nvidia cards: either people are still buying them at a level that justifies the current prices, or retailers are getting seriously close to dropping prices in order to clear stock. The law of supply and demand has to kick in eventually.
 
Could you stop with your own "marketing" tactics to try and turn me into something I'm not? You are truly a pathetic person if you think that's the way to go... but you've insisted on it for quite a few posts... I certainly can't stop you from doing it, but that is mighty troll of you...

Lol! To have marketing tactics, I need something to market. Unlike you, I don't have to go to every thread and do "crowd control" for the sake of public opinion about Nvidia. I'm a consumer; you appear more and more like a paid marketer from Nvidia. And I do insist that you tone it down a bit, because it's destroying my reading pleasure.

You do in forums what you accuse Charlie of doing on semiaccurate.

So is GT300 coming out this year? Anything from Nvidia on a release date?

The closest thing I could find was what I posted earlier in this thread:

“The first Fermi GPUs are expected to launch by year’s end,” stressed Mr. Alibrandi.
http://www.xbitlabs.com/news/video/...irst_Graphics_Cards_on_Track_for_Q4_2009.html

It doesn't say if this is the Tesla Fermi or the consumer version. The consumer version is rumored to be a bit different than the Tesla version.

What worries me is that there is no official release date, not even an official release month with general availability for the new GPUs. I would prefer that they gave at least this much, since it would make it easier for consumers to decide whether to wait or not. A time perspective makes life easier.
 
No GF100 for Christmas. Oh well.
Looking Beyond Graphics
Analyst: Tom R. Halfhill

NVIDIA has already received the first sample silicon of a GPU based on the Fermi architecture. If the project proceeds on schedule, the first Fermi GPUs could hit the market this year.

Some of these enhancements are unimportant for 3D graphics but were requested by NVIDIA's existing and prospective GPU-computing customers. Indeed, some features (such as ECC) would actually reduce graphics performance. To avoid compromising the GPU's central role as a graphics coprocessor, Fermi has provisions for disabling or bypassing features that are irrelevant for graphics. For instance, GeForce-branded GPUs for consumer PCs will omit ECC; Tesla- and Quadro-branded GPUs for professional workstations and supercomputers will include it.

Although Fermi has additional new features exclusively for graphics, they are not the subject of this paper.

For consumers, GPU computing can speed up media-intensive programs that run sluggishly on even the latest multicore CPUs. To take just one example, digital video is wasting untold hours of time as PCs grind through a lengthy transcoding task. CUDA-enabled transcoders like Elemental Technologies' Badaboom can reduce the waiting period to minutes. And cleaning up the video frame-by-frame was unthinkable until MotionDSP's vReveal came along. The wide appeal of these programs will force consumers to take a closer look at the specs of a new PC. Until now, users who disdained games could be satisfied with a low-end graphics processor integrated with the system chipset. GPU computing opens up entirely new possibilities, potentially making a discrete GPU a must-have feature.
 
I have never bought a video card at release, so I have no idea -- how far in advance would Nvidia announce the actual release date?
 
I have never bought a video card at release, so I have no idea -- how far in advance would Nvidia announce the actual release date?
Generally, companies don't announce release dates because it would kill sales (think, why would people buy one product when a brand new one is right around the corner?). However, this information still gets leaked ahead of time, through one source or another. I'd say people usually have an idea at least two months in advance, sometimes longer - it depends on how many rumors are circulating and how well they fit together.
 
They need to release a $199 5850 competitor. If they don't, they will not take the majority of the gamer/enthusiast market back, because that is the price range most people are willing or able to pay for a card that can play the latest games well on a 1080p LCD.

If they fail to do that, they will lose a significant battle in this GPU generation, and potentially a lot of market share, because this is when many people will be upgrading to DX11 hardware. Win7 is out, and the first DX11 games will be released during this hardware generation, creating a big marketing push and an advertising bandwagon to get people to upgrade their graphics hardware -- one the graphics manufacturers will be more than happy to jump aboard.

If they launch GT300 but try to play cute games with high MSRPs, it WILL bite them in the butt, because this is not the type of economy in which you can go back to launching $450 GPUs and expect to sell more than a few dozen of them. Hint: that's not how a billion-dollar company makes its profits.

They also recognize that people who would have spent $299-349 on a GPU two years ago are only willing to look at cards up to around $199-249 today. It's just the reality of the economic conditions and people's perspective on how they should be spending their disposable income when jobs are not secure, many are unemployed, and people are learning once again that smart people save more than they spend.

Granted, many people just watch the news on TV and believe all the propaganda, but the reality is quite different, and they will be in for a shock when the media can't whitewash the economic news anymore because the reality is too dire. Here's a good example of independent, apolitical economic reporting: http://www.shadowstats.com/article/depression-special-report Wake up, save up, and invest wisely. Buying a $450 GPU is pretty wasteful regardless, but it's even more foolish in this economic environment.
 
Man, this is the same debate with different video cards....

I remember it going all the way back to the Voodoo / Riva TNT days

Sigh

True enthusiasts will spend the money regardless of economic conditions, and a $500 video card being considered "expensive" or a "waste" is relative. Is it wise? Well, wisdom is relative too....

I am not holding my breath
 