Nvidia delays Fermi to March 2010, new chip speculation.

IMO Crysis, not the tuned-down Warhead, is still the best-looking game out there.
There are some other great-looking games, but nothing quite matches Crysis.
 
In the interest of staying on topic, I agree with what you're saying regarding eye candy and settings, but a plain-Jane pair of SLI'd GTX 260s can still handle most of what's out there, so unless there's some killer app comin' out in the next 3 months that we'll actually want to play, there's no need for Fermi today.

Well, the HW requirements for running SLI GTX260s are presumably greater than the requirements for running Fermi (assuming you ignore the time machine).

I know that this is a basically irrelevant point. 98% of us have motherboards with more than one PCIe x16 slot and a PSU that can power a pair of 260s. But let's just say you've got size and power constraints. Maybe Fermi will be able to do more with less, much like the 5xxx series.

But then again, if you had those sort of requirements, you probably already jumped ship.

Still, it would be nice to see some competition. Competition never hurts.
 
Well, they did cope with GT200, which was a similarly huge and expensive chip, right up until the 5800s were released, though that was probably rather painful for the bottom line.
 
Ouch, this is going to hurt Nvidia, especially with Alien vs Predator and now Battlefield: Bad Company 2 coming out with no DX11 GPU from Nvidia in sight.
 
This has to be bad news for their partners too. They're selling what little stock is still in the channel; I wonder if layoffs are being talked about?
 
I don't think this is the last time we'll see a delayed and/or inferior part brought to market by either camp. Fortunately for ATI, their small, scalable architecture has paid off. Nvidia seems to be sticking with large, complex designs that are suffering both design- and manufacturing-related delays. I appreciate Nvidia's ever-expanding reach with Tesla's general-purpose capabilities, but can they keep growing all these great features within a single architecture in a timely manner? In the end, are we waiting for, and paying for, hardware to support features that are not relevant to the gamer?
 
The second crisis, known as Crysis Warhead, is a testament to the poor coding in the first crisis, a.k.a. Crysis. I believe every article had the developers quoted saying something like, "We promise that you'll be able to run Crysis Warhead on normal PCs." j/k... you know what I mean.

There are plenty of games that look better than Crysis; I like COD: MW2 better, but that's subjective, isn't it? True, true... no super armor suit to give my schlong "Maximum Power" or "Maximum Speed".... :D

Warhead ran slightly better because it rendered with diminished detail even when you selected the same settings.
 
I expect a hostile takeover of NV by Intel in the future. At minimum, major change is coming, as this market is (finally) a mature market. Add it all up: loss of chipset sales on both the AMD and Intel platforms, Ion being pointless with Pine Trail/Fusion... all that's left is what the Nvidia viral team pushes, 3D Vision and PhysX. Not many of us care about those proprietary techs, certainly not enough to stick with their cards based on four-year-old chips.
In 2010 (or even '09) it doesn't make much sense to buy 2006 technology. NV is a marketing company at its best; AMD is an engineering company at its best. The truth of the matter is finally clear to -everyone-, even those with green-colored glasses. I was pretty much an NV diehard after 3dfx's death because of the drivers (that advantage is long gone).

I would have to say that your proposition is very interesting and could possibly happen if things don't work out for Nvidia. You have to remember, though, that they've been around for a while, have a pretty big group of investors, and this is only one set of releases out of the many years they've been putting out new cards. We also have to wait for the whole lawsuit against Intel to pan out to see just how badly nVidia could get shafted. Worst case, I could see an Intel buyout of nVidia, but only because Intel would want to suck the GPU knowledge out of them and extract whatever they can to help their Larrabee R&D department. Also, eliminating a possible x86 contender wouldn't exactly be a bad thing for Intel either. Interesting stuff happening, for sure.
 
Has Nvidia been able to clock Fermi up to decent speeds with all 512 shaders enabled?

From what I've read over at XS / chiphell, Nvidia had a hard time clocking just 448 shaders past 1200 MHz.
 
NVDA is in trouble. :(

Nah, they're just going to end up making SoC products for portable phones and Apple/Google devices.

I think a lot of public opinion has turned against them with the arrogant claims and the cartoons. Intel and AMD have declared war on them, pushing them out of their respective markets. TSMC is not putting out a lot of 40nm capacity, and NV will have to fight for it with AMD, who has a top-to-bottom lineup of chips in the works.
The expense of getting TSMC to produce Fermi in quantity, plus the other costs associated with that chip, is going to be outrageous compared to AMD's, who already has boards in many of our hands.

For Nvidia, yes, the sky is falling, and I'd love to hear why it's not (other than that we haven't witnessed these nearly unavoidable truths yet). They have no secure market besides Tegra/SoC, which is a brilliant setup, but what a simply sad end result for the once-mighty Nvidia.

Fermi has to be fast, cheap, and available in quantity to restore that company's luster. The chances of that are slim to none. Lots of people are going to lose their shirts on that stock. AMD is going to drink their milkshake. Let's stop kidding ourselves that this is the same situation as the 8800 GTX launch; both AMD and Intel are pushing them out of their main markets with Fusion/Pine Trail and no chipset sales on either platform. They had a great partner in AMD, but they failed (out of arrogance?) to secure a real partnership that might have kept AMD from bothering to buy ATI.
 
I would have to say that your proposition is very interesting and could possibly happen if things don't work out for Nvidia. You have to remember, though, that they've been around for a while, have a pretty big group of investors, and this is only one set of releases out of the many years they've been putting out new cards. We also have to wait for the whole lawsuit against Intel to pan out to see just how badly nVidia could get shafted. Worst case, I could see an Intel buyout of nVidia, but only because Intel would want to suck the GPU knowledge out of them and extract whatever they can to help their Larrabee R&D department. Also, eliminating a possible x86 contender wouldn't exactly be a bad thing for Intel either. Interesting stuff happening, for sure.

NV is no x86 contender. The licenses are non-transferable, and Intel can make any license null and void at a moment's notice (in theory even AMD's).

Has Nvidia been able to clock Fermi up to decent speeds with all 512 shaders enabled?

From what I've read over at XS / chiphell, Nvidia had a hard time clocking just 448 shaders past 1200 MHz.

A lot was riding on A3 clocks. According to my contacts, the effort failed.
 
Referring to the article, there is mention of the GF104 chip. A while back, during the Nvidia conference where the fake Fermi was presented, I commented here at [H] that Nvidia needed to make a streamlined chip with raw power for gaming, rather than a jack-of-all-trades scientific simulation chip. I'm wondering if the GF104 will be that chip. They would surely win back some gaming GPU market share that way.
 
The GF104 seems to be what gamers are waiting for. The article did state Q2 for it, though.
AMD will have eaten into nVidia's market share big time by then.
 
My apologies if this is completely off-topic.

The NVIDIA reality is video, chipsets, SoCs, and TSMC, with its software features generally being recognised as the stronger part when comparisons are made in its market.

Once you are big enough, the fundamentals set in: generally there are few miracles and a lot of hard work. In this case, that means the manufacturing race and the ever-increasing billions of dollars of investment needed for each smaller process. If TSMC cannot generate profits quickly enough, it cannot keep pace with Intel. And if nVidia loses the chipset business, TSMC loses that business too, which adds pressure to the outlook for further investment.

I sincerely believe that, on a macro level, the simplest thing that could revive NVIDIA immediately is an x86 license or a license to Intel's DMI/QPI. That would immediately bring the chipset division, and its economies of scale, back to life, naturally at the expense of Intel/AMD, so I am not sure whether either will agree. It would not solve the manufacturing issue immediately, but it might generate confidence and investment across the entire supply chain.
 
Fermi will be a fine chip when it's released, I'm sure. With the exception of the FX series of video cards, Nvidia has had a pretty good record of video card production. This seems to be the first snag they've had in a long while, and they've always learned from their mistakes in the long run.

But personally I think you'll start seeing the industry shift from desktop computers to smartphone desktop setups. Plug your phone into its power adapter, and suddenly it's the device powering your monitor, mouse, keyboard, and speakers. The idea is fairly rudimentary now, but give it a few years, when smartphones have the power to run apps sufficiently well, and it could be a pretty decent idea.

Nvidia at least has the technological know-how to devise an SoC GPU/CPU to power that sort of hardware.

Edit - Maybe not desktops per se, but possibly replacing netbook-class devices.
 
Nvidia as a company will be fine, but I wouldn't be surprised if they bow out of the desktop GPU market in the next year or so. Fermi, even if it has the performance (which seems doubtful), will be overly expensive, with no clear-cut way to scale the architecture down for the midrange.

Given that a loss on Fermi is almost a given, and given the high R&D costs, I don't see how they could possibly justify new projects to investors. Their SoC market will live on, and I don't think Tesla is going anywhere (I think that WILL grow as Nvidia forecasts), but in 18 months I just don't see Nvidia offering a GPU below the $1000 mark.
 
You guys are pretty much right on.

Nvidia is pretty much out NOW. Ion 2 is dead before it is even released.
Fermi looks like a boatload of trouble, and if Fermi turns to shit, Tesla is screwed. That means nVidia has Tegra 2 left.

Great, as if there's no competition in -that- market... and nothing's stopping AMD/Intel from getting involved in the SoC space, on top of the already numerous competitors.

I guess they can start a comic book series chronicling their decline.
Hahaha... in all seriousness, they do have $1.6 billion in the bank, so they can weather a disastrous hit, and they will, but they might pull out of a lot more markets.
 
I expect a hostile takeover of NV by Intel in the future. At minimum, major change is coming, as this market is (finally) a mature market. Add it all up: loss of chipset sales on both the AMD and Intel platforms, Ion being pointless with Pine Trail/Fusion... all that's left is what the Nvidia viral team pushes, 3D Vision and PhysX. Not many of us care about those proprietary techs, certainly not enough to stick with their cards based on four-year-old chips.

OK, so before I say anything else, I need to say that I am not a gamer, so I basically know diddly-squat :( :eek: about high-end graphics cards and technologies. However, ....

I do know a bit about photography, and I have the strong impression that Adobe is relying on nVidia's CUDA architecture for their overall push into GPU-based performance enhancements in Photoshop and other Creative Suite applications.

Even if this is all true, I can't imagine why Adobe would ever want to buy into even a weakened nVidia. Adobe has problems of their own, and they are 100% software in their DNA. ;)

x509
 
This part is either going to kick ass or suck. I so want to pick up three 5870s, but I know that as soon as I do, nVidia will start leaking benchmarks that will make me cry.
 
I believe Nvidia's chipset division is there to provide base revenue and to finance entry into other markets. A chipset division left only to support Tegra is perhaps very different from a chipset division meant to support the entire x86 market.

You may not need an add-on video card, but you definitely need a chipset for every desktop computer. I can understand the Intel DMI/QPI situation, but I can't comprehend why Nvidia would voluntarily give up further chipset development in the AMD market unless nVidia has decided to change strategy. Remember, prior to this, Nvidia chipsets were also used in a lot of low-to-mid-end AMD servers.

The other possibility: could it be that Bumpgate has finally impacted the business in a real way? I still see a lot of notebooks shipped with nVidia parts, so that should not be the case.
 
I believe Nvidia's chipset division is there to provide base revenue and to finance entry into other markets. A chipset division left only to support Tegra is perhaps very different from a chipset division meant to support the entire x86 market.

You may not need an add-on video card, but you definitely need a chipset for every desktop computer. I can understand the Intel DMI/QPI situation, but I can't comprehend why Nvidia would voluntarily give up further chipset development in the AMD market unless nVidia has decided to change strategy. Remember, prior to this, Nvidia chipsets were also used in a lot of low-to-mid-end AMD servers.

The other possibility: could it be that Bumpgate has finally impacted the business in a real way? I still see a lot of notebooks shipped with nVidia parts, so that should not be the case.

You know that nVidia is no longer making chipsets, right? nVidia hasn't made a good chipset since the nForce4 (my ASUS A8N-SLI Premium-based PC is now my HTPC), so nobody really cares. Anyone who bought an nForce-based Intel system lived to regret their purchase - thanks, Bumpgate.

Tegra is a low-margin/high-volume part that nVidia is taking a loss on because the only shipping product using it, Microsoft's Zune, isn't selling. All of the upcoming Android phones are using the Qualcomm Snapdragon SoC, and Apple has the internal resources to make SoCs for the iPhone/iTablet, so they're frozen out of the two major media phone platforms.

Now we have the disaster that is Fermi. 'nuff said?
 
You would see "all parts, visible and hidden," of Nvidia in full glory (at least for a few days) :) if an affordable x86 license were given to them now.
 
It seems to me there is more to the story than what has been revealed by nVidia or its partners (as little as has been).

I'm a firm believer in 'the proof is in the pudding' and, as such, will wait until the actual product is revealed before voicing any comments, unless substantiated by fact.
 
It seems to me there is more to the story than what has been revealed by nVidia or its partners (as little as has been).

I'm a firm believer in 'the proof is in the pudding' and, as such, will wait until the actual product is revealed before voicing any comments, unless substantiated by fact.

The 'fact' is that we don't know any facts, which is highly unusual for nVidia.

Since you want to eat the pudding, you'll need this...

[image: a spoon]
 
It's disappointing because it's kinda hard to see the forest for the trees sometimes. I agree things look bad, but all we have here so far is speculation, rumor, and unsubstantiated formulas being thrown together. Nvidia has been around long enough to understand their situation a lot better than the forum armchair quarterbacks here. Am I wrong? Hardly.

I for one am willing to wait a few more months, since my tri-SLI GTX 260s are doing more than fine. Oh, and please don't get me started on the DX11 nonsense. Rofl, I've seen all the screenshot comparisons, don't make me laugh. BC2, AvP, and the few other wonderful DX11 titles haven't had enough dev time to make a difference yet; we know this from past years whenever a new DX version is released.

Anyway, the sky might be falling or it might not be, but whatever happens I'll wait till then before I run out and buy AMD's latest and greatest. Does this make me a fanboy? Some are definitely thinking that as they read this, guaranteed. I call it being a prudent, patient consumer. Simple.

Happy New Year, guys.
 
After the 8800 GTX was released, I waited for the R600. I ended up going SLI instead of CF, and my wait got me mature drivers and lower prices.

So it's not like it hasn't happened before.
 
I wonder: now that Intel has decided not to release a consumer version of Larrabee, and with nVidia's Fermi problems (while they sit on tons of IP), is the possibility of Intel buying out nVidia increasing? .... hmm.... :confused:

I can see AMD/ATi vs. Intel/nVidia in the future. wooOOOoooo!!!
 
I remember the rumors a while back that nVidia would buy AMD. Funny the stuff people come up with at times.
 
Hah! True, but Intel buying AMD would bring the wrath of the feds due to monopoly laws. Intel just found out how hard it is to make a GPU and nVidia may look a little tasty.
 
Hah! True, but Intel buying AMD would bring the wrath of the feds due to monopoly laws. Intel just found out how hard it is to make a GPU and nVidia may look a little tasty.

yeah... the possibilities are boundless - especially with nVidia's GPGPU push.
 
Not saying it couldn't happen, but I wouldn't like how that might turn out. :eek:
 
Not saying it couldn't happen, but I wouldn't like how that might turn out. :eek:

AMD/ATi would be forced into a weird market position - since Intel's fabs always seem to be ahead of the pack.

However... we don't know how good it is at making actual GPU chips.
 
Well, my guess is about as good as AMD's was before buying ATi? :D


LoL, you meant the fab, not Intel...
 
This part is either going to kick ass or suck. I so want to pick up three 5870s, but I know that as soon as I do, nVidia will start leaking benchmarks that will make me cry.

By the time you're done waiting for Fermi, ATI will have released refreshed 5800/5900 cards and, two months later, announced the 6800 series. I'm sure you know that in the tech world, waiting for anything means waiting for nothing. There will always be newer, faster tech right after you're done waiting for the slower part. Get the 5870s, then wait to see what the 6800 series has to offer.

Fermi is sandwiched between two ATI cycles that will defeat it in short order. Fermi needs to be delayed until Q3 2010 to compete with the 6800 series.
 