GF100 (Fermi) explained

I'm not an expert, so in short, how will all of that affect gaming performance? Any way to explain it briefly? Are they coming up with something very powerful on both the gaming and the CPU side? Thanks!
 
 
I'm not an expert, so in short, how will all of that affect gaming performance? Any way to explain it briefly? Are they coming up with something very powerful on both the gaming and the CPU side? Thanks!

Real-time ray tracing. Whether this card is powerful enough or not remains to be seen, but down the road this style of architecture will allow for it.
 
Yes, it's smaller. Didn't they even say officially that the card is huge? I doubt that's GF100.
 
or... then some, and then a GPU

That's what worries me a bit. All the tech they are talking is awesome for our researchers here at work. I'm sure the Tesla versions of these cards will be purchased here in good numbers.

However I don't care about that. What I want to know is how it does for games. I am worried that a lot of that power will not be useful to making fast, shiny pixels in games.

I'm rather the nVidia fanboy, but I'm debating pulling the trigger on a 5870. I don't need one, but I want a shiny new toy and I'm not so sure nVidia's shiny new toy will be worth waiting for.
 
Then just get a HD5870. You can always sell it off if nVidia delivers on gaming performance once the cards are released.
 
The fanboys were starting to sweat a little. At least now they have some ammunition, and now people are going to pretend they understand half of what has been released and how it will affect the real world, and a whole lot of people will parrot these musings as gospel.

Let the flame wars commence. :rolleyes:
 
The Tesla C1060 got 3.52 FPS on their double-precision simulation.
Fermi got 18 FPS on the same thing.

It was mentioned that it will have 4.8GHz GDDR5 memory onboard - 230.4GB/s.
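The quoted bandwidth figure works out if you assume a 384-bit memory bus (my assumption; the post only gives the data rate and the total): effective data rate times bus width divided by 8. A quick sanity check of both numbers:

```python
# Sanity-checking the figures quoted above.
# Assumption (not stated in the post): a 384-bit memory bus.
effective_rate_mbps = 4800     # 4.8 GHz effective GDDR5 data rate, per pin
bus_width_bits = 384           # assumed bus width
bandwidth_gb_s = effective_rate_mbps * bus_width_bits / 8 / 1000
print(bandwidth_gb_s)          # -> 230.4, matching the quoted GB/s figure

# Speedup of Fermi over the Tesla C1060 on the double-precision demo
print(round(18 / 3.52, 1))     # -> 5.1
```

So the quoted numbers imply roughly a 5x double-precision speedup over the previous Tesla generation.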
 
The fanboys were starting to sweat a little. At least now they have some ammunition, and now people are going to pretend they understand half of what has been released and how it will affect the real world, and a whole lot of people will parrot these musings as gospel.

Let the flame wars commence. :rolleyes:

All future video-section threads, in a nutshell.
 
Then just get a HD5870. You can always sell it off if nVidia delivers on gaming performance once the cards are released.

Ya I'm considering it. I'm going to wait a bit longer for the drivers to shake out a bit at least, but I'm thinking of getting one. I'm also mulling over waiting for the 5870x2 though that is well and truly overkill, not to mention more heat output than I want.

For the moment I'm not doing anything, but I'm leaning towards getting a 5870 here. I mean, I'm sure the new nVidia card will perform just fine for games. That's not the issue; my 280 performs just fine. The question is performance per watt, more or less, and the 5870 is killer for that. I'm not so sure the GF100 will be as good. It may waste a bunch of power on things that I don't use.
 
Yawn. I had higher hopes they'd have something more compelling than this useless niche geek-toy gimmick of GPGPU at the high end. What is worse is that they didn't even try to make a DX11 chip for mainstream and OEMs. This is admitting they knew they were going to get dumped by OEMs after bumpgate and saw no incentive in developing a DirectX 11 line for that market. And guess what? You will have to pay for the cost of developing this thing. Suggested retail price will be $549 - $599.

The problem is Nvidia has no other choice but to go this route. They do not have x86 tech at all; AMD and Intel do, and they would've simply choked them to death when the time came to integrate CPUs and GPUs together if Nvidia simply kept accelerating graphics. They are carving out an exit strategy from the pure graphics-accelerator market for when it is supplanted by CPU/GPU hybrids.
 
The fanboys were starting to sweat a little. At least now they have some ammunition, and now people are going to pretend they understand half of what has been released and how it will affect the real world, and a whole lot of people will parrot these musings as gospel.

Let the flame wars commence. :rolleyes:

You mean the ATI fanboys are starting to sweat a little? The 5870 isn't even the fastest card out, and this card looks like it's going to eat it for breakfast. Maybe not, but with those specs... wow...

Then again, I might just have one of each card... Eyefinity is really, really amazing.
 
You mean the ATI fanboys are starting to sweat a little? The 5870 isn't even the fastest card out, and this card looks like it's going to eat it for breakfast. Maybe not, but with those specs... wow...

Then again, I might just have one of each card... Eyefinity is really, really amazing.

Lol here they come out of hiding. Preach it man
 
Yawn. I had higher hopes they'd have something more compelling than this useless niche geek-toy gimmick of GPGPU at the high end. What is worse is that they didn't even try to make a DX11 chip for mainstream and OEMs. This is admitting they knew they were going to get dumped by OEMs after bumpgate and saw no incentive in developing a DirectX 11 line for that market. And guess what? You will have to pay for the cost of developing this thing. Suggested retail price will be $549 - $599.

The problem is Nvidia has no other choice but to go this route. They do not have x86 tech at all; AMD and Intel do, and they would've simply choked them to death when the time came to integrate CPUs and GPUs together if Nvidia simply kept accelerating graphics. They are carving out an exit strategy from the pure graphics-accelerator market for when it is supplanted by CPU/GPU hybrids.

First off, when I think of the word troll, a post like this comes to mind. Second, I have a feeling that if Nvidia can sufficiently broaden the market for these, as they're trying to do with CUDA (appealing to engineering and medical applications), that MSRP doomsday scenario you're predicting will be mitigated. Those dedicated parts with 6GB are going to be the expensive parts. Nvidia knows that if they bring a $600 part for general consumption in this kind of economy, the results would be uncompelling. For real, lay off the caffeine and set aside the tinfoil hat. This is just another product release, not an M. Night Shyamalan film.
 
First off, yawn again. So someone is a troll for pointing out scenarios based on historical facts? If you somehow feel offended by the truth, I have a set of blinders and a rickshaw you can pull to make you feel better.
Second, you are agreeing with me. They are developing an exit strategy from the pure graphics-accelerator market into the niche that is GPGPU today, in the hopes that this is where the industry will head. In the end this is meaningless for consumers, and especially for OEMs, who are the bread and butter of companies like Nvidia.

You have a "feeling" that "if" Nvidia can "sufficiently" broaden the market.

The words "feeling", "if", and "sufficiently" are nothing more than conditional statements and wishful thinking. Now THAT I know I've heard a lot of in M. Night Shyamalan films.
 
You mean the ATI fanboys are starting to sweat a little? The 5870 isn't even the fastest card out, and this card looks like it's going to eat it for breakfast. Maybe not, but with those specs... wow...

Then again, I might just have one of each card... Eyefinity is really, really amazing.

We can hope, but what Brent said is what is making us worry: a jack of all trades is rarely a master of any one. Even if GPU computing comes into its own, it isn't this year (2010). Now, if they are smart, they will use all that horsepower to at least provide more features than the 5800 series and leave the FPS race to ATI. I know a playable version of Nvidia AO would be a major selling point for me.
 
First off, yawn again. So someone is a troll for pointing out scenarios based on historical facts? If you somehow feel offended by the truth, I have a set of blinders and a rickshaw you can pull to make you feel better.
Second, you are agreeing with me. They are developing an exit strategy from the pure graphics-accelerator market into the niche that is GPGPU today, in the hopes that this is where the industry will head. In the end this is meaningless for consumers, and especially for OEMs, who are the bread and butter of companies like Nvidia.

You have a "feeling" that "if" Nvidia can "sufficiently" broaden the market.

The words "feeling", "if", and "sufficiently" are nothing more than conditional statements and wishful thinking. Now THAT I know I've heard a lot of in M. Night Shyamalan films.

Historical? When was the last time something that drastic happened in the graphics business? And also, think motive. Why would Nvidia want to exit when they're doing fine/excelling? It's far more likely, as others have already pointed out, that this is a competitive move to head off Intel's Larrabee. Expanding your product focus does NOT necessarily mean you are abandoning a core market, contrary to whatever your inner fanboy tells you.
 
Matrox has done just fine.


Nvidia is right. The Xbox 360 and PS3 are the biggest influence. Save for Crysis, an 8800GTX-level card is just fine for the vast majority of PC users, and it also renders DirectX 11 moot.
 
Historical? When was the last time something that drastic happened in the graphics business?

I was talking about bumpgate and them not developing mainstream and OEM dx11 parts after that debacle knowing they'd be kicked out by OEMs for it. But you are right, nothing this drastic has happened.

And also, think motive.

I would like to borrow your wonderful psychic powers that allow you to know a company's motives.

Why would nvidia want to exit when they're doing fine/excelling? It's far more likely, as other have already pointed out, that this is a competitive move to head off Intel's Larrabee.
They are not heading off anything, and what they want is irrelevant. It is a reactionary move based on what the future market will dictate. Intel is entering the market with a hybrid GPGPU design, leveraging their prowess with x86 tech. AMD is doing the same with Fusion, though what form that will take is still up for debate. They are in fact admitting that this will be the market.

Expanding your product focus does NOT necessarily mean you are abandoning a core market, contrary to whatever your inner fanboy tells you.
Expanding products does not mean having only software support for tessellation, ignoring what DirectX 11 calls for. Go read the white paper, please. Let me guess, it all went over your head, right? Bye now.
 
When the card is launched, we will see if they heard opportunity knocking and produced a product there is a market for. :)
 
I was talking about bumpgate and them not developing mainstream and OEM dx11 parts after that debacle knowing they'd be kicked out by OEMs for it. But you are right, nothing this drastic has happened.



I would like to borrow your wonderful psychic powers that allow you to know a company's motives.


They are not heading off anything, and what they want is irrelevant. It is a reactionary move based on what the future market will dictate. Intel is entering the market with a hybrid GPGPU design, leveraging their prowess with x86 tech. AMD is doing the same with Fusion, though what form that will take is still up for debate. They are in fact admitting that this will be the market.


Expanding products does not mean having only software support for tessellation, ignoring what DirectX 11 calls for. Go read the white paper, please. Let me guess, it all went over your head, right? Bye now.

You don't live in the real world. DirectX 11 is 100% meaningless, less than meaningless. This is not 1998 or 2002; it's 2009. All game development is driven by the Xbox 360 and PS3, period. No one is going to build a DirectX 11 game from the ground up. It will fail just as DirectX 10 failed. Sure, you might have games that tack on some DirectX 11 features after the real version is done, the Xbox 360 version, à la DiRT 2.

This is a very, very smart move by Nvidia.
It's amazing how much damage Microsoft has done to the PC gaming space with that money sieve, the Xbox 360.
 
I would like to borrow your wonderful psychic powers that allow you to know a company's motives.

It's really nothing special, just an educated guess. I'm sure they don't want to exit a market that has proven to yield profit. Pretty simple cornerstone of business, right?

They are not heading off anything and what they want is irrelevant. It is a reactionary move based on what the future market will dictate. Intel is entering the market with a hybrid gpgpu design leveraging their prowess with x86 tech. AMD is doing the same with Fusion tho what form will that take is still up for debate. They are in fact admitting that this will be the market.

Reactionary measures are taken in response to a stimulus. Actions taken with regard to future developments are predictive. They are trying to go in a direction they think will be relevant. Let us please not forget these are professionals. They are doing this based on what they know and what they have foreseen will be beneficial to their business.

Expanding products does not mean having only software support for tesselation ignoring what directx11 calls for.

Not expanding products, expanding their scope of application -- meaning that Nvidia is working in closer conjunction with other industries and attempting to suit their needs. DX11 and tessellation have nothing to do with it. These cards will NOT be JUST FOR GAMING. You implied that by not specifically engineering this chip for gaming and only gaming, Nvidia is moving away from the GPU market, hoping the market will follow toward GPGPUs -- the obvious corollary being that it is a premature and possibly foolish move.

You then proceeded to list examples of industry rivals (and potential rivals, i.e. Intel) following suit, thereby confirming the efficacy of Nvidia's gambit. What exactly are you trying to assert? Do you even have a point? I'm honestly making an effort to consider your argument, but I'm simply not sure what it is.

Go read the white paper please. Let me guess it all went over your head, right? Bye now.

This made me chuckle when I read it on my phone at Sheetz. If I promise to act offended will you feel less like you wasted time typing that immature jab? Seriously though, that kind of nonsense isn't conducive to discussion and doesn't really paint you in a flattering light.
 
Why does one thing exclude the other? Seeing the size of Fermi, who's to say it can't do both? Be a good GPU for games as well as bringing some extra GPGPU capabilities? We've only seen the beginning of what it can do. This time the focus was on GPGPU. Next time it might be on the gaming capabilities. People are too quick to jump to conclusions, IMO.

It's a shame that it's not out before Q1 2010. By then we might see something more concrete about AMD's Bulldozer and Intel's Larrabee. I have a feeling that some might play the waiting game a bit longer than expected (wait for what Fermi offers; wait for what Larrabee offers before buying Fermi; wait for what Bulldozer offers before buying Fermi/Larrabee).
 
You don't live in the real world. DirectX 11 is 100% meaningless, less than meaningless. This is not 1998 or 2002; it's 2009. All game development is driven by the Xbox 360 and PS3, period. No one is going to build a DirectX 11 game from the ground up. It will fail just as DirectX 10 failed. Sure, you might have games that tack on some DirectX 11 features after the real version is done, the Xbox 360 version, à la DiRT 2.

This is a very, very smart move by Nvidia.
It's amazing how much damage Microsoft has done to the PC gaming space with that money sieve, the Xbox 360.

I am going to disagree with you there. DX11 does actually bring something to the table that justifies it (let me agree with you on 10; 10.1 could have if everyone went with it). I agree that a lot of companies are going to focus on the larger game markets, but trends come and go, and console platforms do not stay stagnant. Tessellation alone (I think some already use it?) makes it a candidate for the next Xbox. Also, DirectCompute, OpenCL, and the still-viable PC game market are going to be pushing this. Probably not this coming year, but with Larrabee and other factors I think it is coming. OpenCL is awesome, but if it ends up like OpenGL, MS will be setting the standard, I think.

Also, I think right now we are looking at a case where PC gaming is going to distinguish itself from console gaming. Hell, look at Eyefinity; there is no console that can do that. And Nvidia's up-and-coming card may well be able to do all kinds of special effects (I am hoping for a working AO driver override). To say that PC gaming is dead is short-sighted, I think.

JM2C
 
You don't live in the real world. DirectX 11 is 100% meaningless, less than meaningless. This is not 1998 or 2002; it's 2009. All game development is driven by the Xbox 360 and PS3, period. No one is going to build a DirectX 11 game from the ground up. It will fail just as DirectX 10 failed. Sure, you might have games that tack on some DirectX 11 features after the real version is done, the Xbox 360 version, à la DiRT 2.

This is a very, very smart move by Nvidia.
It's amazing how much damage Microsoft has done to the PC gaming space with that money sieve, the Xbox 360.

I agree with your last sentence. PC gaming is waaaayyyy better than the 360, IMO. I tried the 360 and the PS3... the wow factor just wasn't there, save for the initial unpackaging and setting up (just like any other new toy).
 
So much hype and no release date. NV30 repeat.

This was more of a "hey, look at these monster specs and this card, which may or may not have new hardware underneath the plastic." All done to steal some of ATI's thunder and put doubt in the minds of people who have been waiting a loooong time for new hardware and like what the 5870 has to offer. Shit, I get on ATI a lot, but the 5870 is a damn fine card. I really like the Eyefinity.

I guess we will just have to see what, if any, changes Nvidia's card brings to PC gaming.

No matter what camp you're in, you have to hope that new stuff like this will lead to better all around gaming.
 