NVIDIA Fermi White Paper

I have to admit I was really surprised when I was watching the stream. I thought it was going to be more of a "we got more FPS in a game than you, ATI, muahahaha!" followed by a clip of them saying F yo couch!

The technology they were showing was cool, but a little boring for my taste. I see that what they are trying to do is get into more markets, and possibly more devices, in the future, which really makes sense for Nvidia. AMD can make CPUs, chipsets, and GPUs. Intel can make CPUs, really crappy GPUs, and semi-decent motherboards. Nvidia can really only make GPUs, so why not make a technology that lets their GPU do more and gets them into other areas of technology? They probably realized they can't just sell GeForce cards nowadays and live off of that, unless of course ATI cards had no chance of competing with them.

In my mind what Nvidia did makes sense, and I hope their card does decently in games, which I believe it will. Then AMD's cards will come down in price and I can get my tri-display!!!

Also I have to sell my PS3, snowboard, and left nut. Totally worth it though.

*Also, I realize I was a little blunt about what each company can make. OK, really just kind of wrong, but you know what I mean :)
 
I have to admit I was really surprised when I was watching the stream. I thought it was going to be more of a "we got more FPS in a game than you, ATI, muahahaha!" followed by a clip of them saying F yo couch!

The technology they were showing was cool, but a little boring for my taste. I see that what they are trying to do is get into more markets, and possibly more devices, in the future, which really makes sense for Nvidia. AMD can make CPUs, chipsets, and GPUs. Intel can make CPUs, really crappy GPUs, and semi-decent motherboards. Nvidia can really only make GPUs, so why not make a technology that lets their GPU do more and gets them into other areas of technology? They probably realized they can't just sell GeForce cards nowadays and live off of that, unless of course ATI cards had no chance of competing with them.

In my mind what Nvidia did makes sense, and I hope their card does decently in games, which I believe it will. Then AMD's cards will come down in price and I can get my tri-display!!!

Also I have to sell my PS3, snowboard, and left nut. Totally worth it though.

*Also, I realize I was a little blunt about what each company can make. OK, really just kind of wrong, but you know what I mean :)

Pretty sure there will be another conference held down the road for the gaming segment.. this one was purely meant to focus on GPU "computing."
 
But it does emphasize their current focus now, which has changed, as per their white paper:

For sixteen years, NVIDIA has dedicated itself to building the world’s fastest graphics processors. While G80 was a pioneering architecture in GPU computing, and GT200 a major refinement, their designs were nevertheless deeply rooted in the world of graphics. The Fermi architecture represents a new direction for NVIDIA. Far from being merely the successor to GT200, Fermi is the outcome of a radical rethinking of the role, purpose, and capability of the GPU.
 
Pretty sure there will be another conference held down the road for the gaming segment.. this one was purely meant to focus on GPU "computing."

There is also a 15-page thread now going in For Sale/Trade for his left nut :p

It really does look like a big departure from current GPU architecture. I'm not sure if this means more FPS for my Crysis sessions though...
 
So Nvidia is improving CUDA while ATI released Eyefinity.

Sorry Nvidia, but Eyefinity made me jizz my pants while CUDA makes me go "meh".
 
The graphics processing unit (GPU), first invented by NVIDIA in 1999

what what? 3dfx who? matrox who? i love it when nvidia is talking out their ass. the only reason we have this white paper crap is because they're getting slapped by the 5800 cards and everybody knows it. i'll be impressed when it's on the shelf for a reasonable price.
 
what what? 3dfx who? matrox who? i love it when nvidia is talking out their ass. the only reason we have this white paper crap is because they're getting slapped by the 5800 cards and everybody knows it. i'll be impressed when it's on the shelf for a reasonable price.

well that's because Nvidia purchased 3dfx.
 
But it does emphasize their current focus now, which has changed, as per their white paper.

One thing that scares me is that so far they aren't really addressing what Fermi can do as a GPU in games. If all this computational power doesn't translate into game performance, your average gamer and enthusiast isn't going to care what the hell it can do with CUDA or anything else.
 
The claims of full-speed double precision mean that half of the multiplier transistors are going to be doing nothing to improve graphics and gaming. If they are remotely serious about IEEE-754 (rounding, exceptions, underflow, etc.), it's going to be a lot worse (for gaming).

Does anybody know if IEEE-754 accepts fused multiply-adds yet? Last I heard, you had to compute the full 106-bit product, round the product, add the numbers, then round the sum. Returning the "more accurate" fused result wasn't allowed. It's pretty common to do, but it broke the spec (last I knew much about it).
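(For what it's worth, the 2008 revision of IEEE 754 does standardize fusedMultiplyAdd as a correctly rounded, single-rounding operation.) Below is a minimal CUDA sketch, purely my own illustration and not anything from the white paper, of why that single rounding matters: computing a*b - round(a*b) with a separate multiply and add loses the product's rounding error entirely, while the fused version recovers it. The __dmul_rn / __dadd_rn / __fma_rn intrinsics are used so the compiler cannot contract the "separate" version into an FMA on its own; the inputs 1.0/3.0 and 3.0 are just convenient values whose exact product needs rounding.

```cuda
// fma_demo.cu -- hypothetical illustration, not from the Fermi white paper.
// Build: nvcc -arch=sm_60 fma_demo.cu  (any arch with double-precision support)
#include <cstdio>
#include <cuda_runtime.h>

__global__ void fma_demo(double a, double b, double *out)
{
    // p = round(a*b): the product rounded once to double precision.
    double p = __dmul_rn(a, b);

    // Separate multiply then add: a*b is rounded to p before the subtraction,
    // so the result is exactly 0 -- the product's rounding error is lost.
    out[0] = __dadd_rn(__dmul_rn(a, b), -p);

    // Fused multiply-add: a*b - p is evaluated with a single rounding at the
    // end, so the tiny residual of the product survives.
    out[1] = __fma_rn(a, b, -p);
}

int main()
{
    double h_out[2], *d_out;
    cudaMalloc(&d_out, sizeof(h_out));
    fma_demo<<<1, 1>>>(1.0 / 3.0, 3.0, d_out);
    cudaMemcpy(h_out, d_out, sizeof(h_out), cudaMemcpyDeviceToHost);
    printf("separate mul+add: %.17g\n", h_out[0]);  // prints 0
    printf("fused mul-add:    %.17g\n", h_out[1]);  // prints roughly -5.6e-17
    cudaFree(d_out);
    return 0;
}
```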
 
Could someone please explain this? The GPU seems to have both L1 and L2 cache like a CPU, and has 64-bit everything. It has a decent but oddly sized memory bus, like the 8800 series... CUDA means nothing to me, and PhysX is nice but proprietary and hurts the industry. All I wanted to know was whether the GT300 is fast: as fast as the 5800 series, or faster. I know Nvidia is having manufacturing problems. I guess I can only wait for their product, reviews, and a few weeks of price stabilization before I buy. When is the damn thing coming out, late this year? The white paper overloads me with information that makes little sense to me, and gives gamers nothing! :eek:
 
what what? 3dfx who? matrox who? i love it when nvidia is talking out their ass.

The "GPU" term was coined when they launched the Geforce in 1999. Was probably the biggest addition the Geforce offered, as it allowed forum goers of the future to no longer have to type out "graphics chipset" or "3d accelerator chipset" over and over :p
 
Being an owner of an 8800 GTX a few months after release and of a 4870 upon release, I have no favoritism for either company. Only better-performing products at the correct price points get my $. Maybe it's just me, but it seems like NVIDIA is saying "fuck the average consumer, we're focusing our efforts on big corporate business." Judging from the specs, this card should haul ass, but only if they optimized it for graphics instead of CUDA. Even if this card edges out the ATI 5870, by no means will it win over consumers if the gains aren't large enough, because you can sure as hell bet the price will be much higher. On that note, given the Q1 2010 availability time frame, ATI would already have time to counter with a possible 5890, a 5870X2 (Nov. '09 release), a price drop, or all of the above.

Seems to be a great time to be a consumer, as we can expect prices to fall in both camps :D
 
As long as the competition between ATI and NV continues and we keep getting better products, I'm happy and have no favorites. However, I can understand why NV would want to focus on other market segments. Consumer graphics has reached the point where it's a low-margin business based on volume, the returns are slowly diminishing as more competitors like Intel want a piece of the pie, and frankly R&D costs keep increasing every cycle. So what if we can run a game at 2560x1980 at 200 fps with 8xAA, etc.? Next year someone's gonna beat that, and the R&D cost to make the next-gen card will be a lot more for a lot less relative gain.
By branching into the scientific and business compute markets, NV is not only hedging their bets but making an effort to grow their business. As someone has already mentioned, AMD and Intel both have other businesses they can expand into, whereas NV for the most part is still a single-segment business.

So even if this chip isn't 200% faster in games, if it can make up for the loss of some fanboys by growing academic/corporate sales, then it will have been worth it. I predict that GPGPU and cGPU will be an enormously profitable business for NV. Demand for very high performance computing scales up as costs drop, and a lot of scientific and engineering problems are now at least feasible to solve with inexpensive hardware; that makes HPC an attractive investment for many organizations that were priced out of range in the past.
 
I'm sure it will be fine in the gaming department. While it's been apparent for a while now that NV is going more toward GPGPU computing, they aren't stupid by a long shot; I'm sure it will be fine for what their bread and butter is, which is gamers (for now at least, lol). Have a little faith and patience, for fuck's sake, guys. Geez.
 
Pop an SSD on it, teach it a few more languages, and give it a power supply. I want a GT300 COMPUTER!


(don't take this seriously, it's all for fun... also, it's sad I have to specify that lol)
 
The "GPU" term was coined when they launched the Geforce in 1999. Was probably the biggest addition the Geforce offered, as it allowed forum goers of the future to no longer have to type out "graphics chipset" or "3d accelerator chipset" over and over :p

like i said, i love it when they talk out their ass :p
 
How do you connect a monitor DVI cable to this "release"? Shock and awe...

Where's the photo of Jen-Hsun Huang eating his can of WhoppArse when you need it?
 
This isn't a launch, it's an announcement about an upcoming product.

But it's a weak show after AMD hard-launched a completely kick-ass new line of GPUs. If this announcement had come sometime before the 23rd, it wouldn't seem quite so pathetic.
 
Also interesting how "Fermi" sounds like "fermer," the French verb for "to close," as in CLOSED ARCHITECTURE. Muahahahaaa
 
So Nvidia is improving CUDA while ATI released Eyefinity.

Sorry Nvidia, but Eyefinity made me jizz my pants while CUDA makes me go "meh".

Well, Fermi looks awesome for folding... But will it play Crysis at 60 FPS? :D
 
Well, the way I see it, the computing power for various tasks is potentially VERY impressive. That said, we are talking about buying video cards so we can play games on them. NVIDIA has their work cut out for them. They need to address some things in order for their card to really compete against AMD's offerings. AMD seems to have been concentrating on improving the actual gaming experience. Eyefinity brings multi-monitor gaming to the table, and while I don't suspect everyone will want to take advantage of it, I think it is compelling, especially given today's monitor prices.

NVIDIA needs to accomplish two things with Fermi:

1.) NVIDIA needs to best AMD's 5800 series in game performance, or at least match it.
2.) NVIDIA needs to either match Eyefinity with a similar feature, or provide a more attractive feature that would make people choose their products despite sacrificing Eyefinity.

Where the rubber meets the road (i.e., the consumer who buys graphics cards for playing games), game performance and the gaming experience are all that really matter in a consumer-level graphics card. Hopefully NVIDIA hasn't lost sight of that.
 
One thing that scares me is that so far they aren't really addressing what Fermi can do as a GPU in games. If all this computational power doesn't translate into game performance, your average gamer and enthusiast isn't going to care what the hell it can do with CUDA or anything else.

There is some hope here: a really powerful CPU can do some decent gaming using software rendering, and it's to be hoped that Fermi can do some pretty neat effects in software as well as hardware. This could in theory make for some very flexible coding options, especially with all the enhancements they have done. If this works as advertised, you could do a LOT with software-based rendering combined with hardware-based rendering. http://www.hardocp.com/image.html?i...kV0Z1UW00bWNtVm1QVmxZU2pCaFYwNXpXbEU5UFE9PQ==
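To make the "rendering in software on the GPU" idea concrete, here's a minimal CUDA sketch, entirely my own illustration (the file name, scene, and shading are made up), in which every pixel is produced by general compute code rather than the fixed graphics pipeline: one thread shades one pixel of a fake diffuse-lit sphere and the image is dumped to a PGM file.

```cuda
// soft_render.cu -- hypothetical "software rendering on the GPU" sketch.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void shade(unsigned char *img, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Map the pixel to [-1,1] x [-1,1], test it against a unit circle, and
    // shade "inside" pixels as if they were a front-lit sphere.
    float u = 2.0f * x / w - 1.0f;
    float v = 2.0f * y / h - 1.0f;
    float r2 = u * u + v * v;
    float lum = r2 < 1.0f ? sqrtf(1.0f - r2) : 0.1f;  // fake diffuse term
    img[y * w + x] = (unsigned char)(255.0f * lum);
}

int main()
{
    const int w = 512, h = 512;
    unsigned char *d_img, *h_img = new unsigned char[w * h];
    cudaMalloc(&d_img, w * h);
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    shade<<<grid, block>>>(d_img, w, h);
    cudaMemcpy(h_img, d_img, w * h, cudaMemcpyDeviceToHost);

    // Dump a grayscale PGM so the result can be eyeballed.
    FILE *f = fopen("sphere.pgm", "wb");
    fprintf(f, "P5\n%d %d\n255\n", w, h);
    fwrite(h_img, 1, w * h, f);
    fclose(f);
    cudaFree(d_img);
    delete[] h_img;
    return 0;
}
```

Obviously a real game would do vastly more per pixel, but the pattern is the point: the "renderer" here is just general-purpose code running on the GPU.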

Don't get me wrong, I am going out on a limb and speculating that ATI is going to win this round; GPU computing just isn't that important yet. It's just that the more I look at it, the more viable it looks to me. I just hope they have a souped-up GT200 that they can do a die shrink on to make a real gaming product.
 
You guys laughing at Nvidia for saying they created the first GPU... they did. The original GeForce DDR was the first card (consumer-level, anyway) that could do transform & lighting in hardware.
 
It's built into the GPU architecture... *hint-hint*
Anandtech suggested it would be a Tesla-specific feature, and nVidia are not exactly unknown for removing functionality from GeForce drivers that is present in the Quadro equivalents.
 
Hmmm... I think the gaming side of 'Fermi' is probably going to kick a lot of ass, though price will be the big question (judging by all those transistors and FMAs and junk).

I like the direction they are going: Nvidia is not aiming at the GPU market, it's aiming at Intel's market. If the GPU is going to survive, it needs to beat the pants off CPUs in terms of 3D stuff. Intel threw down the gauntlet with its 6-core doing real-time ray tracing in a game demo (though they have been telling everyone about it for a while). I bet when we hit 12-core CPUs, there will be a huge struggle between GPU and CPU for gaming dominance.

Annnd the Eyefinity stuff is great and all, but for those of us who already own large displays (like me), I say meh... Unless AMD will buy me two more 42" displays if I buy a 5870, I do NOT care about multiple displays...

I'll wait and see, and get whatever is better.
 
You guys laughing at Nvidia for saying they created the first GPU... they did. The original GeForce DDR was the first card (consumer-level, anyway) that could do transform & lighting in hardware.
Yes, they did coin the marketing term GPU when they launched the GeForce brand.

However, if you break down the acronym "GPU," you get
Graphics Processing Unit.

Given how ambiguous each of those words is, most people believe that a GPU could literally be any sort of unit that processes graphics, meaning any generic graphics adapter. I believe the definition should include any processor that offloads graphics processing from the CPU.

Nvidia can claim GPU all they want, but it's all about how the industry defines it. It's just like Psion claiming the word "netbook" when the word had already entered the English lexicon under a different definition. Just because GPU means something specific to Nvidia doesn't mean it will change how people use it.
 
NVIDIA needs to accomplish two things with Fermi:

1.) NVIDIA needs to best AMD's 5800 series in game performance, or at least match it.
2.) NVIDIA needs to either match Eyefinity with a similar feature, or provide a more attractive feature that would make people choose their products despite sacrificing Eyefinity.

Agree with #1. Given the angst over Fermi, the fallout if it falls short on gaming performance would be epic. Eyefinity is meh as a competitive feature to worry about. Until the HUDs, menus, and OSDs in games are multi-screen compatible, it has very little practical use for gaming.
 
I think we are at a juncture, and Nvidia is moving in the correct direction, but possibly a little early as regards the games market.

But of course Nvidia is not just fighting a graphics battle with ATI, but also a battle against Intel in the GPU-vs-CPU war, and against Larrabee.

I think it is the war with Intel that has spurred this architecture, as Nvidia tries to steal ground from Intel in the HPC market and beat Larrabee before it's even launched... they have certainly landed a knockout first punch.

While Fermi may also beat ATI in terms of outright graphics performance, it will never match it with this architecture in the performance-per-dollar sector, at least in the near future. But as game developers start to use the computational power of the GPU for physics, AI, ray tracing, etc., all of which is going to become more commonplace with DX11, Fermi may well gain ground in this area too, as it already makes the ATI 5000 series look like last year's news in computational design. Like I said, we really are at a juncture.
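To put one concrete example behind the physics point (this is entirely my own toy sketch, not anything NVIDIA or a game engine ships), the kind of work that maps well onto GPU compute is thousands of identical, independent updates per frame, e.g. one thread per particle doing an explicit Euler step under gravity:

```cuda
// particles.cu -- hypothetical example of offloading simple game physics
// (explicit Euler integration under gravity) to the GPU, one thread per particle.
#include <cstdio>
#include <cuda_runtime.h>

struct Particle { float3 pos, vel; };

__global__ void step(Particle *p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Apply gravity, advance the position, then bounce off the ground plane.
    p[i].vel.y -= 9.81f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
    if (p[i].pos.y < 0.0f) { p[i].pos.y = 0.0f; p[i].vel.y *= -0.5f; }
}

int main()
{
    const int n = 1 << 20;                      // a million particles
    Particle *d_p;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));   // everything starts at rest at the origin
    for (int frame = 0; frame < 600; ++frame)   // ~10 seconds at 60 steps per second
        step<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();
    printf("done: %d particles, 600 steps\n", n);
    cudaFree(d_p);
    return 0;
}
```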
 