Fermi is going to change gaming forever

Don't forget: 2011 consoles mean some form of DX11 hardware in these consoles (even Sony's, because they will likely go with AMD or Nvidia, perhaps PowerVR), and all of those companies develop for PCs and either have DX11-capable hardware or will shortly.

The PS3 uses a 7800 GTX (basically) for its GPU - a DX9 part. And yet, it doesn't have DirectX at all. Sony's next console also won't have DirectX at all. DirectX is proprietary to Microsoft. It doesn't matter if the card supports it, the software doesn't. Sony's next console will continue to be OpenGL (and for what it's worth, all of the DX11 features are available in OpenGL as well).

You know, what people miss is that Fermi WILL bring something to the table.

That's the thing, Fermi *isn't* bringing anything new to the gaming table.

We don't know that right now, but really, people seem to think the 5870 is so revolutionary?

First with DX11, first with Eyefinity, and it's very fast with very low power use - yes, the 5870 is a damn good card.

Question I have to ask is how many people have the kind of money to lay down on the 5870 and its "revolutionary" co-products, aka three screens. Let's say I go buy three decent 24" monitors... that's what, $450ish x3, or $1350. Well, I can go to Dell and buy two computers for that cost, and not really crappy ones at that. So maybe 0.1% of users are going to run this technology. Let's presume that 2 million GPUs have Eyefinity; that means roughly 2,000 people are going to use multi-monitor.

The only thing that is going to change in the near future is that companies are going to push cheaper and crappier monitors. TN panels are bad enough as it is and I'd stick with 22-24" IPS screens. So what does the 5870 really bring to the table... 1600 shaders? The only thing I see AMD revolutionizing is ultra killer super duper parallel processing, which is precisely what Fermi is designed for, so this makes me question something: why does someone need that many shaders?

You know, I would buy a 5870... but it's totally not worth it; the benefits I get... none, really. I'll wait and see what happens, but I might buy another Nvidia card, or if the prices become low enough I might buy an ATI card (I doubt it, their drivers piss me off **hint hint Linux power user). Anyways, whatever happens is whatever happens and I doubt we can really change it by discussing it.

And how many people will buy *TWO* Fermi cards *AND* have an SLI capable motherboard to run 3 monitors with nFinity? Even fewer than Eyefinity, which also works with the entire 5xxx series and with a single card. You're also ignoring that the 5xxx series brought DX11 to the table, and brought it to an entire range of *affordable* cards. If you think the 5870 didn't bring anything to the table you've basically never read a single review of the card.
 

First with DX11, sure, but there aren't many games yet where the couple-month head start has benefited you (I say this as a 5870 owner myself). Eyefinity being "first"? Ever heard of the Matrox Parhelia or TripleHead2Go? This is nothing new whatsoever, it's just being marketed as such... and power use doesn't even factor into most people's decisions: ask what someone's video card draws in wattage and they'll likely stumble out "some?" even if they're an enthusiast-level user ;).

The 5870 really didn't bring much but extremely high performance and being first out of the gate with DX11 (what supports it, Dirt 2 and BattleForge today? not going to change much in two months' time anywho)... and, if Fermi's tessellation turns out to be as amazing as they claim, clearly not enough high performance where it matters ;).

Personally I want to see Fermi kick @$$ so that we have some overall price drops... I plan on switching from the 5870 to it unless it's a massive flop, myself, as I dislike the myriad driver glitches I encounter with the 5870 that I've never had on recent nVidia cards. Personal preference, sure... but that's really just an aside: I want more performance too :p.

As far as needing SLI for triple-head gaming, even on a 5870 or single Fermi (probably) you won't have good performance @ 7680x1600, certainly nothing I'd want to play with. So, most users doing that would want CrossFire or SLI anyway. For smaller monitor usage (5760x1080), it kind of begs the question still of whether the bezels, cost, and overall experience are worth the trade of gaining some pixels and an extremely wide FOV (which is the biggest selling point by far... but I wouldn't want to go down from 30" to 3x24" monitors, and going to 3x30" would be extremely costly).
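Just to put rough numbers on that, here's a trivial back-of-the-envelope sketch (plain arithmetic, no vendor data) of how much extra fill those triple-head resolutions actually demand compared to a single 30" panel:

```cpp
// Pixel-count comparison for the triple-head resolutions mentioned above,
// relative to a single 30" (2560x1600) panel. Pure arithmetic.
#include <cstdio>

int main() {
    const double one30      = 2560.0 * 1600;  //  ~4.1 MP, single 30" panel
    const double triple30   = 7680.0 * 1600;  // ~12.3 MP, three 2560x1600 panels
    const double triple1080 = 5760.0 * 1080;  //  ~6.2 MP, three 1080p panels

    std::printf("3x2560x1600 vs one 30\": %.1fx the pixels\n", triple30 / one30);   // 3.0x
    std::printf("3x1920x1080 vs one 30\": %.1fx the pixels\n", triple1080 / one30); // ~1.5x
    return 0;
}
```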

Sure, the thread title is hyperbolic, but the OP does raise some legitimate points despite that.
 

Matrox was very resolution limited. I can't remember the exact specs when it came out, but it only got updated to 1080p resolution like 2 years ago. So three screens: Matrox was first. Three screens from the video card with no 3rd-party hardware: AMD.
Eyefinity still blows it away at present.

DX11: gotta start somewhere. Software devs need the hardware first if they're gonna code for it. And AVP is coming out, along with STALKER: C.O.P. and other games, etc etc... beating a dead horse with the DX11 game list.
 
The problem with the Matrox solution was that you needed a $300 part as well as a video card that could push the pixels. The 5800 series was the first consumer-level video card to do it on its own.
 

Parhelia didn't require an extra part, and there's software called SoftTH that has done it for ages without the external "$300 part" you are talking about.
 

True true. I wouldn't really use Parhelia as a reference though, what a fiasco that was. SoftTH though has been a viable solution.
 
As much as I'm a hardcore gamer, PC gaming is a niche. It just is. Advances in graphics brought on by competition between ATI and NV can only serve to provide a more compelling argument for playing games on PCs.

I'm all for the competition, and I can't wait to see what NV will bring to the table. Fermi might not change gaming forever, but what it will do is provide somewhat of a rebirth to PC gaming. Will it be the be all end all? Probably not, but we can use all the advantages we can get.

I do agree that Nvidia bringing a more powerful card to the table creates competition and is a GOOD thing, because it pushes technology to advance. But we've all seen this before; it is cyclical. Each company will continue to leapfrog the other. The hardware will get faster and faster.

But the concerns about console influence on PC gaming are valid as well. When you're playing Modern Warfare 2 at 200+ FPS on a 5850 at 1920x1200 you might ask yourself why you'd need more horsepower. Hardcore tech fans will still buy the most powerful cards, but will the average consumer do the same thing?

I believe that it is the midrange and low-end cards that end up selling the most and thus are most profitable for video card companies. Thus the 5670 is looked forward to even more than the 5870 or even the 5770 by a lot of people. Fermi might win the performance crown for Nvidia, but they likely won't get into the black until they can release the midrange and lower-range versions of Fermi.
 
The PS3 uses a 7800 GTX (basically) for its GPU - a DX9 part. And yet, it doesn't have DirectX at all. Sony's next console also won't have DirectX at all. DirectX is proprietary to Microsoft. It doesn't matter if the card supports it, the software doesn't. Sony's next console will continue to be OpenGL (and for what it's worth, all of the DX11 features are available in OpenGL as well).
The GPU in the PS3 is a DX9 part. If you stuck it in a PC it would run DX9 just fine. That is my point: the hardware is based on DX9. Btw, DX11 features aren't all available in the newest OpenGL spec. It will most likely be the next one that adds them all.

Sony's next console will use an AMD/Nvidia/PowerVR part, which are all DX-class parts. Yes, because Sony won't want to pay MS they will expose these features through OpenGL, but the part is still capable of doing all the same things, just through a different API. That is the point.

The next PS3 will have tessellation, compute shaders and other things. All part of the DX11 spec, but they will run through OpenGL.

Thus when a dev programs for a feature on the Radeon 5870 in DX11, he will know that the GPU in the next Xbox and PS3 will be capable of running it.

Do you get the point?
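To make the API point concrete, here's a rough sketch (purely illustrative, not anyone's actual engine code) of how the same silicon feature gets probed through OpenGL instead of Direct3D. The extension name is the ARB tessellation extension; whether a given driver reports it yet is exactly the "newest spec doesn't have it all" caveat above, and the GLUT window exists only to obtain a context.

```cpp
// Sketch: probing a "DX11-class" feature through OpenGL instead of Direct3D.
// GLUT is used only to create a GL context; older drivers/specs may not
// report the tessellation extension at all.
#include <GL/glut.h>
#include <cstring>
#include <cstdio>

static bool hasExtension(const char* name) {
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all != nullptr && std::strstr(all, name) != nullptr;  // crude substring match
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutCreateWindow("feature probe");  // dummy window, just to get a context

    std::printf("Renderer: %s\n",
                reinterpret_cast<const char*>(glGetString(GL_RENDERER)));
    std::printf("Tessellation via GL: %s\n",
                hasExtension("GL_ARB_tessellation_shader") ? "exposed" : "not in this driver");
    return 0;
}
```

Same GPU either way; the only question is which API hands the feature to you.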

And how many people will buy *TWO* Fermi cards *AND* have an SLI capable motherboard to run 3 monitors with nFinity? Even fewer than Eyefinity, which also works with the entire 5xxx series and with a single card. You're also ignoring that the 5xxx series brought DX11 to the table, and brought it to an entire range of *affordable* cards. If you think the 5870 didn't bring anything to the table you've basically never read a single review of the card.

Yup, that's the point. You can get nice 23-inch Dells with DisplayPort for $200ish each. So that's $600 for 3 monitors. A 5870 will cost you $400. So that's $1000 for a new experience.

With Nvidia, let's be fair and say the Fermi will be $400. So that's $800 for two GPUs, plus $600 for the monitors. That's $1400. But then you're going to need an SLI motherboard and perhaps a better power supply to run it all, so the costs can be much greater than $1400.
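Spelling that math out (every figure below is just the ballpark number quoted in this thread, and the SLI board/PSU delta is a guess):

```cpp
// Rough cost comparison for triple-monitor gaming, using the ballpark street
// prices quoted in this thread. All figures are assumptions, not quotes.
#include <cstdio>

int main() {
    const int monitor        = 200;  // ~23" DisplayPort panel
    const int radeon5870     = 400;  // a single card drives all three heads
    const int fermiGuess     = 400;  // assumed launch price; the card isn't out yet
    const int sliPlatformAdd = 150;  // guessed delta for an SLI board + bigger PSU

    const int eyefinityBuild = 3 * monitor + radeon5870;                      // ~$1000
    const int dualCardBuild  = 3 * monitor + 2 * fermiGuess + sliPlatformAdd; // ~$1550+

    std::printf("Single-card Eyefinity build: $%d\n", eyefinityBuild);
    std::printf("Dual-card Nvidia build:      $%d or more\n", dualCardBuild);
    return 0;
}
```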
 

Good point on the costs.

Either way you don't need a 5870 to use Eyefinity, unless maybe you want all three running at 1920x1200.

But if you get, say, 3 1440x900 monitors with one 5770... not bad, a LOT cheaper than going with 2 GPUs plus an SLI mobo and PSU just to run 3 monitors.

Cost-wise... AMD is in the proper position when it comes to triple-monitor gaming.
 
First with DX11, sure, but there aren't many games yet where the couple-month head start has benefited you (I say this as a 5870 owner myself). Eyefinity being "first"? Ever heard of the Matrox Parhelia or TripleHead2Go? This is nothing new whatsoever, it's just being marketed as such... and power use doesn't even factor into most people's decisions: ask what someone's video card draws in wattage and they'll likely stumble out "some?" even if they're an enthusiast-level user ;).
What, you mean Matrox's last card, which wasn't competitive with anything else on the market?

There aren't many games yet, but consider that AMD has been selling DX11 parts and was at 2M shipped by the end of 2009, and here we are with still no DX11 Nvidia part, with only some of their high end coming out in March. That will ensure that games are based on ATI's parts and not Nvidia's, especially the ones coming out in the first half of this year.

The 5870 really didn't bring much but extremely high performance and being first out of the gate with DX11 (what supports it, Dirt 2 and BattleForge today? not going to change much in two months' time anywho)... and, if Fermi's tessellation turns out to be as amazing as they claim, clearly not enough high performance where it matters ;).
AVP and LoTRO. Also, as we've said, ATI has sold enough parts already, before Nvidia shipped one, to ensure that games are designed around those parts.

Nvidia may or may not have better tessellation performance, and it may not have it across the board. Why would a dev target what is currently one video card coming out in late March when there is already a top-to-bottom video card lineup that has shipped over 2M and will ship even more before that late launch date? Devs have had ATI cards since late last spring.

Personally I want to see Fermi kick @$$ so that we have some overall price drops... I plan on switching from the 5870 to it unless it's a massive flop, myself, as I dislike the myriad driver glitches I encounter with the 5870 that I've never had on recent nVidia cards. Personal preference, sure... but that's really just an aside: I want more performance too :p.
Personally I hope that Fermi fails so that Nvidia starts designing smarter instead of wasting transistors and turning up the power usage and heat output.

As far as needing SLI for triple-head gaming, even on a 5870 or single Fermi (probably) you won't have good performance @ 7680x1600, certainly nothing I'd want to play with. So, most users doing that would want CrossFire or SLI anyway. For smaller monitor usage (5760x1080), it kind of begs the question still of whether the bezels, cost, and overall experience are worth the trade of gaining some pixels and an extremely wide FOV (which is the biggest selling point by far... but I wouldn't want to go down from 30" to 3x24" monitors, and going to 3x30" would be extremely costly).
Like with everything else, performance is never the best when something first comes out. The point is that for many, you can buy your 5870 and your monitors and enjoy gaming, and then later buy another 5870 when it drops in price and enjoy better performance.

But if you think that even two Fermis will let you play DX11 games with all settings maxed then you're kidding yourself.

I'm going to go with 3x24" monitors. It fits my desk better and I don't have any need for a 30-inch monitor, nor 3 of them. Maybe in the future when I get a bigger place instead of my current condo.

Sure, the thread title is hyperbolic, but the OP does raise some legitimate points despite that.

What points? Everything Fermi can do, ATI did 6 months earlier. Even if you want to argue better performance, it doesn't mean much, as 6 months actually falls into ATI's refresh period, and in the first half of the year ATI will have at least the high-end Hecatoncheires parts come out, which depending on your source is either Cypress with more shaders/TMUs and a larger tessellation engine or a brand new design. Either way it will be faster than the initial Fermi designs.

So what does Nvidia bring to the table, and if it's just better performance, how will that be different when 6 months later ATI brings out faster parts? Will those change gaming forever? Did the 5870 change gaming forever because it was faster than any card before it? I guess it's true: every time a new, faster part comes out it changes gaming forever.

But here is the thing. Thanks to ATI's lineup of DX11 parts over the past 4-5 months, we have seen a DX11 lineup that has filled in from top to bottom, offering a full range of DX11 products for people in any price range to buy. Fermi and Nvidia have not made any of that possible.

What we do know is that Fermi is going to be extremely hot, and that's going to limit who can run these cards and how many of them people can run. I know that you don't like me quoting Charlie, so here is another source:

http://www.fudzilla.com/content/view/17356/65/

So ATI has its top-to-bottom lineup already out (well, in February the 55x0 series launches).
Nvidia's is at best late June http://www.fudzilla.com/content/view/17290/65/ and even he isn't sure, because he said they might be even later.

All this is going to do is let AMD sell ever more cards and let developers target those cards even more.
 

DO NOT forget BF:BC2, whose beta opens up next weekend. It is DX11 and you can now pre-order it on Steam.

CANNOT wait to get two 5870s in xfire for this baby :)
 
Good point on the costs.

Either way you don't need a 5870 to use Eyefinity, unless maybe you want all three running at 1920x1200.

But if you get, say, 3 1440x900 monitors with one 5770... not bad, a LOT cheaper than going with 2 GPUs plus an SLI mobo and PSU just to run 3 monitors.

Cost-wise... AMD is in the proper position when it comes to triple-monitor gaming.

Yup, and it's silly, because a GeForce 8800 couldn't run all games at 2560x1600 either, but I'm sure people still bought it and cranked down settings as needed.
 
Holy....fucking....shit. People. Calm down! Can anyone learn to read? Can anyone have an opinion in this day and age? What does a PS3 or GTX 7800 have to do with Fermi? Christ.


1. Fermi is not out yet. This thread was not posted PRAISING an unreleased card's performance; I posted this praising the purported FEATURE SET and CAPABILITY. Again, I posted this thread because I merely felt that it's going to change gaming in some way, good or bad, after reading up extensively on it. Again, I am not an Ati or Nvidia fanboy, period. But imho, it's going to be in a GOOD way.

2. Nvidia could fail BIG TIME with this card. It is a gamble. I have been very disappointed in Nvidia's shady way of going about information on this card. The shady way it was presented, the conflicting info lately, and the lack of true benchmarks or a release date. It's gotten very old. They need to get this out the door. Does that mean Nvidia will fail and go away as a company? Of course not. They had an utterly miserable epic fail with the dustbuster 5800 cards. Ati mopped the floor with them that quarter.

Did Nvidia die? No. They came back with a vengeance and mopped the floor with future GPUs like the 8800 GTX and GTX 280. Nvidia will not die if Fermi is a disappointment. The loyalists and uninformed will buy it based on brand recognition alone. And you cannot deny Nvidia's brand recognition. There are still plenty of people that buy a GeForce 8800 GTX and think it's hot shit because it's a "Geforce". The low-end GPU market alone can keep Nvidia afloat.

3. Those denying the capability of Fermi are misinformed or Ati loyalists - here is a card that, according to the article on our very own HardOCP, is a geometry powerhouse. That's gotta be good for something. Ati may have been first to bat with DX11, Eyefinity, and tessellation, but Nvidia wants to do all that, and do it better. Well, maybe not Eyefinity, but that's a niche market and anyone saying it isn't is in denial. Triple-monitor gaming is definitely more viable than it was before. Eyefinity is awesome technology, but the number of people utilizing it, wanting to utilize it, or having the capability to do so is very small at the moment.

The truth is, we as gamers want speed. Speed, image quality, and new features. Whatever combines these usually gets the dollars. Ati's 5870 is an amazing card. I should know... I'm running one in my room right now at 1015/1290 and it's incredible. This was and IS the card to get at the moment, period. Now say Nvidia does release Fermi very soon and it does live up to the hype. It arrives faster overall, faster under heavy geometry, and doesn't disappoint?

Will I pass it up because I own this 5870? Absolutely not. Fermi will be DX11, it has tons of geometry-specialized functions, and it is already claimed to be faster (even based on observation from HardOCP). Of course, until we get cold, hard numbers and facts, this is all speculation. I personally think Nvidia would be stupid to lie and inflate facts this round. They know Ati means business. They know they won't get away with lies and deceit this time like they did with the dustbuster. I think Nvidia will live up to the bold claims, however small its actual lead over the 58XX/59XX may be on speed or features.

4. We as consumers and gamers want this card to succeed. Anyone opposed to this card succeeding will suffer. We need this card to be successful to keep the war between Ati and Nvidia going. This war brings lower costs, better performance, and choice to us consumers. Fermi is good for us all in this respect. Fermi kicks ass, Ati has to compete better. Fermi kicks ass, Ati lowers prices. That means a cheap 5870 for you. If Fermi sucks, Ati will lower prices to make Nvidia look stupid over a lackluster price/performance ratio, so there's that cheap 5870 coming again. It's all good.

5. I believe the future this generation / decade will be about advancing geometry. Yes, Ati has had it a while. Yes, they can do it now. But Nvidia wants to do it faster, and do that faster NOW. This is an important technical leap. We need power to back up geometry processing. Fermi is an important technological leap in my opinion because it is trying to do this now. This will change the way games look, and in a shorter timeframe. We will need all the geometry horsepower we can get for games like id Software's Rage. Remember Doom 3? I seem to recall id Software being pissed over the Ati cards and writing a GL path mainly for Nvidia cards. This could and will be helpful for Nvidia. You cannot deny the power of this technological empire and the way they can get devs to do what they want. Money talks. Brand recognition talks more.



So Ati loyalists, what do you have to fear Fermi for? It's either going to come and beat the 5870/5970, tie it unremarkably, or just downright suck and arrive with too-high power requirements and a nice feature set but unremarkable performance (think GeForce FX 5XXX series). Either way, a true gamer, a true enthusiast, and a true multi-company supporter will greet Fermi with open arms. All others are loyalists and will reap what they sow.

I have seen and used crappy cards from both companies. In my opinion, Nvidia really fucked up with the FX dustbuster series (5XXX). Shitty, shitty driver deception, shitty pixel shader performance, a dustbuster lawnmower cooler, you name it. It sucked ASS. Likewise, Ati's Radeon 8500 was underwhelming to me. It used too much power, had issues with certain power supplies iirc, had image quality issues that needed to be ironed out, Truform was worthless, and it wasn't impressively faster than the GeForce 3 Ti 500 either. So Ati fucked that card up in my opinion. Both came and did their thing. Both companies are still here today. Proof that a single card will not kill a company if it doesn't live up to the hype.
 

I'll buy a Fermi you convinced me.
 
Nvidia branding!

Nvidia makes everything better.

Eyefinity doesn't count < Nfinity

DX11 Tessellation < Nvidia does it better

3D Surround, brilliant and revolutionary. 60Hz vs 120Hz, we will have less flickering, weeeee!

Performance < Nvidia, everyone knows; according to what they said at CES even a midrange GT360 will compete with the 5970. Add to that the leaked bench showing 30% better performance in FC2, which proves it, so shut up haters and AMD loyalists, eat it.

Those benches must be from the midrange card, because why would Nvidia showcase their flagship card... durrr?
It doesn't matter that it's 6 months late, it will destroy AMD's refresh.

Lastly: GPGPU compute!! Medical imaging and double-precision math, eat it AMD fans, muahaha


Run and shake in your boots you know you're all scared.
 
Though I hope nVidia at least makes a better-looking stock cooler than the 5xxx. Shouldn't be too hard :) I'd be buying a 5870 right now if I had at least a semi-solid idea of its noise level. That's the only thing that's keeping me from going ATI again. I had 3 of their 4870s, and all their included coolers were atrociously loud :(
 

 
Someone remind me again what makes Fermi better than the current 5xxx series???
 
You hate AMD because you're a fanboy, nothing more, nothing less. They've come out with powerful cards with new technology that use next to no power in 2D mode... And you're complaining?

I have owned the 9700 Pro and 1900XT; fanboyism has nothing to do with it. Their new direction in GPUs almost dictates that no progress is made in the GPU arena. It is the same strategy they employed when they made their CPUs before the Athlons: it was about being cheaper and doing 80% of what Intel could do, almost as fast.

The GPU world is not so far advanced that just being faster is good enough, so while I do consider the 57xx nice products, they don't do anything special; nothing about them changes anything, they are just nice products.

Fermi is way more impressive: it pushes the boundaries, and ultimately Nvidia will be able to make it cheaper and better, much like Intel was able to do with the Core 2 Duo design.
 
Wow people forget the FX5800 so fast. I like Nvidia as much as the next guy but they have put out some dogs before, so you just never know. The lack of any REAL benchies makes me still have my doubts.

Like most people, if they have a superior product at a competitive price, I'll jump ship from ATI and buy it, BUT I won't buy it just because it says Nvidia on it (and I've run both, so not a fanboi either way).
 
I have owned the 9700 Pro and 1900XT; fanboyism has nothing to do with it. Their new direction in GPUs almost dictates that no progress is made in the GPU arena. It is the same strategy they employed when they made their CPUs before the Athlons: it was about being cheaper and doing 80% of what Intel could do, almost as fast.

The GPU world is not so far advanced that just being faster is good enough, so while I do consider the 57xx nice products, they don't do anything special; nothing about them changes anything, they are just nice products.

Fermi is way more impressive: it pushes the boundaries, and ultimately Nvidia will be able to make it cheaper and better, much like Intel was able to do with the Core 2 Duo design.

You could have just said "Fermi will own cuz I said so". Would've made it easier on yourself instead of typing a whole paragraph.

Let's all forget about yield per wafer, the 550mm^2 die, AMD's more mature fab process, the more scale-down-friendly chip... none of that matters, guys. And no, this isn't about fanboyism; I bleed green and will never buy AMD. :p
btw Fermi will push boundaries cuz their marketing said so.
 
There's so much nonsense in here it made a small disturbance in the force, and now I have to look at it.

There is no way this card is going to "change gaming forever." It doesn't bring anything new to the table for gaming that ATI doesn't also have in their HD5000 series. They both do OpenCL+DirectCompute, they both do DX11, they both support multiple monitors, they both do tessellation (which ATI cards have technically had the ability to do since forever it seems)... If it performs better then OMGWOWZA!!?! The featureset between the two is almost exactly the same, with very very few differences, all but two of which don't really affect gaming. Physx is such a little ripple considering OpenCL and DirectCompute will probably be more widely adopted, and 3D is ridiculously expensive. Why spend $700 on a single monitor and some 3D glasses when you can spend $600 on 3 monitors?
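For what it's worth, here's a minimal sketch of the "they both do OpenCL" point: the same vendor-neutral enumeration code runs unchanged on either vendor's DX11 hardware (error handling omitted for brevity):

```cpp
// Sketch: enumerating GPU compute devices through OpenCL. The identical code
// works against AMD's and NVIDIA's platforms, which is why compute support by
// itself isn't a differentiator. Error checking omitted.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[4];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(4, platforms, &numPlatforms);  // AMD and NVIDIA each install one

    for (cl_uint p = 0; p < numPlatforms; ++p) {
        cl_device_id devices[8];
        cl_uint numDevices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &numDevices);

        for (cl_uint d = 0; d < numDevices; ++d) {
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
            std::printf("GPU compute device: %s\n", name);  // e.g. Cypress or GF100
        }
    }
    return 0;
}
```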

Performance alone doesn't change anything; we sorta already expected that it would have to be some sort of monster for Nvidia to be 6 months late and go through this much bad mojo. Being 6 months late means ATI has already had ample time to be pushing the design for their next releases, so with yields improving, Nvidia needs to hope it can fill in the mid-range and low-range markets, otherwise ATI is going to continue their streak of "we're better than you in every market except the fanboy and rich people markets." Price drop when Fermi goes retail? Somewhat likely. Prolly not a big one, but even if the price only goes down $10-20 across the current ATI DX11 product range, that's still a win for buyers.

This thread is idiotic, especially since nobody knows how much this thing costs, how much energy it uses, how much heat it creates, and what it's actually capable of in non-biased testing.
 
I have owned the 9700 Pro and 1900XT; fanboyism has nothing to do with it. Their new direction in GPUs almost dictates that no progress is made in the GPU arena. It is the same strategy they employed when they made their CPUs before the Athlons: it was about being cheaper and doing 80% of what Intel could do, almost as fast.

The GPU world is not so far advanced that just being faster is good enough, so while I do consider the 57xx nice products, they don't do anything special; nothing about them changes anything, they are just nice products.

Fermi is way more impressive: it pushes the boundaries, and ultimately Nvidia will be able to make it cheaper and better, much like Intel was able to do with the Core 2 Duo design.

Except that you don't know enough about Fermi to make some of these claims. As it stands right now, AMD is doing MORE than nVidia, doing it FASTER than nVidia, and doing it at a lower cost as well (5850 vs. GTX 285).

You're ignoring all the new features and benefits that you KNOW the 58xx has and touting things that are still speculative about Fermi, while at the same time ignoring any possible drawbacks. That makes you a fanboy, regardless of what cards you owned 5 years ago.
 

What benefits? Eyefinity, don't care, same with Nvidia's version. The 58xx does not let me do anything with my 24-inch monitor that I couldn't do before.

It's nice, it's fast, but it's also lazy; it pushes zero boundaries. Sorry, but using 3 monitors is lame, same as it is for Nvidia. Give me new features, things I can't do right now. Fermi, regardless of how fast it is, is a good step in the right direction.

You and I clearly have different expectations for our GPU hardware. You want fast and cheap; that works for you, it doesn't work for me. I want GPU hardware to continually push the boundaries of what is possible.
 

ATI 8500: First with tessellation
ATI 3xxx series: First with DX 10.1
ATI 5xxx series: First with DX 11, bonus Eyefinity if you want it
Nvidia 8xxx series: First with DX 10, bonus CUDA (later PhysX)
Nvidia 2xx series: First with absolutely nothing
Nvidia Fermi: First with absolutely nothing - finally has tessellation! Finally has DX 10.1! Finally has DX11!

Yet Nvidia is the one pushing boundaries? Guffaws all around! :rolleyes:
 
You can continue to say the same thing in three different ways (four if you decide to reply again). The only constant is that you're still ignoring new and proven features AMD has and touting unproven features of Fermi. Clearly, if you ignore new features, what remains is going to seem like the same old stuff. How convenient it must be to ignore things that make your position flat out wrong.
 

You forgot ATI also pushed GDDR4 and 5.
 
kllrnohj, your GPU history is sorely lacking. How about first with 32-bit, first with hardware T&L, first with a cross-bar memory bus, first with programmable shaders, first with SM3.0, first with unified shaders (on the pc), first with a proper general compute implementation...etc, etc.

Fermi brings nothing new from a graphics standpoint because features are defined by Microsoft - duh. The only thing the IHVs can differentiate on is IQ and performance. Nvidia is tackling the first one with coverage sampling support for alpha textures and faster sparse sampling for soft shadow mapping. Performance is yet to be seen.

Architecturally however, Fermi brings a lot of firsts to GPUs. Just having a full coherent read/write memory hierarchy is a big deal for GPUs. There's also the big overhaul in the geometry pipeline. Some people don't really care about that stuff but for those that do, there are a lot of big firsts in Fermi that shed some light on the future of GPU architectures as they move closer to the flexibility and programmability of CPUs.
 
Being the nVidia advocate that you are, I'm sure you already know that this architecture difference is merely a side effect of nVidia's attempt to bring GPGPU to the masses. You have no idea how that will translate to gaming; it may even hinder performance if all these "features" are not being used and are just taking up space on the die that could have been used for things that will actually be used in games. That's my whole point: you're touting unproven features as a sure thing, which they hardly are. It would be GREAT if we knew that all these "enhancements" would actually be put to good use, but we don't.

Different doesn't always mean better. With the development time it takes to come out with a good game, it can be quite a while before we see these "new features" actually put to use in anything other than nVidia tech demos.
 