BSN nVidia GF100 Fermi silicon cost analysis, more positive than before

I'm sure Charlie D will find a way to put a negative spin on it.

Seems like Fermi is not "unmanufacturable" after all.

yeah, but if he didn't, he wouldn't get the page hits he does
when will people stop feeding the troll and let the site die? :eek:

Your prices are wrong because I say they are. My tabloid source that I'm not going to bother linking is better than your tabloid source! Seriously? Give us a break.

nah dude, because he insisted on it THREE times, it might be the bona fide truth (also notice how it's roughly 1:1 with what Fud has been saying, so it obviously must be true) :rolleyes:

Last I checked, both cards run DX11. The PS3 and Xbox comparison is not relevant in the least.

but the PS3 is a supercomputer in your hand, it has a lot of cores and cores, and because it's HD it has more cores, and Sony told me it's a supercomputer, there's no way three-to-four-year-old tech can be outdated, they said it's the best ever :p




Read the reviews: the G92, despite being touted as "old tech", still keeps up with the 5770 just fine. In fact, in several cases it was faster.

but it's old and therefore the devil ;)

Yep, Nvidia is going out of business tomorrow. :rolleyes:

of course, haven't you heard: according to un-named insider sources they're trying to get a bailout package to stay in business, but it's looking unlikely and they're going out of business entirely in spite of superior market share; their bullet-proof competition didn't split apart for financial reasons either, that was just because they knew they could "fight" nVidia with one hand and they wanted to show everyone how tough they are


/SARCASM (sorry for anyone who finds this via Google or whatever and wastes an hour trying to verify any/all of the above)
 
It is a bloodletting. They are totally off on ATI's prices.

The breakdown by others in the know puts a complete Hemlock board at roughly the same price as the GTS 360 part that will launch in late March, which is around $280 for a complete board (RAM, cooling, PCB and whatnot).

The 5870 is less than $150 according to the same sources.

That means, price-wise, AMD/ATI can send Hemlock out to fight against the GTS 360.

But they won't, because they will simply put a refresh against the GTS 360 if it proves much faster than the 5870. Most likely the refresh will have 2 gigs of RAM as well.

Let me rephrase, then: it's not the bloodletting that we were thinking it was going to be. This is going to be, without question, a rough cycle for them, but if the article holds up at all (as well as some others), then there is some hope that they are not going to simply fall on their faces here.
 
Let me rephrase, then: it's not the bloodletting that we were thinking it was going to be. This is going to be, without question, a rough cycle for them, but if the article holds up at all (as well as some others), then there is some hope that they are not going to simply fall on their faces here.

I really have an issue with the notion that nVidia has somehow "collapsed" or will utterly "fail"

this is getting pretty repetitive

we heard this about nVidia when FX 5800 was late
we heard this about ATi when HD 2900 was late
we heard this about AMD when Phenom was late
we heard this about Intel when Pentium D fizzled

All of those products were basically "failures". Are any of those companies out of business? Did any of them lose much business as a result? How many people actually bought GeForce FX boards? etc.

it's three months, people
not a decade
 
Last I checked, both cards run DX11. The PS3 and Xbox comparison is not relevant in the least.
Was the comparison too complicated for you? Or are you being anal and obtuse for the sake of winning an argument to make you feel better? Here, I'll spell it out for you :)

When you have the only DirectX 11 development platform for nearly a year and your installed base is over 2 million, your users will benefit. Let me know if there are any big words you want me to look up for you :)

Nvidia does not require a 120Hz monitor or 3D glasses to run its Eyefinity equivalent. It requires 120Hz monitors and 3D glasses to run 3D Eyefinity. Furthermore, telling someone who just bought 3x 24" monitors and plans to game at 3600x1920 or 5760x1200 or higher that SLI is a "barrier" is a bit... dumb.
Another anal and obtuse argument, this time with a veiled insult. That's an improvement at least. Let me spell this out for you again :)

You are describing a user who already burned the cash for triple display gaming. They are not the norm. Users upgrading their current triple display gaming setups are at best a niche within a niche. Thanks for proving my point.

If any user is going to buy INTO the whole triple-display and 3D gaming package, they will have 3 choices.
1. Eyefinity: works out of the box with a single card, no special monitors required, no special proprietary 3D glasses, not limited to a particular monitor size.
2. Current Nvidia owners: buy another card for SLI + expensive high-refresh monitors + expensive proprietary 3D glasses. Caveats are sticking with old technology, missing out on the maximum eye candy in the fresh batch of DirectX 11 games coming this year, and being stuck with 24" monitors.
3. Customers waiting for Fermi: buy 2x Fermi, buy a power supply rated to handle them, buy 3x 120Hz monitors limited to 24", and buy expensive proprietary 3D glasses.

For a user who does not care about 3D gaming and is only looking to experience triple-display gaming, Eyefinity is an even better choice.
1. Eyefinity: works out of the box and not limited to 24" displays. Works with a single card. Can upgrade performance down the road by crossfiring a second card, but then comes the implied cost of possibly needing a new PSU to handle them.
2. Current Nvidia user: buy a second card for SLI and possibly a new power supply to handle them. They'll still have to live with the same caveats of old tech, not being able to experience DirectX 11, etc. It is not known if their version of Eyefinity is limited to a certain display size, so I'll assume it isn't. Current SLI owners are in the best position in this regard; all they lose is DirectX 11, but they would still have to buy 2 more monitors. That may be the only saving grace for Nvidia's Eyefinity, since owners of AMD's last-generation tech are forced to upgrade to the new generation to experience it.
3. Users waiting for Fermi: buy 2x Fermi, the power supply required to run them, and 2 more monitors. Again, it is not known if their version of Eyefinity is limited to a certain display size.

In all scenarios AMD's solution is more accessible due to lower price, fewer restrictions and more flexibility. The money one saves by not buying a second card for SLI can be used instead to buy bigger displays. Also, the users opting for option 3 are much smaller in number, since the cost of that setup is far greater than any of the other choices.

If AMD adds 3D support, they will still need 120Hz monitors.
Proof? All the info that's known is the upcoming driver addition of support for 3D glasses that work on standard-refresh monitors.

I fully expect ATI to drop prices at the Nvidia launch date.
This will be even better for consumers if they do, and it will increase AMD's price/performance advantage; they have no pressure to do it, though.

Last time I checked, Nvidia had lots of these. For example, here is EVGA's Classified 285.
Funny you didn't include a link to a place where one can buy that example :)
For less than the $420 that card sells for at Newegg, a user can buy a triple-display-out-of-the-box, DirectX 11 capable HD5870. If it is such a great alternative, go ahead and buy one to prove me wrong ;)

Read the reviews: the G92, despite being touted as "old tech", still keeps up with the 5770 just fine. In fact, in several cases it was faster.
Tsk, tsk, tsk. That would be nice and all if you ignore the fact that Nvidia released a fresh batch of DirectX 10.1 40nm chips to compete in the low end. These chips are simply outperformed and outclassed (price-, feature- and power-consumption-wise). Even last year's AMD offerings are better in some cases.

Yep, Nvidia is going out of business tomorrow. :rolleyes:
Take a deep breath and relax. Can you point to where I said or implied such a thing?
 
When you have the only DirectX 11 development platform for nearly a year and your installed base is over 2 million, your users will benefit. Let me know if there are any big words you want me to look up for you :)
When you have 2 million DX11 units and ~70M consoles sold (not counting the 60+ million Wiis), plus the whole installed base of DX9 and DX10 cards, 2M cards looks pretty damn small. Furthermore, perhaps this is hard for you to understand, but programmers program to DX11, not for Nvidia or ATI hardware specifically. They may "tweak" for one or the other, but it is NOTHING like writing code for a Cell versus the more conventional processor found in the Xbox.



You are describing a user who already burned the cash for triple display gaming. They are not the norm. Users upgrading their current triple display gaming setups are at best a niche within a niche. Thanks for proving my point.

If any user is going to buy INTO the whole triple-display and 3D gaming package, they will have 3 choices.
1. Eyefinity: works out of the box with a single card, no special monitors required, no special proprietary 3D glasses, not limited to a particular monitor size.
2. Current Nvidia owners: buy another card for SLI + expensive high-refresh monitors + expensive proprietary 3D glasses. Caveats are sticking with old technology, missing out on the maximum eye candy in the fresh batch of DirectX 11 games coming this year, and being stuck with 24" monitors.
3. Customers waiting for Fermi: buy 2x Fermi, buy a power supply rated to handle them, buy 3x 120Hz monitors limited to 24", and buy expensive proprietary 3D glasses.

For a user who does not care about 3D gaming and is only looking to experience triple-display gaming, Eyefinity is an even better choice.
1. Eyefinity: works out of the box and not limited to 24" displays. Works with a single card. Can upgrade performance down the road by crossfiring a second card, but then comes the implied cost of possibly needing a new PSU to handle them.
2. Current Nvidia user: buy a second card for SLI and possibly a new power supply to handle them. They'll still have to live with the same caveats of old tech, not being able to experience DirectX 11, etc. It is not known if their version of Eyefinity is limited to a certain display size, so I'll assume it isn't. Current SLI owners are in the best position in this regard; all they lose is DirectX 11, but they would still have to buy 2 more monitors. That may be the only saving grace for Nvidia's Eyefinity, since owners of AMD's last-generation tech are forced to upgrade to the new generation to experience it.
3. Users waiting for Fermi: buy 2x Fermi, the power supply required to run them, and 2 more monitors. Again, it is not known if their version of Eyefinity is limited to a certain display size.

In all scenarios AMD's solution is more accessible due to lower price, fewer restrictions and more flexibility. The money one saves by not buying a second card for SLI can be used instead to buy bigger displays. Also, the users opting for option 3 are much smaller in number, since the cost of that setup is far greater than any of the other choices.

Proof? All the info that's known is the upcoming driver addition of support for 3D glasses that work on standard-refresh monitors.
A 5870 does not have the power to run max settings at a 3x24" resolution. Furthermore, if you try to run stereoscopic 3D on a 60Hz monitor you will see 30 effective fps, as alternating frames are sent to the right-eye and left-eye "channels". Any ATI user who wants more than 30 effective fps will need 120Hz monitors. This is not an Nvidia limitation, it is a physical one. The ONLY thing Nvidia's solution requires that ATI's doesn't is SLI, which is something you will want anyway if you are running 3x24" screens, i.e. roughly 70% more pixels than 2560x1600.
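
Two bits of arithmetic in that post are worth sanity-checking. Here's a minimal Python sketch of the per-eye refresh math and the pixel-count comparison (assuming 1920x1200 panels for the 3x24" case; the figures are the thread's, not official specs):

Code:
# Active-shutter 3D alternates frames between eyes, so each eye
# sees half the panel's refresh rate.
for panel_hz in (60, 120):
    print(panel_hz, "Hz panel ->", panel_hz // 2, "Hz per eye")

# Pixel count of three 1920x1200 panels vs. a single 2560x1600 panel.
surround = 5760 * 1200                        # 6,912,000 pixels
single_30in = 2560 * 1600                     # 4,096,000 pixels
print(round(surround / single_30in - 1, 2))   # 0.69 -> ~70% more pixels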


Funny you didn't include a link to a place where one can buy that example :)
For less than the $420 that card sells for at Newegg, a user can buy a triple-display-out-of-the-box, DirectX 11 capable HD5870. If it is such a great alternative, go ahead and buy one to prove me wrong ;)
Uhhh, maybe in that link where it says "BUY NOW"? What does that have to do with whether non-reference boards are available from Nvidia or not?

Have a nice day in your world of ATI fandom and rage though.
 
Your prices are wrong because I say they are. My tabloid source that I'm not going to bother linking is better than your tabloid source! Seriously? Give us a break.

It's more than just one tabloid source. You should go to a forum with more professionals on it. You could learn a few things.

Notice that the source you're all kneeling down in front of doesn't state how many of Nvidia's yielded dies are the cut-down GTS 360 parts and how many are the GTX 380 parts.
 
I really have an issue with the notion that nVidia has somehow "collapsed" or will utterly "fail"

this is getting pretty repetitive

we heard this about nVidia when FX 5800 was late
we heard this about ATi when HD 2900 was late
we heard this about AMD when Phenom was late
we heard this about Intel when Pentium D fizzled

All of those products were basically "failures". Are any of those companies out of business? Did any of them lose much business as a result? How many people actually bought GeForce FX boards? etc.

it's three months, people
not a decade

But we didn't hear it, or even think it, when the 3dfx Voodoo5 was late and slow, and they had a Voodoo5 6000 that was big, fast, expensive, and used enough power to require an external power brick :)

Anyone who says Nvidia is going out of business is a fool, but this may well make them a little wiser in the future.

What I am really interested in hearing about is the supposedly "failed technology" that Nvidia had that forced them to pull Fermi off the shelf and make it their new tech.
 
But we didn't hear it, or even think it, when the 3dfx Voodoo5 was late and slow, and they had a Voodoo5 6000 that was big, fast, expensive, and used enough power to require an external power brick :)

true.
lol.
 
But we didn't hear it, or even think it, when the 3dfx Voodoo5 was late and slow

Yes, we did hear it back then... :eek:. However, most didn't know the speculation existed, since most didn't use forums like these back then, but we all thought prior to launch that the V5 5500 wouldn't hold 3dfx over.
 
When you have 2 million DX11 units and ~70M consoles sold (not counting the 60+ million Wiis), plus the whole installed base of DX9 and DX10 cards, 2M cards looks pretty damn small. Furthermore, perhaps this is hard for you to understand, but programmers program to DX11, not for Nvidia or ATI hardware specifically. They may "tweak" for one or the other, but it is NOTHING like writing code for a Cell versus the more conventional processor found in the Xbox.

When you look at the total DX11 market, 2M chips looks like this:


http://forum.beyond3d.com/attachment.php?attachmentid=396&stc=1&d=1262518153


Then you might explain how ATI's domination of DX11 won't continue for at least another 2 months, until Nvidia gets a single card on the market (considering the 2M number was from before Christmas).

Also, explain how Nvidia will start gaining back the lion's share of the market with just two high-end GPUs while AMD is pushing ever lower prices (55x0s under $50 will be here in February) before Nvidia even launches a single DX11 GPU.

Has Nvidia even said when mid- and low-end Fermis will hit the market? How about DX11 IGPs or laptop parts?

ATI is just going to continue gaining market share with the 5x00 series. Let's face it, the 5x00 series is going to be just like the G80 for a long time to come. Due to its jump on the market and its great performance for the price, it is going to get the lion's share of the market and get devs to target it. It's not very hard to understand. That's what happens when you come out with a well-performing part that has low heat output, low power draw and a new API's feature set built in.

A 5870 does not have the power to run max settings at a 3x24" resolution. Furthermore, if you try to run stereoscopic 3D on a 60Hz monitor you will see 30 effective fps, as alternating frames are sent to the right-eye and left-eye "channels". Any ATI user who wants more than 30 effective fps will need 120Hz monitors. This is not an Nvidia limitation, it is a physical one. The ONLY thing Nvidia's solution requires that ATI's doesn't is SLI, which is something you will want anyway if you are running 3x24" screens, i.e. roughly 70% more pixels than 2560x1600.

A 5870 doesn't need to run max settings at 3x24" resolution. It does the job of getting people to start moving towards tri-monitor gaming. I bet you anything the 5870, 5970 and Fermi won't have enough power to drive a single 24" monitor at full res with all DX11 features maxed either. No new card with a new feature ever did or ever will. But people can buy a 5870, get the monitors and enjoy gaming on them. Then in the future they can buy another 5870, or a 6870, or two Fermis, or what have you. But at least they were able to jump in, and jump in at half the cost (maybe even less) of what Nvidia's DX11 tri-screen gaming would set them back.

As for 3D, I don't really know. I don't actually see 3D working well on three screens. I mean, how do you see the left screen with both eyes through the lenses to get the 3D effect? Do I have to turn my head to fully face it? I've never been a big fan of 3D gaming. Nvidia's solution gives me headaches after 15-20 minutes. The 3D monitors look like crap, and even Avatar started to give me a headache towards the end.

Those are my personal issues with the tech. I'm sure others love it, and if they could make it so it didn't hurt my head I'd be more interested. Hopefully future versions will work better.


Uhhh, maybe in that link where it says "BUY NOW"? What does that have to do with whether non-reference boards are available from Nvidia or not?

Have a nice day in your world of ATI fandom and rage though.

Why are we comparing DX11 cards to DX10 cards? That's just silly. I'm sure 4 295s can beat Fermi when it comes out. What's it matter?

The point is that at $380-$400 the 5870 has been the best card out since September, and the only choice one would make when buying a brand new card, unless they owned a GTX 285 already and just went SLI.

The fact that it has Eyefinity and DX11, uses less power at idle than other cards even close to its performance, and uses many times less power under load than those, makes it the de facto choice.

In fact, at this moment in time there is no compelling reason to buy anything but an ATI DX11 card. Hopefully in late March Nvidia starts to change that.
 
Why are we comparing DX11 cards to DX10 cards? That's just silly. I'm sure 4 295s can beat Fermi when it comes out. What's it matter?
Someone tried to say that ATI was the only one with non-reference video cards. That link was proof that that's bullshit. Welcome to the conversation; try reading the thread before jumping in.

As for 3D.
I agree 3D is near worthless; however, we weren't talking about its viability. Read the thread.

<hotlinked image>
Don't hotlink. It's against the rules in this section.
 
Read the reviews, the G92 despite being touted as "old tech" still keeps up with the 5770 just fine. Infact, in several case it was faster.

Read the reviews. The G92, despite being touted as "old tech", gets its ass handed to it by the 5770. In fact, in all cases it was slower than the 5770. The 5770 competes with the GTX 260 - sometimes a bit faster, sometimes a bit slower, but it destroys the G92.

Perhaps you meant 5750? But hey, why bother with "facts" when you can just mindlessly rant, amirite? :rolleyes:
 
Read the reviews. The G92, despite being touted as "old tech", gets its ass handed to it by the 5770. In fact, in all cases it was slower than the 5770. The 5770 competes with the GTX 260 - sometimes a bit faster, sometimes a bit slower, but it destroys the G92.

Perhaps you meant 5750? But hey, why bother with "facts" when you can just mindlessly rant, amirite? :rolleyes:

If it makes them sleep better at night, I say go ahead. It is evident Nvidia fanboys have kept their frustrations bottled up for far too long. Facts and reason get in the way of their venting and ranting. Expectations and hype for this card have gone unchecked due to the endless delays. I'm afraid in the end it will not live up to them.
 
Read the reviews. The G92, despite being touted as "old tech", gets its ass handed to it by the 5770. In fact, in all cases it was slower than the 5770. The 5770 competes with the GTX 260 - sometimes a bit faster, sometimes a bit slower, but it destroys the G92.

Perhaps you meant 5750? But hey, why bother with "facts" when you can just mindlessly rant, amirite? :rolleyes:

I meant the 5750. But why bother writing anything? There is nothing in this forum but fanboys. There is no meaningful discussion going on in here anymore. The cesspool is worse than the soapbox.
 
ATI launched a shot at Nvidia last year by drastically cutting prices. It would be very interesting if Nvidia launched a shot this year. Could you imagine the chaos that would ensue if the GF100 launched at a loss/break-even price of $350 or even $300?

I somehow doubt that will happen, unfortunately. Traditionally, Nvidia has higher prices than ATI even when they don't perform as well. Nvidia starts out badly this year: ATI's 5000 series is already on the market and will have been for 6 months if Fermi launches in March, and Nvidia's current offerings are underperforming and overpriced vs. the competition.

Nvidia has stopped production of Intel-chipset motherboards due to the lawsuit from Intel.

They lost against Rambus, which means they are in danger of paying them a lot of money. Worst case, they get an import ban:

His decision, which is subject to review by the full commission, may result in a ban on imports of Nvidia chips and products that use them, including some computers made by Hewlett-Packard Co
http://www.businessweek.com/news/20...in-trade-fight-against-nvidia-over-chips.html

AMD is on a roll now. They are refreshing their entire line of GPUs in H2. There is speculation that two new architectures are coming:
http://www.xbitlabs.com/news/video/...ck_for_the_Second_Half_of_2010_AMD_s_CEO.html

In addition, they are coming out with new tech like the APU. They got a lot of money from the Intel lawsuit and, better yet, they won the right to go fabless.

Fermi, though it has a lot of promise, has only shown one benchmark so far that was witnessed by an audience, and that's the Far Cry 2 bench. 1.2x a 5870 in a bench they selected themselves (and probably made sure was supported well in the drivers) doesn't give much headroom. The worst-case scenario is that Fermi turns out to be a benchmark queen like ATI's 2900, which beat the competition in benches but was sorely lacking in real games:
http://www.hardocp.com/article/2007/05/13/ati_radeon_hd_2900_xt/17

Even if Fermi turns out to be great (which I hope), it still needs to perform well against ATI's new cards in H2. Nvidia might be forced to charge less for their cards than they wanted or needed, but I doubt they would go lower than what is absolutely necessary to stay competitive.
 
ATI is just going to continue gaining market share with the 5x00 series. Let's face it, the 5x00 series is going to be just like the G80 for a long time to come.

It can't be compared to the G80 until it dominates the market for the ridiculous amount of time that the G80 reigned. The G80 reigned for so long because the 2900 was late, and when it was introduced it was a benchmark queen. We won't know the answer until after Fermi is benchmarked.

I didn't quote your entire post because, frankly, it's written from a limited point of view, as if you've only been gaming for a few years. ATI bounced back, which everyone expected. No matter how Fermi performs, nVidia will bounce back.
 
But we didn't hear it, or even think it, when the 3dfx Voodoo5 was late and slow, and they had a Voodoo5 6000 that was big, fast, expensive, and used enough power to require an external power brick :)
Actually, very similar discussions occurred back then.
Fermi has been praised to the hilt and blasted to the pit. It will be very interesting to see how it turns out.
 
So basically, from what I read, Fermi will be much more expensive for Nvidia to manufacture than Cypress is for ATI. At $208 per Fermi chip, it is at least twice as expensive as a Cypress chip. So profit margins will always be better for ATI, and they can afford to undercut Nvidia's prices, whereas Nvidia pretty much has to price their cards high to remain profitable.

According to our sources, nVidia achieved a 25% yield with A2 silicon, coming up with around 24-26 functional dies per wafer. This puts the cost of a single chip into the $208 range, i.e. you can get 4.2 billion transistors [2x Cypress, i.e. HD 5970] for the price of a single three billion die and buy the cooling and the PCB. In case of nVidia, you still need to build the whole enchilada around the $208 piece of silicon.
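
As a sanity check on that quote: the ~$208 figure back-solves to a wafer price of roughly $5,200, which the article never states outright, so treat the wafer cost below as an assumption rather than a published number. A minimal sketch:

Code:
# Per-die cost = wafer cost / functional dies per wafer.
wafer_cost = 5200.0           # assumed 40nm wafer price, back-solved from the quote
good_dies = 25                # midpoint of the 24-26 functional dies quoted
print(wafer_cost / good_dies) # 208.0 -> the ~$208 per-die figure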

Additionally, Nvidia CANNOT afford to give Fermi an Eyefinity-equivalent feature, otherwise they would be killing their highly profitable Quadro line, whose only unique feature is professional multi-monitor ability. In fact, the article speculates they will be using Quadro and Tegra sales to subsidize the expense of Fermi.

So in the end you get a card that should be faster, yes, but won't be the best bang for the buck, and won't have an answer for Eyefinity. I suppose they still have that PhysX thing going for them. :)

Fermi should beat the 5870, yes, but will it beat 2x 5850? And if you can buy a pair of 5850s for the price of one Fermi and have them run faster AND give you Eyefinity, why would you buy Fermi? Just pondering out loud. Most of this is speculation right now, but the article makes it clear that Fermi is at least twice as expensive to produce per GPU core as Cypress.
 
No matter how Fermi performs, nVidia will bounce back.

Without any doubt. :)

Something good might come out of this even if Fermi performs badly, or performs well and gets knocked down by some ATI refreshes. Stronger competition against Nvidia has been good for consumers in the past: lower prices and incentive to develop better hardware. As you remember, ATI's 4000 series brought prices down on Nvidia cards very fast. Not so good for Nvidia, but good for us consumers. :cool:

I hope that Fermi delivers though, because it's always good to have choices.
 
Without any doubt. :)

Something good might come out of this even if Fermi performs badly, or performs well and gets knocked down by some ATI refreshes. Stronger competition against Nvidia has been good for consumers in the past: lower prices and incentive to develop better hardware. As you remember, ATI's 4000 series brought prices down on Nvidia cards very fast. Not so good for Nvidia, but good for us consumers. :cool:

I hope that Fermi delivers though, because it's always good to have choices.

Exactly :D.
 
Additionally, Nvidia CANNOT afford to give Fermi an Eyefinity-equivalent feature, otherwise they would be killing their highly profitable Quadro line, whose only unique feature is professional multi-monitor ability. In fact, the article speculates they will be using Quadro and Tegra sales to subsidize the expense of Fermi.

What are you talking about? Nvidia has an answer for Eyefinity, although it will be expensive since you have to use SLI.
 
If any user is going to buy INTO the whole triple-display and 3D gaming package, they will have 3 choices.
1. Eyefinity: works out of the box with a single card, no special monitors required, no special proprietary 3D glasses, not limited to a particular monitor size.

2. Current Nvidia owners: buy another card for SLI + expensive high-refresh monitors + expensive proprietary 3D glasses. Caveats are sticking with old technology, missing out on the maximum eye candy in the fresh batch of DirectX 11 games coming this year, and being stuck with 24" monitors.

3. Customers waiting for Fermi: buy 2x Fermi, buy a power supply rated to handle them, buy 3x 120Hz monitors limited to 24", and buy expensive proprietary 3D glasses.
1. You might need a DisplayPort-to-DVI adapter for Eyefinity (an extra $100), while Nvidia's works with three DVI displays. You can put that $100 you saved towards your second card. If you want 3D on Eyefinity, you'll still need 120Hz monitors for it to work right, same as Nvidia. Neither solution is limited to a certain monitor size, though there aren't as many 120Hz displays available as 60Hz displays.

2. Current SLI owners already have a second card + Eyefinity will need 120Hz monitors to do 3D as well + Eyefinity will need 3D glasses to do 3D as well. Both solutions are stuck with 22" or 24" monitors until monitor makers release bigger 120Hz screens.

3. So buy two cheaper Fermi parts that perform roughly the same as one of the uber-high-end ones to make up the price difference. These two slower cards won't have power requirements as drastic as two of the uber-high-end ones either, so a PSU won't be as big a worry. Eyefinity will need 120Hz monitors and 3D glasses as well...

Disclaimer: there are rumors that 3D on Eyefinity might be allowed to work on 60Hz monitors. That means each eye only gets an effective 30Hz, which is absolutely HORRIBLE. There's a very good reason Nvidia set the bar at 120Hz: 60Hz is a flickery mess.

For a user who does not care about 3D gaming and is only looking to experience triple-display gaming, Eyefinity is an even better choice.
1. Eyefinity: works out of the box and not limited to 24" displays. Works with a single card. Can upgrade performance down the road by crossfiring a second card, but then comes the implied cost of possibly needing a new PSU to handle them.

2. Current Nvidia user: buy a second card for SLI and possibly a new power supply to handle them. They'll still have to live with the same caveats of old tech, not being able to experience DirectX 11, etc. It is not known if their version of Eyefinity is limited to a certain display size, so I'll assume it isn't. Current SLI owners are in the best position in this regard; all they lose is DirectX 11, but they would still have to buy 2 more monitors. That may be the only saving grace for Nvidia's Eyefinity, since owners of AMD's last-generation tech are forced to upgrade to the new generation to experience it.

3. Users waiting for Fermi: buy 2x Fermi, the power supply required to run them, and 2 more monitors. Again, it is not known if their version of Eyefinity is limited to a certain display size.
1. Nvidia's solution works out of the box as well, and since you aren't using 3D glasses, you can use any normal 60Hz monitors, so there's no size limitation.

2. You're forgetting people like me, who already have 3 monitors and one Nvidia card, and are running triplehead with either a TripleHead2Go or SoftTH. All I need right now is a second GTX260 (around $185 new) and I'm ready to go. If I were to go with an HD5870, it would cost me $400 for the card and another $100 for an adapter so I could use my existing DVI monitors. You do the math: $185 is a hell of a lot cheaper than $500.

3. Once again, you can buy two of the lower-end Fermis and be far more likely to get by with your current PSU. This also puts costs more in line with higher-end single cards.

In all scenarios AMD's solution is more accessible due to lower price, fewer restrictions and more flexibility.
Excuse me? It would cost me almost 3 times more money to switch over to Eyefinity. There are a lot of Nvidia users with TripleHead2Go's that are in the same boat.

The money one saves by not buying a second card for SLI can be used instead to buy bigger displays. Also, the users opting for option 3 are much smaller in number, since the cost of that setup is far greater than any of the other choices.
The money I saved by not buying a DisplayPort-to-DVI adapter alone, not even counting the cost of an HD5870, pays for more than half of my second GTX260. Also, you don't have any clue on pricing at the moment; it's possible that two low-end Fermis that perform the same as an HD5870 in SLI will cost roughly the same as an HD5870 (perhaps slightly more, because there's no longer a need for any sort of active adapter).

You can spin this any way you want, really. We simply won't know until the cards are officially released and priced.
 
1. You might need a DisplayPort-to-DVI adapter for Eyefinity (an extra $100)...

What? I just bought one for $7.99. The most expensive one I could even find is $29, and that's a 10ft cable with DisplayPort on one end and DVI-D on the other. Don't lie.
 
What? I just bought one for $7.99. The most expensive one I could even find is $29, and that's a 10ft cable with DisplayPort on one end and DVI-D on the other. Don't lie.

Hmm, if you're considering going Eyefinity, I'd make sure that $8 adapter you bought is an active one and not passive. At that price, I'm pretty sure it's the latter, though... unfortunately for you.
 
What? I just bought one for $7.99. The most expensive one I could even find is $29, and that's a 10ft cable with DisplayPort on one end and DVI-D on the other. Don't lie.

You need an active adapter to use 3 monitors with Eyefinity if you don't have a DisplayPort monitor.

This is because DVI and HDMI monitors receive information via TMDS signals (Transition Minimized Differential Signaling). There are only 2 TMDS signal generators inside all the new ATI 5000 series chips. Thus you can have 2 monitors that use TMDS signals active at once: 2 DVI, 1 HDMI + 1 DVI, or 2 HDMI (if someone made a card with 2 HDMI outputs instead of following the reference design).

DisplayPort doesn't use TMDS signals; instead it sends packet-based information. Because of this, Eyefinity works with 3 or more displays as long as one of the monitors is DisplayPort, since that output isn't limited by there being only 2 TMDS signal generators.

Now, if you are doing Eyefinity with only 2 screens, you can use a cheap passive adapter to convert the DisplayPort connection into an HDMI connection or a second DVI connection. This works because with 2 screens you are still only using 2 TMDS signal generators: the adapter tells the card to send a DVI or HDMI signal through the DisplayPort connector instead of the traditional DisplayPort signal, and the adapter simply reroutes the pins to DVI or HDMI.

But if you are using 3 screens, you need an active adapter, which converts the packet-based DisplayPort information into a TMDS signal that is then sent to the 3rd monitor through the DVI cable.
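
If it helps, here is a toy Python model of the constraint just described. The connection names are illustrative, not any real API; it just encodes the rule that there are two TMDS clocks and that native DisplayPort and active adapters don't consume one:

Code:
def tmds_clocks_used(outputs):
    # DVI and HDMI sinks each tie up one of the GPU's two TMDS clock
    # generators, and so does a passive DP adapter, since it just
    # re-routes a TMDS signal through the DisplayPort connector.
    return sum(1 for o in outputs if o in ("dvi", "hdmi", "passive_dp_adapter"))

def three_screen_eyefinity_ok(outputs, tmds_clocks=2):
    # Native DisplayPort (packet-based) and active adapters (which
    # generate their own TMDS signal) don't consume a GPU TMDS clock.
    return len(outputs) >= 3 and tmds_clocks_used(outputs) <= tmds_clocks

print(three_screen_eyefinity_ok(["dvi", "dvi", "passive_dp_adapter"]))  # False
print(three_screen_eyefinity_ok(["dvi", "dvi", "displayport"]))         # True
print(three_screen_eyefinity_ok(["dvi", "dvi", "active_dp_adapter"]))   # True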

Currently all active DisplayPort adapters are about $100. ATI says they will work with partners to reduce the cost to about $50, but until that actually happens it is just speculation.
 
And ATI needs to be clearer on their website about the active vs. passive adapter distinction.

Here is a PDF showing what you need to do to get Eyefinity working with 3 monitors:
http://www.amd.com/us/Documents/Display_Connectivity.pdf

Note it says active adapters again and again.

But the official list of Eyefinity-approved adapters includes both passive and active adapters, without mentioning that you can only get 2 screens, not 3, without buying an active adapter:
http://support.amd.com/us/eyefinity/Pages/eyefinity-dongles.aspx
 
But the official list of Eyefinity-approved adapters includes both passive and active adapters, without mentioning that you can only get 2 screens, not 3, without buying an active adapter:
http://support.amd.com/us/eyefinity/Pages/eyefinity-dongles.aspx

The "passive" adapters listed there (at least the accell ones) are actually active adapters. They are unpowered, unlike the ones ATi lists as "active", because they only do single-link connections. Accell lists them as "active technology" so apparently they are not just simple pass-throughs like the monoprice ones that eyefinity hopefuls picked up when the series was first released.
 
1. You might need a DisplayPort-to-DVI adapter for Eyefinity (an extra $100), while Nvidia's works with three DVI displays. You can put that $100 you saved towards your second card. If you want 3D on Eyefinity, you'll still need 120Hz monitors for it to work right, same as Nvidia. Neither solution is limited to a certain monitor size, though there aren't as many 120Hz displays available as 60Hz displays.
Or, since the majority of people do not have 3 identical monitors, they can simply buy one with DisplayPort. Dell has nice 23" ones with DisplayPort, among others.


2. Current SLI owners already have a second card + Eyefinity will need 120Hz monitors to do 3D as well + Eyefinity will need 3D glasses to do 3D as well. Both solutions are stuck with 22" or 24" monitors until monitor makers release bigger 120Hz screens.
As other Nvidia fans note, a 5870 isn't going to be ideal for 3 monitors, but two older Nvidia cards suddenly will be? Regardless, if you want DX11, those old Nvidias do not work.

What ATI needs for 3D is not known yet; we only know that the Bit Cauldron glasses work with 60Hz monitors. Though I'm still not buying three-monitor 3D.


3. So buy two cheaper Fermi parts that perform roughly the same as one of the uber-high-end ones to make up the price difference. These two slower cards won't have power requirements as drastic as two of the uber-high-end ones either, so a PSU won't be as big a worry. Eyefinity will need 120Hz monitors and 3D glasses as well...
Only if you want 3D. And we still don't know that they need 120Hz monitors. Also, the Bit Cauldron glasses have a solution for using them on 3D-ready TVs. Nvidia's glasses do not.

You can buy two cheaper Fermi parts, but you can also buy a single cheaper ATI part.


Disclaimer: there are rumors that 3D on Eyefinity might be allowed to work on 60Hz monitors. That means each eye only gets an effective 30Hz, which is absolutely HORRIBLE. There's a very good reason Nvidia set the bar at 120Hz: 60Hz is a flickery mess.
Because Nvidia's 120Hz isn't a flickery mess too? Both versions will be gimmicky; not only that, the 3D panels are gimmicky as well.

1. Nvidia's solution works out of the box as well, and since you aren't using 3D glasses, you can use any normal 60Hz monitors, so there's no size limitation.
Nvidia requires twice the cards, which means twice the money, and no Nvidia card supports it as of yet. Not only that, but you will also need a more expensive SLI mobo if you don't have one, and a better power supply.

2. You're forgetting people like me, who already have 3 monitors and one Nvidia card, and are running triplehead with either a TripleHead2Go or SoftTH. All I need right now is a second GTX260 (around $185 new) and I'm ready to go. If I were to go with an HD5870, it would cost me $400 for the card and another $100 for an adapter so I could use my existing DVI monitors. You do the math: $185 is a hell of a lot cheaper than $500.
Hope they are 1GB GTX 260s. But regardless, you're going to drop $200 on another GTX 260 that is not a DX11 card? What you can do is sell your GTX 260, get a 5850 and adapter, and have a DX11 machine that can do tri-monitor gaming right now, while saving power http://images.anandtech.com/graphs/5850_092909204359/20213.png and producing less heat. And in the future you can buy a faster second card. Because when you want DX11 you're going to have to dump two useless cards, which will be worth even less at that point, and if you stick with Nvidia you will have to replace them with two more expensive cards. The Nvidia way is more expensive even with the adapter in the equation.


3. Once again, you can buy two of the lower-end Fermis and be far more likely to get by with your current PSU. This also puts costs more in line with higher-end single cards.

When do these lower-end Fermis come out? How do you know how competitive power-wise they are? You do know that a single 5850 uses around the same power as your GTX 260, and the 5870 uses less power than the GTX 275 or 285. Your two SLI'd Fermis are going to use more power any way you cut it.

Excuse me? It would cost me almost 3 times more money to switch over to Eyefinity. There are a lot of Nvidia users with TripleHead2Go's that are in the same boat.
If you want DX11, sticking with Nvidia and your plan above will cost you a lot more: $200 for a GTX 260 now, plus the added cost in power, and then in the future 2 more Fermi boards. Versus $280 for a 5850 and $100 for a powered adapter. Not only that, but you're maxing out your rig's power with the GTX 260 and will be locked in with that $200 purchase. No DX11, and more importantly, newer games will run worse on it in the long run.




The money I saved by not buying a DisplayPort-to-DVI adapter alone, not even counting the cost of an HD5870, pays for more than half of my second GTX260. Also, you don't have any clue on pricing at the moment; it's possible that two low-end Fermis that perform the same as an HD5870 in SLI will cost roughly the same as an HD5870 (perhaps slightly more, because there's no longer a need for any sort of active adapter).
Low-end Fermis won't come until late June, if not later. Also, two low-end Fermis in SLI may perform better than a 5870. However, there are times when SLI scales for shit, and you're forgetting that you've already invested $200 in a DX10 video card that will net you much smaller returns when you try to sell both of yours in the future, once low-end Fermis are out. Instead, the person with that single 5870 can simply buy another 5870 and have even greater performance than your low-end Fermis in SLI.

I think I'm starting to make the point. I've repeated it a few times because you repeated yours. You can sell your GTX now, get $100-$125, pay off the adapter, get a 5850, and enjoy great gaming with DisplayPort. That is today, right now. Or you can wait for a driver at some point to give you tri-gaming on your GTX 260s, which may lack the horsepower or RAM to play at those settings. Or you can wait until late June or even later to upgrade to two cheap Fermis, get very little for your two GTX 260s, and maybe, as you said, get 5870 performance out of two cheaper cards 3-4 months from now.


You can spin this any way you want, really. We simply won't know until the cards are officially released and priced.
What we do know is that Nvidia is launching the GTS 360 and GTX 380 in late March. Availability of the GTS 360 is supposed to get to good levels in April. The GTX is going to be very limited until late May or early June. Mid- to low-end Fermis are expected in late June at the earliest.

We don't know what the prices are, and that's many of our points. You can be playing on a 58x0 and enjoying multi-monitor gaming right now. Yes, it may cost more initially than you want. But with your plan, over the long run you will end up paying more, because in both cases you are going with low-end performance and already cutting off growth by starting at SLI for your features. That's the problem. A single-58x0 user can later buy another one for more performance. An SLI'd low-end Fermi user is shit out of luck: you will have to go through the process of selling two cards and buying two new higher-end cards for better performance.

Do you see how that works? At some point an ATI user may need something faster than dual 5870s or what have you and will have to buy new cards. But that will come much later than in the examples you've given of the performance of your cards, both current and hypothetical, compared to what we already know of the 58x0 series.
 
Because Nvidia's 120Hz isn't a flickery mess too? Both versions will be gimmicky; not only that, the 3D panels are gimmicky as well.

3D gaming at 30Hz per eye sucks. Doubling the refresh rate to 60Hz per eye is exactly what it needs.


Next you state:
Hope they are 1GB GTX 260s. But regardless, you're going to drop $200 on another GTX 260 that is not a DX11 card? What you can do is sell your GTX 260, get a 5850 and adapter, and have a DX11 machine that can do tri-monitor gaming right now, while saving power http://images.anandtech.com/graphs/5...4359/20213.png and producing less heat. And in the future you can buy a faster second card. Because when you want DX11 you're going to have to dump two useless cards, which will be worth even less at that point, and if you stick with Nvidia you will have to replace them with two more expensive cards. The Nvidia way is more expensive even with the adapter in the equation.

Followed by:
Low-end Fermis won't come until late June, if not later. Also, two low-end Fermis in SLI may perform better than a 5870. However, there are times when SLI scales for shit, and you're forgetting that you've already invested $200 in a DX10 video card that will net you much smaller returns when you try to sell both of yours in the future, once low-end Fermis are out. Instead, the person with that single 5870 can simply buy another 5870 and have even greater performance than your low-end Fermis in SLI.

Somehow you manage to recommend Crossfire when it suits you, then trash SLI for sometimes scaling like shit, when both products sometimes scale poorly. You make some good points, but stop veering into fanboy land.
 
3D gaming at 30Hz per eye sucks. Doubling the refresh rate to 60Hz per eye is exactly what it needs.
Nvidia's current 3D sucks. I've used it multiple times on a 120Hz monitor. For many it still causes headaches and still flickers badly.


Somehow you manage to recommend Crossfire when it suits you, then trash SLI for sometimes scaling like shit, when both products sometimes scale poorly. You make some good points, but stop veering into fanboy land.

Both products can scale like shit. I've only mentioned Crossfire because the person I'm responding to has no problem using SLI. I personally do not want to use either version; to me it's a waste of money. But some might like it.

Both companies are capable of using 2 or more cards together, so it's not an advantage either one has. So when someone tries to turn it into an advantage, it has to be pointed out that it's not.

With ATI you can do 3 monitors with 1 card and spend less up front. Yes, if you have an older Nvidia product, it will also let you use 3 monitors with a second card. However, you are then paying for less performance and no DX11: a 5850 is faster than SLI'd 260s, and the 5870 is faster than the 295, so two SLI'd 260s don't matter in that performance comparison. All you've done is waste money on a second card that could have gone towards the display adapter, when you could have sold the original 260 and put it towards a 5850. In the comparison, the poster listed two low-end Fermis being as fast as a 5870, so all you'd have to do is buy another 5850, which by late June, when low-end Fermis start appearing, will be less than $280, and you will have more than SLI'd low-end Fermi performance in his example. Not to mention that when ATI puts out another generation of cards based on Northern Islands, you can sell both 5850s, get a single Northern Islands card, and continue with 3-monitor support. With Nvidia, they will continue with Fermi and you will continue to need two cards to upgrade.

Going down the ATI path is cheaper in the long run, until Nvidia designs a card that can support 3 monitors on its own.
 
As other Nvidia fans note, a 5870 isn't going to be ideal for 3 monitors, but two older Nvidia cards suddenly will be? Regardless, if you want DX11, those old Nvidias do not work.
Who said the 5870 wasn't ideal for 3 monitors? Looks like it's more than fast enough for most games out there.

I'm currently gaming across three monitors (5040x1050) with a single GTX260, and it's handling things alright. Considering GTX260 SLI benchmarks between an HD5850 and an HD5870 (depending on the game), it should work out fairly well.

Because Nvidia's 120Hz isn't a flickery mess too? Both versions will be gimmicky; not only that, the 3D panels are gimmicky as well.
120Hz (60Hz per eye) is far better than 60Hz (30Hz per eye). Regardless of whether ATI's solution will work at 60Hz or not, you're going to want 120Hz to reduce the flicker and decrease the latency added by low refresh/frame rates.


Nvidia requires twice the cards, which means twice the money, and no Nvidia card supports it as of yet. Not only that, but you will also need a more expensive SLI mobo if you don't have one, and a better power supply.
Now you sound like a fanboy. Since when are ALL dual-card solutions more expensive than ALL single-card solutions?

Also, you don't necessarily need an SLI motherboard to run SLI. You're forgetting dual-GPU cards like the GTX295, which have 3 fully functional DVI ports on the back. A single GTX295 will most likely be able to handle this new SLI mode and span games across 3 monitors.

Oh look at that, Nvidia has a single-card solution as well, and it doesn't require an SLI motherboard to work either! Maybe that's why they're planning a dual-GPU Fermi ;)

Hope they are 1GB GTX 260s. But regardless, you're going to drop $200 on another GTX 260 that is not a DX11 card?
The GTX260 has 896MB of video RAM, so yes, very close to 1GB. Also, I don't necessarily need a second GTX260; a single one seems to be handling triple-monitor gaming just fine so far (using SoftTH). Two in SLI would be faster, and it would be an official Nvidia solution.

What you can do is sell your GTX 260, get a 5850 and adapter, and have a DX11 machine that can do tri-monitor gaming right now, while saving power and producing less heat. And in the future you can buy a faster second card.
Why would I sell my current GTX260? If I were to get a 5850 or 5870, I would keep the GTX260 around as a PhysX card.

As for being able to buy a second card and go crossfire, no-can-do. I have an SLI motherboard, so I would have to blow about $180 to get an AMD chipset motherboard with the same feature set. That cost to switch to ATi just keeps piling up.

Because when you want DX11 you're going to have to dump two useless cards, which will be worth even less at that point, and if you stick with Nvidia you will have to replace them with two more expensive cards. The Nvidia way is more expensive even with the adapter in the equation.
Careful, you're starting to sound like a fanboy again. Two GTX260's in SLI are hardly "worthless", and in fact, that kind of setup can keep pace with the HD5850 and HD5870 in DX9 / DX10 titles.

You do make a good point about cost effectiveness, though. The HD5850 and HD5870 are way overpriced, and there's the premium for a DVI adapter to worry about. You still need that damn DVI adapter even if you crossfire two HD5770's, which have enough DVI ports between them.

Fermi cards aren't out yet, and are going to cost quite a bit as well (unless they can launch a lower-end part specifically designed to be SLIed for this purpose, perhaps a dual GPU card). Fermi pricing will keep ATi's 58XX series prices high as well, which means everything will remain overpriced.

The only thing that looks anywhere near reasonable, if you already have one Nvidia card and want to move to triple-monitor right now, is dropping in a second one and waiting for about 10 months for a refresh so that prices will come down. Sure, that leaves you with two DX10 cards to sell off, but they're still very fast DX10 cards, you should still be able to sell them for a decent price (considering how much use you've gotten out of them).

How do you know how competitive power-wise they are?
[...]
Your two SLI'd Fermis are going to use more power any way you cut it.
Wait, you just questioned how anyone could know their power requirements, then outright stated that two SLI'd Fermis will use more power no matter what? Pick an argument, man...

If you want DX11, sticking with Nvidia and your plan above will cost you a lot more: $200 for a GTX 260 now, plus the added cost in power, and then in the future 2 more Fermi boards. Versus $280 for a 5850 and $100 for a powered adapter.
Obvious fanboy is obvious. You rounded the price of the GTX260 up and you rounded the price of the HD5850 down to make it look better. Give me a break.

According to current Newegg pricing, the GTX260 is $185 and the HD5850 is $290 (before tax and shipping).

Getting an HD5850 would cost me $290, plus another $100 for an active adapter, and I'll be keeping my current GTX260 as a PhysX card instead of selling it, so that won't recoup any costs for me. I will not be able to add a second HD5850 later without getting a new motherboard, and getting one that matches the features of my current board would be another $180 on top of the price of a second HD5850. In the end, that's $860 spent for me to (eventually) upgrade to HD5850 Crossfire.

Alternatively, I could just drop in a second GTX260 for $185, enjoy that setup for most of the year, and wait for some midrange Fermi cards to come out so I can SLI them on the cheap. No need for a new motherboard either.

By the way, it sounds like you're comparing two high-end Fermis (GTX380s) in SLI to an HD5850 and complaining that the Fermis will cost more... no shit they'll cost more, Fermi SLI will be a faster setup.

I'll be in the market for two of their midrange ($225 a pop) Fermi cards, whenever they come out. Considering two current $225 Nvidia cards in SLI can keep up with the HD58XX series, the new crop of midrange cards should be even faster. When such cards appear, I'll sell off my two GTX260's for whatever I can get and upgrade to two $225 Fermi's.

Let's say I sell each GTX260 for $80; that's $225x2 for the new midrange Fermi cards, minus $160. Total cost for the upgrade? $290. And it will most definitely be a faster setup than a single $290 ($390 with DVI adapter) HD5850.
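
For what it's worth, the cost math in the two scenarios above does add up as stated, using the poster's own (2010 Newegg-era, unverified) prices:

Code:
# ATI path: HD5850 + active adapter + new mobo + second HD5850.
ati_path = 290 + 100 + 180 + 290
print(ati_path)               # 860

# Nvidia path: two assumed $225 midrange Fermis, minus selling
# both GTX260s at $80 each.
fermi_path = 2 * 225 - 2 * 80
print(fermi_path)             # 290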
 
Who said the 5870 wasn't ideal for 3 monitors? Looks like it's more than fast enough for most games out there.

I'm currently gaming across three monitors (5040x1050) with a single GTX260, and it's handling things alright. Considering GTX260 SLI benchmarks between an HD5850 and an HD5870 (depending on the game), it should work out fairly well.
The person I was replying to.

The GTX 260 will fall behind; it only has 896MB of RAM, and it uses it less efficiently than ATI cards do. As you increase resolution, the extra 128MB will make a difference.


120Hz (60Hz per eye) is far better than 60Hz (30Hz per eye). Regardless of whether ATI's solution will work at 60Hz or not, you're going to want 120Hz to reduce the flicker and decrease the latency added by low refresh/frame rates.

Nvidia's solution still is not ideal, and it's only going to work on your GeForce cards. ATI, on the other hand, went with an open standard that is also supported on 3D TVs, so you can buy one pair of glasses and use it across the different displays you own.

Now you sound like a fanboy. Since when are ALL dual-card solutions more expensive than ALL single-card solutions?
I never claimed that. Read who I was replying to.

The person specifically mentioned his GTX 260 and that he could buy another one to get 3 screens. To which I replied: spending another $200 will only net you DX10; to get DX11 you will have wasted that $200 and will be forced to buy two more DX11 Nvidia cards. Instead, he could sell his GTX 260, put the proceeds towards a display adapter for one of his 3 monitors, and put the second $200 he'd spend, plus $80 more, towards a single 5850.

The 5850 would use less power, offer similar or greater performance (greater when SLI doesn't scale right), and he could still add a second card at a later date for even greater performance, an option not available with the GTX 260s since they would already be in SLI. And it would cost even less than his original option, because he'd already be on DX11 with a much smaller investment.

It's very simple; if you follow the posts, you'd understand.

Also, you don't necessarily need an SLI motherboard to run SLI. You're forgetting dual-GPU cards like the GTX295, which have 3 fully functional DVI ports on the back. A single GTX295 will most likely be able to handle this new SLI mode and span games across 3 monitors.

Which has nothing to do with who I'm responding to, and currently a single GTX 295 is more expensive and uses much more power than a single 5870 while still managing to be slower.

Not only that but you'd still be limited to dx 10. You can sell that gtx 295 and put that towards a single 5870 that isn't hindered by poor scaling multi card performance and is faster in 99% of the games and has dx 11 and uses less power. You can then if you want buy a second 5870 down the line. To get to dx 11 on nvidia you'd have to sell your dual card gpu and buy a dx 11 card but depending on when nvidia gets another dual gpu card out you'd have to go sli to get 3 monitors. Kinda sucks huh.




Oh look at that, Nvidia has a single-card solution as well, and it doesn't require an SLI motherboard to work either! Maybe that's why they're planning a dual-GPU Fermi ;)
Sweet, where is it officially announced and how much is it? Oh wait, it's not announced and there is no price. Yay!

The GTX260 has 896MB of video RAM, so yes, very close to 1GB. Also, I don't necessarily need a second GTX260; a single one seems to be handling triple-monitor gaming just fine so far (using SoftTH). Two in SLI would be faster, and it would be an official Nvidia solution.
That's great for you. Glad you like it; it's not what I was responding to.


Why would I sell my current GTX260? If I were to get a 5850 or 5870, I would keep the GTX260 around as a PhysX card.
Now now, we both know Nvidia will do everything they can to get it to stop working for you.

As for being able to buy a second card and go crossfire, no-can-do. I have an SLI motherboard, so I would have to blow about $180 to get an AMD chipset motherboard with the same feature set. That cost to switch to ATi just keeps piling up.

No you wouldn't. But good try.


Careful, you're starting to sound like a fanboy again. Two GTX260s in SLI are hardly "worthless", and in fact, that kind of setup can keep pace with the HD5850 and HD5870 in DX9 / DX10 titles.

Of course it's worth less. As you said: DX9 and DX10, not DX10.1 or DX11, and no tri-monitor gaming either. And they will use more power! In fact, buying a second DX10 GeForce card at this point is a dead end and not a wise solution in the least. At that point you really are a fanboy who will blindly buy anything.


You do make a good point about cost-effectiveness, though. The HD5850 and HD5870 are way overpriced, and there's the premium for a DVI adapter to worry about. You still need that damn DVI adapter even if you crossfire two HD5770s, which have enough DVI ports between them.
Way overpriced? You said above the 5870 is faster than SLI GTX 260s, which would set you back close to $380. That's the retail price of a 5870. Poor you on that.

Good try though.

Fermi cards aren't out yet, and are going to cost quite a bit as well (unless they can launch a lower-end part specifically designed to be SLIed for this purpose, perhaps a dual GPU card). Fermi pricing will keep ATi's 58XX series prices high as well, which means everything will remain overpriced.
The 58xx cards, compared to both ATi's and Nvidia's line-ups, are not overpriced. They offer the best performance in their price brackets.

The only thing that looks anywhere near reasonable, if you already have one Nvidia card and want to move to triple-monitor right now, is dropping in a second one and waiting about 10 months for a refresh so that prices come down. Sure, that leaves you with two DX10 cards to sell off, but they're still very fast DX10 cards; you should still be able to sell them for a decent price (considering how much use you've gotten out of them).
Yet they are only dx 10 cards and there are faster dx 11 cards out.

The reasonable thing is to sell the fast DX10 card now, while you can still get a decent return on your investment, and put that money towards a fast DX11 ATi card. That way you can play all the DX11 games as they come out and enjoy tri-monitor gaming for months before Fermi even comes out. Both of which are priceless experiences.


Wait, you just questioned how anyone could know their power requirements, then outright stated that two SLIed Fermis will use more power no matter what? Pick an argument, man...

We already know the size of Fermi, and that it's hotter than ATi's cards. Even scaled down, it will still require similar power draw to ATi's cards, but now you need two of them for tri-monitor gaming. Very simple logical conclusion.


Obvious fanboy is obvious. You rounded the price of the GTX260 up and you rounded the price of the HD5850 down to make it look better. Give me a break.

I just searched for a GTX 260 on Newegg and grabbed the first price I saw, which was $200. Now you can deny that it's $200, but I will just post the link, obvious fanboy.

According to current Newegg pricing, the GTX260 is $185 and the HD5850 is $290 (before tax and shipping).
http://www.newegg.com/Product/Produ...gtx_260_core_216_896mb-_-14-130-398-_-Product

There is actually one that is $230, if you'd like me to link you to it.


Getting an HD5850 would cost me $290, plus another $100 for an active adapter, and I'll be keeping my current GTX260 as a PhysX card instead of selling it, so that won't recoup any costs for me. I will not be able to add a second HD5850 later without getting a new motherboard, and getting one that matches the features of my current board would be another $180 in addition to the price of a second HD5850. In the end, that's $860 spent for me to upgrade to HD5850 crossfire.

That is your choice, to keep the GTX 260. I listed a scenario where it's cheaper in the long run for DX11 tri-monitor gaming, and you have yet to show me differently. You could try, but it's very simple. I did this using the original poster's argument. If you want to go back to that post, copy what he stated, and try to figure out how my argument is flawed, then do so; but until then, stop adding in different things that suit your needs.

His argument was that he had 3 monitors, none of which had DisplayPorts; he has a GTX 260, so buying a second was the fastest way to tri-monitor gaming. I countered that the additional $200 is not worth it, and that while the GTX 260 still has value he should sell it. Take that $100 from the sale and buy the DisplayPort adapter. Take the $200 he would have spent on a second GTX 260 and add $80. He would get DX11 features, tri-monitor gaming, and performance similar to his two GTX 260s, while not having to deal with SLI or greater power draw. Total cost in the scenario is $80.

For him to buy a second GTX 260 and later get to DX11 gaming would cost much more. Because you're spending $200 now, you're not going to get a $200 return on your investment when low-end Fermis come out, like the poster wants to do. You'll be lucky to get $100 at that point, or less. And you'd still need to buy 2 Fermis, which will cost more than $80.



Alternatively, I could just drop in a second GTX260 for $185, enjoy that setup for most of the year, and wait for some midrange Fermi cards to come out so I can SLI them on the cheap. No need for a new motherboard either.
And miss out on DX11. Sucks, huh? You could have just spent $80, sold the GTX to fund a DisplayPort adapter, and enjoyed great performance, DX11 games, and tri-monitor gaming months before Fermi even launches!!! Yay!

By the way, it sounds like you're comparing two high-end Fermis (GTX380s) in SLI to an HD5850, and complaining that the Fermis will cost more... no shit they'll cost more, Fermi SLI will be a faster setup.
Nope, never claimed that at all. The poster stated low-end Fermis, meaning a cost range of $200.

I'll be in the market for two of their midrange ($200 a pop) Fermi cards, whenever they come out. Considering two current $200 Nvidia cards in SLI can keep up with the HD58XX series, the new crop of midrange cards should be even faster. When such cards appear, I'll sell off my two GTX260s for whatever I can get and upgrade to two $200 Fermis.
So you'd spend $600 to get DX11 months down the line, vs. spending $80 now. Good to know. You could buy a second 5850 in a few months when prices drop, and spend almost half that to have DX11 now plus a second card for greater performance in a few months.

Let's say I sell each GTX260 for $80; that's $200x2 for the new midrange Fermi cards minus $160. Total cost for the upgrade? $240. And it will most definitely be a faster setup than a single $290 ($390 with DVI adapter) HD5850.

If you wait even longer, till Northern Islands comes out at the end of the year, you can buy two $200 Northern Islands cards which will be faster than Fermis. Oh wow, look at that: waiting even longer gets you better performance for less.

You know what? Wait till 2015 and I'm sure you can get a really, really fast card for $100 that blows away 8 5850s. Wow, that's great.

Of course, you could sell your GTX 260 to pay for a DisplayPort connector, then take that $200 you'd spend on a second Fermi plus another $80 and buy a 5850. For $80 more than you'd spend on that SLI setup, you get to enjoy months of DX11 tri-monitor gaming before the low-end Fermis even hit the market.

Wow, so awesome.
 
The person I was replying to.
The person you were replying to was me...

The GTX 260 will fall behind; it only has 896MB of RAM, and it uses it less efficiently than ATi cards. As you increase the resolution, the extra 128MB will make a difference.
BS as far as I can tell, seems to be doing just fine at 5040x1050.

I never claimed that. Read who I was replying to.
Once again, you were replying TO ME.

The person specifically mentioned his GTX 260 and said he could buy another one to get 3 screens. To which I replied: spending another $200 will only net you DX10; to get DX11 you will have wasted that $200 and will be forced to buy two more DX11 Nvidia cards.
Yes, I did say I could buy another GTX260 for $185 and continue running three screens with improved performance. I can sell both of those cards at a later date so I can upgrade to two mid-range Fermis, while in the meantime enjoying the GTX260 SLI performance. I don't see how anything there is a waste...

Instead he could sell his GTX 260, put the profit towards a display adapter for one of his 3 monitors, and put the second $200 he'd spend plus $80 more towards a single 5850.
By "he", you of course still mean ME.

I already said I wouldn't sell my current GTX260. If I did that, I would lose PhysX acceleration, so you can't count on any recouped costs from that.

and he can still at a later date add a second card for even greater performance.
Once again, I'm the same person. I already said I have an SLI motherboard, so no, I CAN'T add a second ATi card without changing my entire motherboard (which increases costs even more).

An option not available for the GTX 260, since it would already be in SLI; and it would cost even less than his original option, because he'd already be on DX11 with a much smaller investment.
A smaller investment? Maybe you missed the totals at the end of my previous post.

Single HD5850:
$290 for an HD5850 + $100 for the DVI adapter = $390

Two HD5850s in crossfire:
$580 for two HD5850s + $100 for the DVI adapter + $180 for a new motherboard = $860

Adding a second GTX260 and upgrading to dual mid-range Fermis later:
$185 for the second GTX260. Buy two mid-range Fermis for $225 each later on and sell the GTX260s for $80 each. (225 * 2) - (80 * 2) = $290. (See the quick tally below.)
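For anyone trying to track the running totals in this thread, here's a quick tally of the scenarios exactly as priced above (a rough sketch in Python; the $225 mid-range Fermi price and the $80 resale value are guesses from the posts, not real prices):

```python
# Tally of the upgrade scenarios using the numbers quoted above.
# Every figure here is a poster's assumption, not a confirmed price.

HD5850 = 290      # cheapest HD5850 on Newegg at the time of posting
ADAPTER = 100     # active DisplayPort-to-DVI adapter
NEW_MOBO = 180    # AMD-chipset board with a comparable feature set
GTX260 = 185      # cheapest GTX260 on Newegg at the time of posting
FERMI_MID = 225   # guessed mid-range Fermi price
RESALE = 80       # guessed resale value per GTX260

single_5850 = HD5850 + ADAPTER                     # $390
crossfire_5850 = 2 * HD5850 + ADAPTER + NEW_MOBO   # $860

# The $290 figure counts only the Fermi upgrade itself; the $185 for the
# second GTX260 is treated as money spent on interim performance.
fermi_upgrade = 2 * FERMI_MID - 2 * RESALE         # $290

# Counting the second GTX260 up front instead gives the full cash outlay.
fermi_full_outlay = GTX260 + 2 * FERMI_MID - 2 * RESALE   # $475

print(single_5850, crossfire_5850, fermi_upgrade, fermi_full_outlay)
```

Whether that $185 belongs in the total is, of course, exactly what the two posters spend the rest of the thread arguing about.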

It's very simple; if you follow the posts you'd understand.
I am following the posts just fine. You're the one who thinks two separate posters are talking to you, when it's been me this whole time... Get it together man.

Which has nothing to do with the person I'm responding to
Considering I'm also the one you're responding to, yes, it does...

And currently a single GTX 295 is more expensive and uses much more power than a single 5870, while still managing to be slower.
It was an example; it shows that Nvidia can make a single card that supports 3 monitors (and in fact, already has; it's just waiting on new drivers to enable it). I'm sure current GTX295 owners are going to be very happy.

That's great for you. Glad you like it; it's not what I was responding to.
Yes it was, because you were responding TO ME STILL. How did you miss that the post you quoted and the post responding to you have the same user name?

As for being able to buy a second card and go crossfire, no-can-do. I have an SLI motherboard, so I would have to blow about $180 to get an AMD chipset motherboard with the same feature set. That cost to switch to ATi just keeps piling up.
No you wouldn't. But good try.
I just said I have an SLI motherboard. So yes, I would need a new motherboard to run crossfire...

Of course it's worth less. As you said: DX9 and DX10, not DX10.1 or DX11, and no tri-monitor gaming either.
Just because it doesn't support DX11 doesn't mean it's worthless. It's still a very fast card.

Way overpriced? You said above the 5870 is faster than SLI GTX 260s, which would set you back close to $380. That's the retail price of a 5870. Poor you on that.
I just told you that I already have one, so SLI-ing them would only set me back $185, NOT $380. Learn to read.

The 58xx cards, compared to both ATi's and Nvidia's line-ups, are not overpriced. They offer the best performance in their price brackets.
I consider them overpriced until they manage to get back down to their original MSRP.

The reasonable thing is to sell the fast DX10 card now, while you can still get a decent return on your investment, and put that money towards a fast DX11 ATi card. That way you can play all the DX11 games as they come out and enjoy tri-monitor gaming for months before Fermi even comes out. Both of which are priceless experiences.
I already told you I'm already enjoying triple-monitor with a single GTX260. I've been running triple-monitor on this card since before Eyefinity even existed. As for DX11, all the current cards cost too much, I'll be waiting anyway. Might as well wait with a cheap, fast setup that I can sell off later.

We already know the size of Fermi, and that it's hotter than ATi's cards. Even scaled down, it will still require similar power draw to ATi's cards, but now you need two of them for tri-monitor gaming. Very simple logical conclusion.
So I'll use two mid-range ones that are lower-power than their big brothers. Problem solved.

I just searched for a GTX 260 on Newegg and grabbed the first price I saw, which was $200. Now you can deny that it's $200, but I will just post the link, obvious fanboy.
I'm a fanboy for clicking the "sort by price" button? Sure, I'll deny that it's $200, because the cheapest one on Newegg is $185. Look, here's a link:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814134077&cm_re=GTX260-_-14-134-077-_-Product

You need to learn to read, seriously...

http://www.newegg.com/Product/Produ...gtx_260_core_216_896mb-_-14-130-398-_-Product

There is actually one that is $230, if you'd like me to link you to it.
Now you're intentionally picking the HIGHEST PRICED GTX260s on Newegg to compare with? Give me a break, fanboy.

That is your choice, to keep the GTX 260. I listed a scenario where it's cheaper in the long run for DX11 tri-monitor gaming, and you have yet to show me differently. You could try, but it's very simple. I did this using the original poster's argument.
AUGH, I am the original poster. Can't you read?

If you want to go back to that post, copy what he stated, and try to figure out how my argument is flawed, then do so; but until then, stop adding in different things that suit your needs.
That would be MY POST still, and I've already shown how going ATi isn't cheaper. You simply aren't reading.

His argument...
You mean MY argument

was that he had 3 monitors, none of which had DisplayPorts; he has a GTX 260, so buying a second was the fastest way to tri-monitor gaming.
I already said I'm already gaming on tri-monitors; adding a second GTX260 would be a nice performance (and compatibility) improvement, though.

I countered that the additional $200 is not worth it, and that while the GTX 260 still has value he should sell it. Take that $100 from the sale and buy the DisplayPort adapter. Take the $200 he would have spent on a second GTX 260 and add $80. He would get DX11 features, tri-monitor gaming, and performance similar to his two GTX 260s, while not having to deal with SLI or greater power draw. Total cost in the scenario is $80.
And I already showed how that isn't cheaper, and already said I wouldn't want to sell the GTX260.

For him to buy a second GTX 260 and later get to DX11 gaming would cost much more.
Once again, you mean "for ME", and I already showed how it isn't.

Because you're spending $200 now, you're not going to get a $200 return on your investment when low-end Fermis come out, like the poster wants to do. You'll be lucky to get $100 at that point, or less. And you'd still need to buy 2 Fermis, which will cost more than $80.
You would be spending $185 now, and when you sell BOTH GTX260s (not just one) you could most likely get close to $100 per card (I said $80 to be safe). So yes, I could very well get all $200 back.

And miss out on DX11. Sucks, huh? You could have just spent $80, sold the GTX to fund a DisplayPort adapter, and enjoyed great performance, DX11 games, and tri-monitor gaming months before Fermi even launches!!! Yay!
Once again, not selling the GTX260; I want a PhysX card if I go ATi so I don't lose features. So it's FAR MORE than $80 to upgrade.

Also, I'm missing out on DX11 ANYWAY because current cards that are significantly faster than my current one cost far too much in my opinion. So as I said before, might as well wait with a cheap, fast setup until things get more reasonable.

So you'd spend $600 to get DX11 months down the line, vs. spending $80 now. Good to know.
Uh, no, it totals up to around $290 vs. $390 to go ATi because I will not be selling my current GTX260. Get it through your head.

You could buy a second 5850 in a few months when prices drop, and spend almost half that to have DX11 now plus a second card for greater performance in a few months.
Not without a new motherboard I can't, so stack on another $180 for the motherboard.

Of course, you could sell your GTX 260 to pay for a DisplayPort connector, then take that $200 you'd spend on a second Fermi plus another $80 and buy a 5850. For $80 more than you'd spend on that SLI setup, you get to enjoy months of DX11 tri-monitor gaming before the low-end Fermis even hit the market.

Wow, so awesome.
Not awesome at all; then I lose PhysX. I also need a new motherboard if I want to add a second HD5850 later on, whereas with Fermis I can keep using my current one. It really is going to be cheaper for me to stick with Nvidia.
 
Unknown-One, pasta4u is a self-proclaimed expert who is trying to be a "forum warrior" :p ... anyone with half a thought to themselves can see how wrong he is in his arguments.
 
The person you were replying to was me...


BS as far as I can tell, seems to be doing just fine at 5040x1050.

Sweet, link to a benchmark site?


Yes, I did say I could buy another GTX260 for $185 and continue running three screens with improved performance. I can sell both of those cards at a later date so I can upgrade to two mid-range Fermis, while in the meantime enjoying the GTX260 SLI performance. I don't see how anything there is a waste...
But, but, what happened to the card for PhysX? You can sell 1 card right now, put the extra money towards a DX11 card, and enjoy playing DX11 games with tri-monitor gaming right now.


I already said I wouldn't sell my current GTX260. If I did that, I would lose PhysX acceleration, so you can't count on any recouped costs from that.
Of course, you wouldn't have to do that if Nvidia didn't gimp PhysX to 1 core.



Once again, I'm the same person. I already said I have an SLI motherboard, so no, I CAN'T add a second ATi card without changing my entire motherboard (which increases costs even more).

You do not need a crossfire board for crossfire. It will work on any SLI mobo also. Not only that, but you can have both a 5870 and a 5850 and run them both at different speeds for performance increases.


A smaller investment? Maybe you missed the totals at the end of my previous post.

Single HD5850:
$290 for an HD5850 + $100 for the DVI adapter = $390

Two HD5850s in crossfire:
$580 for two HD5850s + $100 for the DVI adapter + $180 for a new motherboard = $860

Math is wrong.

Sell the GTX 260 for $100 (maybe more). Buy the $100 DVI adapter: costs nothing. Put the $200 for a second GTX 260 towards a 5850, pay $80 more, and enjoy tri-monitor gaming with better performance, no SLI problems, and DX11 right now. No new motherboard required.
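To spell that accounting out the same way (a rough sketch; the $100 resale figure and the $280 street price for the 5850 are the poster's assumptions):

```python
# pasta4u's framing: the GTX 260 sale pays for the adapter, and the $200
# already earmarked for a second GTX 260 goes towards the 5850 instead.
# All figures are the poster's assumptions, not confirmed prices.

RESALE_260 = 100   # assumed sale price of the existing GTX 260 ("maybe more")
ADAPTER = 100      # active DisplayPort-to-DVI adapter
SECOND_260 = 200   # what a second GTX 260 would have cost
HD5850 = 280       # the poster's figure; the cheapest Newegg listing is $290

adapter_net = ADAPTER - RESALE_260   # $0: the sale covers the adapter
extra_cash = HD5850 - SECOND_260     # $80: new money beyond the earmarked $200

print(adapter_net, extra_cash)   # 0 80
```

With the $290 Newegg listing the "extra" comes to $90 rather than $80, but the shape of the argument is the same.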

Adding a second GTX260 and upgrading to dual mid-range Fermis later:
$185 for the second GTX260. Buy two mid-range Fermis for $225 each later on and sell the GTX260s for $80 each. (225 * 2) - (80 * 2) = $290

Wait, I can add useless shit like you did also. Second GTX 260: $225, gotta have the best Newegg has to offer, of course. Two mid-range Fermis for $300 each, because that's what they will cost. So you're at $825. Sell each GTX 260 for $40, because no one wants DX10 boards when mid-range Fermis finally hit. Oh, plus you will need a new power supply and a new case to cool and power those Fermis. Cost: hmmm, $2000, minus $80 for the two GTX 260s. Final cost: $1920.

See how easy that was?




It was an example; it shows that Nvidia can make a single card that supports 3 monitors (and in fact, already has; it's just waiting on new drivers to enable it). I'm sure current GTX295 owners are going to be very happy.
Perhaps they will. Sucks that they don't have DX11, and will require two new video cards to get DX11 while continuing to use a sizable investment in tri-monitor support.



I just said I have an SLI motherboard. So yes, I would need a new motherboard to run crossfire...
No you do not. Read up on crossfire.



Just because it doesn't support DX11 doesn't mean it's worthless. It's still a very fast card.
No DX11, no triple-monitor support. You'd have to buy another one, in which case you are stuck at that performance level and only gain triple-monitor support, not DX11. For $80 more than your investment in another GTX 260, you can buy a 5850 that offers further advancement in speed (since you can still add another) and enjoy tri-monitor gaming.

DX10 Nvidia cards are a dead end. The performance level of the GTX 260 might be good, but investing more into the GTX 260 is a dead end and the wrong choice for anyone but Nvidia fanboys. The difference, in your scenario, between getting DX11 plus tri-monitor gaming and keeping the GTX 260 while buying a second one is an $80 investment.


I consider them overpriced until they manage to get back down to their original MSRP.
So are your GTX 260s.


I already told you I'm already enjoying triple-monitor with a single GTX260. I've been running triple-monitor on this card since before Eyefinity even existed. As for DX11, all the current cards cost too much, I'll be waiting anyway. Might as well wait with a cheap, fast setup that I can sell off later.
Or you can get a cheap DX11 tri-monitor setup for $80 more than what you're planning on doing. I'm failing to understand how an $80 investment for better performance, plus the ability to expand that performance further in the future, is expensive.

So I'll use two mid-range ones that are lower-power than their big brothers. Problem solved.
If two mid-range ones don't use more than a single big brother, that is.


I'm a fanboy for clicking the "sort by price" button? Sure, I'll deny that it's $200, because the cheapest one on Newegg is $185. Look, here's a link:
The card runs from $185 to $225. I picked $200, which fits right in the middle. Sorry if I try to be fair.


You need to learn to read, seriously...
I know how to read. Thanks.


Now you're intentionally picking the HIGHEST PRICED GTX260s on Newegg to compare with? Give me a break, fanboy.
Why not? You throw in costs that don't exist.



I already said I'm already gaming on tri-monitors; adding a second GTX260 would be a nice performance (and compatibility) improvement, though.
And I told you, for $80 more than your plan you can be doing it using less power, with better performance and DX11. But you still insist that DX11 is overpriced right now, and that actually buying 2 cards that are already outdated is the right way to go.


And I already showed how that isn't cheaper, and already said I wouldn't want to sell the GTX260.
You have yet to do so. Nor have you given a reason why you'd keep the GTX 260, especially since Nvidia continues to disable support for PhysX on ATi hardware.




You would be spending $185 now, and when you sell BOTH GTX260s (not just one) you could most likely get close to $100 per card (I said $80 to be safe). So yes, I could very well get all $200 back.
A GTX 260, DX10 hardware with performance equal to $170 DX11 hardware, will get you $80 a pop in late 2010 or early 2011? I doubt it very much. No one is going to want hot cards that don't support DX11 when, for the same or less, they can get equal performance using less power with DX11 and tri-monitor support.


Once again, not selling the GTX260; I want a PhysX card if I go ATi so I don't lose features. So it's FAR MORE than $80 to upgrade.
PhysX does not work with ATi cards unless you hack it, and that is only for now. Nvidia will continue to find ways to prevent it from working.

Also, I'm missing out on DX11 ANYWAY because current cards that are significantly faster than my current one cost far too much in my opinion. So as I said before, might as well wait with a cheap, fast setup until things get more reasonable.
What is significantly faster? The 5850 is much faster than your card, and the card is anywhere from $75 to $115 more expensive than yours.


Uh, no, it totals up to around $290 vs. $390 to go ATi because I will not be selling my current GTX260. Get it through your head.

Sucks for you. You still can't add.


Not without a new motherboard I can't, so stack on another $180 for the motherboard.
Of course you can. Read up on crossfire.

Not awesome at all; then I lose PhysX. I also need a new motherboard if I want to add a second HD5850 later on, whereas with Fermis I can keep using my current one. It really is going to be cheaper for me to stick with Nvidia.

www.google.com. Go read up, then come back.
 
I think these cost arguments can get very complicated, simply because of the sheer number of possible scenarios one has to look at. In my case it's pretty much going to be new gear in my current mobo, though I may change cases and just get a new mobo; maybe another P6X58D, maybe something else. I do hate rebuilding systems, but it's not that horrible; it just takes time, and sometimes certain things, like game settings, are a bitch.
 
Sweet, link to a benchmark site?
Check out WidescreenGamingForum; they've benchmarked a bunch of cards at triplehead resolutions.

But, but, what happened to the card for PhysX?
Since I would be upgrading to Fermis, they could handle PhysX; I wouldn't need a separate card for PhysX like I would if I went with an HD5850 or HD5870.

You can sell 1 card right now, put the extra money towards a DX11 card, and enjoy playing DX11 games with tri-monitor gaming right now.
I already told you, I don't want to lose features. If I sold my GTX260, I would lose the ability to run PhysX in games. I would also lose the ability to run CUDA applications like video encoders and h.264 decoders.

The GTX260 stays, end of story. I'm already running games triple-screen, so that's a non-issue. All DX11 hardware that's significantly faster than my GTX260 is, in my opinion, too expensive, so I'm waiting anyway!

Of course, you wouldn't have to do that if Nvidia didn't gimp PhysX to 1 core.
First off, I don't see how that's even relevant. Second, they don't gimp it to one core; if you're running an SLI set, you can share the workload (along with graphics) across the entire SLI set.

You do not need a crossfire board for crossfire. It will work on any SLI mobo also. Not only that, but you can have both a 5870 and a 5850 and run them both at different speeds for performance increases.
Not according to ATi; their page on crossfire is very clear about which chipsets are supported, and nForce chipsets are completely absent from the list. Check it out, here's a direct link to their chart.

http://game.amd.com/us-en/content/images/crossfirex/CF_combo_chart.jpg

A smaller investment? Maybe you missed the totals at the end of my previous post.

Single HD5850:
$290 for an HD5850 + $100 for the DVI adapter = $390

Two HD5850s in crossfire:
$580 for two HD5850s + $100 for the DVI adapter + $180 for a new motherboard = $860
Math is wrong.

Sell the GTX 260 for $100 (maybe more). Buy the $100 DVI adapter: costs nothing. Put the $200 for a second GTX 260 towards a 5850, pay $80 more, and enjoy tri-monitor gaming with better performance, no SLI problems, and DX11 right now. No new motherboard required.
How many times do I have to tell you? I'm not going to sell the GTX260 if I go ATi because I'll lose features. The math there is just fine.

Wait, I can add useless shit like you did also. Second GTX 260: $225, gotta have the best Newegg has to offer, of course. Two mid-range Fermis for $300 each, because that's what they will cost. So you're at $825. Sell each GTX 260 for $40, because no one wants DX10 boards when mid-range Fermis finally hit. Oh, plus you will need a new power supply and a new case to cool and power those Fermis. Cost: hmmm, $2000, minus $80 for the two GTX 260s. Final cost: $1920.

See how easy that was?
I didn't add anything useless... and why would I pick the most expensive GTX260? The $185 one will do just fine.

I also specified that I would be buying mid-range Fermis in the $225 range, not $300.

Why would I ever dream of listing each GTX260 at only $40? I would want to sell each one for at least $80. And why are you adding a case and a PSU when I already have a case and a PSU that will handle the cards? Do you even realize how big of a fanboy you look like right now, adding random items to bloat the price?

1. Second GTX260, $185.
2. Two mid-range Fermis for (around) $225 each.
3. Sell GTX260s for $80 each when I get said Fermis.
4. (225 * 2) - (80 * 2) = $290.

It was an example; it shows that Nvidia can make a single card that supports 3 monitors (and in fact, already has; it's just waiting on new drivers to enable it). I'm sure current GTX295 owners are going to be very happy.
Perhaps they will. Sucks that they don't have DX11, and will require two new video cards to get DX11 while continuing to use a sizable investment in tri-monitor support.
They won't have DX11, but they won't have to swap their card just for triple-monitor. They're getting a new feature for free; no room to complain there.

I don't see how the money that goes into buying the monitors factors in here, because even if you go Eyefinity, you'll still need those same 3 monitors.

I just said I have an SLI motherboard. So yes, I would need a new motherboard to run crossfire...
No you do not. Read up on crossfire.
According to ATi, yes I do. Maybe you should read up on crossfire... Let's have another look at that chart, shall we?

http://game.amd.com/us-en/content/images/crossfirex/CF_combo_chart.jpg

Just because it [GTX260] doesn't support DX11 doesn't mean it's worthless. It's still a very fast card.
No DX11, no triple-monitor support. You'd have to buy another one, in which case you are stuck at that performance level and only gain triple-monitor support, not DX11. For $80 more than your investment in another GTX 260, you can buy a 5850 that offers further advancement in speed (since you can still add another) and enjoy tri-monitor gaming.
I'll have triple monitor support when I add a second one, and I'll upgrade to cards with DirectX11 when the prices become reasonable. I don't see how I'll be stuck at all; I'll just sell both the GTX260s and move right on up to Fermis.

And I don't know where you keep pulling that $80 figure from; I'm not selling the GTX260 if I get an ATi card, so you can't count selling it as a cost savings on an HD5850.

DX10 Nvidia cards are a dead end. The performance level of the GTX 260 might be good, but investing more into the GTX 260 is a dead end and the wrong choice for anyone but Nvidia fanboys. The difference, in your scenario, between getting DX11 plus tri-monitor gaming and keeping the GTX 260 while buying a second one is an $80 investment.
Hardly a dead end, more like transitional hardware. I don't have to be a fanboy to want to save money while not losing features I already have. And you can stop pulling out that $80 figure now; I'm not selling the GTX260, which means it'll cost me $290 for an HD5850 + $100 for the DVI adapter = $390.

I consider them overpriced until they manage to get back down to their original MSRP.
So are your GTX 260s.
$185 is well below the GTX260's original MSRP.

Or you can get a cheap DX11 tri-monitor setup for $80 more than what you're planning on doing. I'm failing to understand how an $80 investment for better performance, plus the ability to expand that performance further in the future, is expensive.
I'm not going to sell the GTX260 (I want to keep PhysX), so it's $390 to switch to ATi, not $80.

I could buy two $275 Fermis, sell my two GTX260s for $80 each, and pay exactly the same, $390... and it'll almost certainly be faster than a single HD5850.

The card runs from $185 to $225. I picked $200, which fits right in the middle. Sorry if I try to be fair.
That doesn't make any sense, dude; why wouldn't you go with the cheapest one? Especially when the partner it's being sold through is Elitegroup, which has fairly good service. It even has an aftermarket heatsink pre-installed!

I know how to read. Thanks.
Obviously not. You thought my last couple of posts were two separate people talking to you, and now you keep missing the fact that I WILL NOT sell the GTX260 if I switch to an ATi card for graphics.


Now you're intentionally picking the HIGHEST PRICED GTX260s on Newegg to compare with? Give me a break, fanboy.
Why not? You throw in costs that don't exist.
Uh, no, I'm not.

If I switch to an HD5850, it will cost $290 for the card, and $100 for the active DisplayPort-to-DVI adapter. Where have I added a cost that does not exist? I've even used the price of the cheapest HD5850 on Newegg.

You're still trying to compare one of the more expensive brandings of the GTX260 to the cheapest HD5850, which makes absolutely no sense at all. If you want to compare the prices of the most expensive versions of both cards, we can do that. The most expensive HD5850 is $360, against the most expensive GTX260 at $225.

That makes the HD5850 $135 more expensive than the GTX260. If we had used the numbers for the cheapest of both cards (you know, the way that actually makes sense), then the difference in price is only $105. You've actually made your argument worse by comparing both of the highest-priced cards!

And I told you, for $80 more than your plan you can be doing it using less power, with better performance and DX11. But you still insist that DX11 is overpriced right now, and that actually buying 2 cards that are already outdated is the right way to go.
It wouldn't be $80, it would be $390, because I won't be selling the GTX260 to make up costs.

Nor have you given a reason why you'd keep the GTX 260, especially since Nvidia continues to disable support for PhysX on ATi hardware.
Uh, yes I have. I'll be keeping it to use as a PhysX / CUDA card, because ATi cards don't support that themselves.

The drivers have been fixed to allow an Nvidia + ATi combination, and the same hack has now worked across a number of driver revisions, so it doesn't appear Nvidia is actively attempting to stop the hack from working anymore. Even if they did, I could just... not update the driver and use a slightly older one (it shouldn't matter a whole lot, since it'll only be doing PhysX and other CUDA apps).

PhysX does not work with ATi cards unless you hack it, and that is only for now. Nvidia will continue to find ways to prevent it from working.
As I just said, the same hack has worked across a bunch of Nvidia drivers. They either can't work out a way to block the most recent hack, or they simply aren't trying to block it anymore.

What is significantly faster? The 5850 is much faster than your card, and the card is anywhere from $75 to $115 more expensive than yours.
The HD5850 is about 20% faster than a single GTX260, true...but then, it would cost me $390 to get one running with my setup.

A second GTX260? Only $185. And that SLI setup will be faster than the HD5850 in DX9 / DX10. It should hold me over until DX11 hardware becomes reasonably priced, at which point I'll sell off both GTX260s.


Sucks for you. You still can't add.
I can add just fine.

1. Second GTX260, $185.
2. Two mid-range Fermis for (around) $225 each.
3. Sell GTX260s for $80 each when I get said Fermis.
4. (225 * 2) - (80 * 2) = $290.

1. HD5850, $290
2. DisplayPort-to-DVI adapter, $100
3. $290 + $100 = $390

Of course you can. Read up on crossfire.
According to ATi's page on crossfire, you can't. I read up on crossfire, went right to the source, and this was the chart they had showing motherboard chipset compatibility.

http://game.amd.com/us-en/content/images/crossfirex/CF_combo_chart.jpg

Here's the page on ATi's site where the chart is from:

http://game.amd.com/us-en/crossfirex_about.aspx
 
I do apologize for the jumbo-size posts. It's rather hard to correct that volume of misinformation and nonsense in any sort of concise manner.
 