fail?
The way it's meant to be played / applies to devs who do their work on green hardware and optimize code along that path
That's what she said
Please elaborate.
The Way It's Meant To Be Played. You're not familiar with this...?
Don't look at Tom's; their data isn't legit at all...
You realise, I'm sure, that people say that about every damn site on the net, right?
Definitely going to buy two 5870 X2s when they come out and game at 5760 x 2400 on six screens with Eyefinity. It's almost seven times the resolution of my 1080p TV. Definitely going to be a big upgrade. Here is an article all about it and how ATI is working with Samsung to make ultra-thin bezel LCDs: Eyefinity
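As a sanity check on that resolution comparison, here is a quick sketch of the arithmetic, assuming the usual Eyefinity setup of six 1920x1200 panels in a 3x2 grid (which is where 5760x2400 comes from):

```python
# Pixel-count comparison: a 3x2 grid of 1920x1200 panels vs. a 1080p TV.
# (The 3x2 grid is an assumption; it's the common six-screen Eyefinity layout.)
eyefinity = 5760 * 2400   # 13,824,000 pixels
tv_1080p = 1920 * 1080    #  2,073,600 pixels

ratio = eyefinity / tv_1080p
print(round(ratio, 1))    # 6.7 -- closer to 7x than 12x
```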
NVIDIA and ATI don't have identical APIs (just like Intel and AMD constantly one-upping each other), so DirectX typically balances out all the kinks.
However, DirectX emulation is not as fast as native hardware performance.
So, given a way to render an object, a developer has to choose whether to use NVIDIA's code path or ATI's code path to cut down on render time.
So, judging by the graphs, am I too far off in saying that it would be overkill for gaming at 1680x1050 with a 5870? I've been wanting to upgrade from my 8800GT with one of these lol
Yes. Start researching a nice 24in-28in 1920x1200 at least.
That would be a better choice. We'll know once Kyle and crew post reviews.
I suppose it would depend primarily on what games you play and how you play them. If you're one of those guys who's nuts for AA (like me), the 5870 may have some tangible benefit over the 5850. If you're indifferent about AA, then perhaps something lower in the 58xx line might be more to your liking.
I'm a big proponent of buying overkill graphics cards, if you can swing it, since they're generally only overkill for a fairly short period of time.
Any idea what the PCI-E bandwidth requirements will be? One of the articles mentioned that it fully saturates the bandwidth on P55 boards. What happens if you go Crossfire; will it negate the performance with two 5870s?
I assume so since the GTX 295 pretty much uses all of the x16 bandwidth.
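For rough context on the bandwidth numbers being discussed, here is a back-of-the-envelope sketch, assuming PCIe 2.0 with roughly 500 MB/s of usable bandwidth per lane per direction (after 8b/10b encoding), and a P55 board splitting its x16 link into x8/x8 for Crossfire:

```python
# Rough PCIe 2.0 bandwidth arithmetic (assumption: ~500 MB/s usable
# per lane per direction after 8b/10b encoding overhead).
PER_LANE_MB_S = 500

def slot_bandwidth_gb_s(lanes: int) -> float:
    """Approximate one-way usable bandwidth of a PCIe 2.0 slot in GB/s."""
    return lanes * PER_LANE_MB_S / 1000

full_x16 = slot_bandwidth_gb_s(16)  # single card in a full x16 slot
split_x8 = slot_bandwidth_gb_s(8)   # each card when P55 splits to x8/x8

print(full_x16, split_x8)  # 8.0 4.0
```

So each card in a Crossfire pair on P55 gets half the link bandwidth of a lone card, which is why saturation of a full x16 link raises the question.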
I buy as much monitor as I can afford, since it can be carried over many system upgrades.
That's simply not true. There are no vendor-specific paths in D3D10/11 nor in HLSL. Only OpenGL features some vendor-specific paths via ARB, and that's only for vertex or pixel shaders. There is no "ATI way" or "NVIDIA way" with D3D10 or greater.
The days of GPU-specific shader models are thankfully long over. Different architectures excel at different functions, but you'd have to be very deliberate about which functions you use in your shaders to skew performance one way or the other.
I'm personally not a fan of any dual-GPU solution. I always stick with a single high-end GPU, overclock it to the max stable clock, and am happy with it. I've had bad luck with dual-GPU setups and have never been satisfied with the quality of SLI or Crossfire. Maybe it's just my luck, but every time I try to play a game that I like, it either doesn't scale at all or has issues, and I don't like having to wait for driver updates for a game to work. I'm sure some will exercise their right to disagree with me, but a single powerful GPU is always my preference. Anyway, I can't wait to see the reviews on this baby.
SLI was a breeze for me...
I think the problem is between the chair and the keyboard....
It's no secret that SLI/CF setups come with their own set of problems. Granted they've come a long way, but they're still not as fool proof and straight forward as a single gpu solution.
Just because you didn't have problems doesn't mean it's user error for everyone else that does. Get over yourself.
No problems with SLI/CF setups at all since the 8800 series...
Unless the problem is officially known, most of the time it's a personal problem...
That's just icing on the cake, not to mention twice the space, twice the power consumption (if not, then almost), and twice the heat.
My motto is: "Just say no to SLI and Crossfire."
Twice the space: I don't care.
Twice the power consumption: I don't care.
Twice the heat: I don't care (as long as things don't OVERHEAT).
So for me there are only benefits with SLI/Crossfire (like 80% better performance).
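That "80% better performance" figure is a scaling efficiency, and the trade-off it implies is easy to sketch. The numbers below are illustrative, not benchmark data:

```python
# Illustrative multi-GPU scaling arithmetic (not benchmark data).
def dual_gpu_fps(single_fps: float, scaling: float = 0.80) -> float:
    """FPS from two cards when the second contributes `scaling` of one card's output."""
    return single_fps * (1 + scaling)

# At 80% scaling, a game running at 50 fps on one card reaches ~90 fps on two.
print(dual_gpu_fps(50))
# At 0% scaling (a game with no SLI/CF profile), the second card adds nothing.
print(dual_gpu_fps(50, scaling=0.0))
```

This is the crux of the earlier disagreement: when scaling is high, the second card is nearly a second GPU's worth of frames for twice the space, power, and heat; when a game doesn't scale, you paid for a single-card experience.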