CJ leaks benchmarks of 5870/5850 vs. 285/295

I wanted to avoid spending anything this year to save up some cash, but I have a feeling the moment the 5870X2 hits the shelves I'll have one.

Ugh, so tempting. I might have to wait until they drop in price a bit and grab one around Christmas.
 
Definitely going to buy two 5870X2s when they come out and game at 5760x2400 on six screens with Eyefinity. That's nearly seven times the pixels of my 1080p TV, so it's definitely going to be a big upgrade. Here is an article all about it and how ATI is working with Samsung to make ultra-thin-bezel LCDs: Eyefinity
 
Please elaborate.


The Way It's Meant To Be Played. You're not familiar with this...?

Nah...
I'm a little behind on this kind of 'lolspeak'.

What I meant was that, in a given timeframe, optimization means making an application run faster on a system, typically by using hardware-specific (machine) code.

Examples would be SSE, MMX, 3DNow!, and so forth.
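A rough illustration (my own sketch, not anything from this thread) of that kind of hardware-specific optimization: summing an array of floats with SSE intrinsics instead of plain scalar code. It assumes an SSE-capable x86 CPU and, to keep it short, an element count that is a multiple of four.

Code:
#include <xmmintrin.h>   // SSE intrinsics
#include <cstddef>

// Plain scalar version: one addition per loop iteration.
float sum_scalar(const float* data, std::size_t n)
{
    float total = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        total += data[i];
    return total;
}

// SSE version: four additions per instruction (n assumed to be a multiple of 4).
float sum_sse(const float* data, std::size_t n)
{
    __m128 acc = _mm_setzero_ps();
    for (std::size_t i = 0; i < n; i += 4)
        acc = _mm_add_ps(acc, _mm_loadu_ps(data + i));

    float lanes[4];
    _mm_storeu_ps(lanes, acc);
    return lanes[0] + lanes[1] + lanes[2] + lanes[3];
}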

nVidia and ATi don't have identical APIs (just like Intel and AMD are constantly one-upping each other), so DirectX typically balances out all the kinks.
However, DirectX emulation is not as fast as actual hardware performance.

So, given a way to render an object, a developer has to choose whether to use nVidia's coding or ATi's coding to cut down on render time.
 
You realise, I'm sure, that people say that about every damn site on the net, right?

NO!!

All sites are liars!

Just kidding, but you'll never truly get correct data (for you, anyhow) until the card is in your hands (hopefully with some ESD protection).
 
Definitely going to buy two 5870X2s when they come out and game at 5760x2400 on six screens with Eyefinity. That's nearly seven times the pixels of my 1080p TV, so it's definitely going to be a big upgrade. Here is an article all about it and how ATI is working with Samsung to make ultra-thin-bezel LCDs: Eyefinity

At the distance you would be sitting to utilize this, which would be around 8 feet, your eyes are not going to be able to perceive the difference between 1080p and 5760x2400.

You may be able to sit closer, which of course makes the argument void... but even so, the human eye can only perceive so much at one time. Either you sit close and your peripheral vision loses the outer screen space, or you sit farther away and are unable to distinguish the difference between a 1080p set and 5760x2400.

One can't tell the difference between 1080p and 720p at 8 feet on a 50-inch set, so imagine all you would be losing.
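A quick back-of-the-envelope check of that claim. This is just a sketch; the roughly one-arcminute-per-pixel figure for 20/20 acuity and the screen widths used here are assumptions, not measurements.

Code:
#include <cmath>
#include <cstdio>

// Angle subtended by a single pixel, in arcminutes, for a screen of a given
// width viewed from a given distance (both in inches).
static double arcmin_per_pixel(double screen_width_in, int horizontal_pixels,
                               double viewing_distance_in)
{
    const double kPi = 3.14159265358979323846;
    double pixel_pitch = screen_width_in / horizontal_pixels;       // inches per pixel
    double radians = std::atan(pixel_pitch / viewing_distance_in);  // angle per pixel
    return radians * (180.0 / kPi) * 60.0;                          // radians -> arcminutes
}

int main()
{
    const double dist = 96.0;  // 8 feet, in inches
    // A 50" 16:9 set is roughly 43.6" wide; a 3x2 wall of 24" 1920x1200
    // panels is very roughly 61" wide (ignoring bezels).
    std::printf("720p  on 50\" at 8 ft: %.2f arcmin/pixel\n", arcmin_per_pixel(43.6, 1280, dist));
    std::printf("1080p on 50\" at 8 ft: %.2f arcmin/pixel\n", arcmin_per_pixel(43.6, 1920, dist));
    std::printf("5760px wall at 8 ft:  %.2f arcmin/pixel\n", arcmin_per_pixel(61.0, 5760, dist));
    return 0;
}

With 20/20 vision resolving roughly one arcminute, 720p on that 50-inch set at 8 feet already sits right around the limit, and anything finer is largely indistinguishable from that distance, which is the point being made above.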



Having said this, if you do buy all that gear... can I come over and see it? I'll let you borrow my 30-inch Dell monitor if you can't afford that 6th one. :)
 
Definitely going to buy two 5870X2s when they come out and game at 5760x2400 on six screens with Eyefinity. That's nearly seven times the pixels of my 1080p TV, so it's definitely going to be a big upgrade. Here is an article all about it and how ATI is working with Samsung to make ultra-thin-bezel LCDs: Eyefinity

Five grand down the drain.
 
I think it is quite reasonable. I paid $2200 for my TV, and if you figure $300 x 6 = $1800, it's a pretty good deal. I am going to be upgrading my cards no matter what, and I already have a buyer for both.
 
5 grand for a crap Panasonic 'HDTV' that fails on nearly all HDMI inputs, not to mention horrible workmanship (they left a plastic wrapping cover OVER the lamp vent... really? What's up with that?).
 
Anyone else wonder if these benchies were done on a PII as opposed to an i7 system? If they were on a PII system, then the i7 scores are bound to be a touch higher.
 
nVidia and ATi don't have identical APIs (just like Intel and AMD are constantly one-upping each other), so DirectX typically balances out all the kinks.
However, DirectX emulation is not as fast as actual hardware performance.

So, given a way to render an object, a developer has to choose whether to use nVidia's coding or ATi's coding to cut down on render time.

Except it doesn't actually work like that. Games don't call Nvidia's version or ATI's version; they go through the DirectX API. Sure, some optimizations could help ATI more than Nvidia or vice versa, but there is no "Nvidia method" or "ATI method" that straight up works better on one vendor vs. another. Generally, optimizations that help Nvidia help ATI as well.

Neither Nvidia nor ATI provides its own graphics API, by the way. Both require you to use OpenGL or DirectX. Both Nvidia and ATI spend a ton of time optimizing the crap out of their drivers, and both do game-specific optimizations, especially for high-profile titles like Crysis.

Also, games don't ever run "emulated" DirectX. All the DirectX calls end up going to the drivers. Whether the driver runs something in software or hardware is entirely up to ATI or Nvidia and what their hardware is capable of, not the game developer. I highly, highly doubt that either Nvidia or ATI is software-emulating *anything*.

You should probably stop making claims about optimization, DirectX, and graphics drivers, as you've already admitted you don't work on them and aren't a game developer.
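For what it's worth, here is a minimal sketch (my own, assuming nothing beyond the stock Direct3D 11 headers) of what a game actually does: it asks for whatever hardware device is installed and issues the same calls regardless of whose GPU is underneath. The vendor's driver is what turns those calls into hardware work.

Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

bool CreateDeviceAndDraw()
{
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;

    // D3D_DRIVER_TYPE_HARDWARE means "whatever GPU is in the box"; there is
    // no Nvidia path or ATI path to pick here.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, &context);
    if (FAILED(hr))
        return false;

    // ... create buffers, compile HLSL shaders, bind state ...
    // The draw call itself is identical on every vendor's hardware:
    // context->DrawIndexed(indexCount, 0, 0);

    context->Release();
    device->Release();
    return true;
}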
 
Any idea what the PCI-E bandwidth requirements will be? One of the articles mentioned that it fully saturates the bandwidth on P55 boards. What happens if you go CrossFire? Will it negate the performance of two 5870s?

I assume so, since the GTX 295 pretty much uses all of the x16 bandwidth.
 
So, given a way to render an object, a developer has to choose whether to use nVidia's coding or ATi's coding to cut down on render time.
That's simply not true. There are no vendor-specific paths in D3D10/11 nor in HLSL. Only OpenGL features some vendor-specific paths, via extensions, and that's only for vertex or pixel shaders. There is no "ATI way" or "NVIDIA way" with D3D10 or greater.

The days of GPU-specific shader models are thankfully long over. Different architectures excel at different functions, but you'd have to be very aggressive and deliberate about which functions you use in your shaders to skew performance one way or the other.
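To show where the vendor-specific stuff does live, here's a small sketch (mine, not from the thread) of the classic OpenGL extension check. The extension names below are real registered NV/ATI extensions, but which ones a driver actually reports obviously depends on the installed card.

Code:
#include <GL/gl.h>
#include <cstring>

// Requires a current GL context. glGetString(GL_EXTENSIONS) is the classic
// (pre-GL 3.0) way to see what the driver exposes. Note: plain strstr() can
// match prefixes of longer names; good enough for a sketch.
static bool HasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;
}

void ChooseShaderPath()
{
    if (HasExtension("GL_NV_fragment_program2")) {
        // NVIDIA-specific assembly shader path
    } else if (HasExtension("GL_ATI_fragment_shader")) {
        // ATI-specific path
    } else {
        // vendor-neutral ARB / GLSL fallback
    }
}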
 
So, judging by the graphs, am I too far off in saying that it would be overkill for gaming at 1680x1050 with a 5870? I've been wanting to upgrade from my 8800GT with one of these lol
 
So, judging by the graphs, am I too far off in saying that it would be overkill for gaming at 1680x1050 with a 5870? I've been wanting to upgrade from my 8800GT with one of these lol
Yes. :p Start researching a nice 24in-28in 1920x1200 monitor, at least.
 
That would be a better choice. We'll know once Kyle and crew post reviews.
 
I know what you mean. You know you're a geek when new hardware is like early Christmas. :D
 
So, judging by the graphs, am I too far off in saying that it would be overkill for gaming at 1680x1050 with a 5870?
I suppose it would depend primarily on what games you play and how you play them. If you're one of those guys who's nuts for AA (like me), the 5870 may have some tangible benefit over the 5850. If you're indifferent about AA, then perhaps something lower in the 58xx line might be more to your liking.

I'm a big proponent of buying overkill graphics cards, if you can swing it, since they're generally only overkill for a fairly short period of time.
 
I'll be gaming @ 1680x1050 with a 5870 (probably the 2GB version)... even though I have a GTX 295, a cooler, quieter single-GPU card (and new toy) really appeals to me right now :D
 
I suppose it would depend primarily on what games you play and how you play them. If you're one of those guys who's nuts for AA (like me), the 5870 may have some tangible benefit over the 5850. If you're indifferent about AA, then perhaps something lower in the 58xx line might be more to your liking.

I'm a big proponent of buying overkill graphics cards, if you can swing it, since they're generally only overkill for a fairly short period of time.

Yeah, I loves me some AA and AF. I know it's too early to judge, but do you think a stock-clocked E8500 would need to be OC'd for this? I'm on an Asus P5Q Pro with 4GB of DDR2-1000. Just wanting to get my ducks in a row, so to speak.
 
Any idea what the PCI-E bandwidth requirements will be? One of the articles mentioned that it fully saturates the bandwidth on P55 boards. What happens if you go CrossFire? Will it negate the performance of two 5870s?

I assume so, since the GTX 295 pretty much uses all of the x16 bandwidth.

Well, I know that a P45 board will not handle CrossFired HD 4870s even close to their full potential, so I'm assuming it's the same way with P55 and its second x8 PCI-E slot.
 
I'm a big proponent of buying overkill graphics cards, if you can swing it, since they're generally only overkill for a fairly short period of time.
I buy as much monitor as I can afford, since it carries over through many system upgrades.
 
I'm thinking that the 5870 might be the first card that can fully utilize 2GB, particularly when you're doing triple screens or greater.
 
That's simply not true. There are no vendor-specific paths in D3D10/11 nor in HLSL. Only OpenGL features some vendor-specific paths, via extensions, and that's only for vertex or pixel shaders. There is no "ATI way" or "NVIDIA way" with D3D10 or greater.

The days of GPU-specific shader models are thankfully long over. Different architectures excel at different functions, but you'd have to be very aggressive and deliberate about which functions you use in your shaders to skew performance one way or the other.

I agree. I think it is not the game developers that optimize the game; it is ATI or Nvidia that have to optimize their drivers for a specific game's code. So if Nvidia gets special access to the game code and the game is sponsored by Nvidia's The Way It's Meant To Be Played program (sorry, forgot the short form for that), then of course Nvidia is going to have the upper hand in performance in that game. Just look at how ATI has been improving their performance in Crysis for months now, and in Lost Planet too; as an HD 4870 owner, I have seen the performance go up like crazy with the drivers. And I think ATI might have the upper hand, or at least great performance from the start, with CryEngine 3, since it looks like they are working very closely with Crytek, and Crytek has most likely had HD 5870 cards for a couple of months now, since they are faster than Nvidia's high-end card. And who wouldn't like to have a faster GPU?
 
I personally am not a fan of any dual-GPU solution. I always stick with a single high-end GPU, overclock it to its max stable clock, and am happy with it. I have tried my luck with dual-GPU setups and have never been satisfied with the quality of SLI or CrossFire. Maybe it is just my luck, but every time I try to play a game that I like, it doesn't scale at all or has issues, and I don't like having to wait for driver updates for a game to work. I am sure some will reserve their right to disagree with me, but a single powerful GPU is always my preference. Anyway, I can't wait to see the reviews on this baby.
 
I personally am not a fan of any dual-GPU solution. I always stick with a single high-end GPU, overclock it to its max stable clock, and am happy with it. I have tried my luck with dual-GPU setups and have never been satisfied with the quality of SLI or CrossFire. Maybe it is just my luck, but every time I try to play a game that I like, it doesn't scale at all or has issues, and I don't like having to wait for driver updates for a game to work. I am sure some will reserve their right to disagree with me, but a single powerful GPU is always my preference. Anyway, I can't wait to see the reviews on this baby.

SLI was a breeze for me...

I think the problem is between the chair and the keyboard....
 
SLI was a breeze for me...

I think the problem is between the chair and the keyboard....

:rolleyes: It's no secret that SLI/CF setups come with their own set of problems. Granted, they've come a long way, but they're still not as foolproof and straightforward as a single-GPU solution.

Just because you didn't have problems doesn't mean it's user error for everyone else that does. Get over yourself.
 
:rolleyes: It's no secret that SLI/CF setups come with their own set of problems. Granted, they've come a long way, but they're still not as foolproof and straightforward as a single-GPU solution.

Just because you didn't have problems doesn't mean it's user error for everyone else that does. Get over yourself.

No problems with SLI/CF setups at all since the 8800 series...

Unless the problem is officially known, most of the time it's a personal problem... :p
 
No problems with SLI/CF setups at all since the 8800 series...

Unless the problem is officially known, most of the time it's a personal problem... :p

Exactly my point...

Guess which games scaled the best?

Yep, Crysis, then COD4...

So no, neither of those is "poorly coded".
 
I personally am not a fan of any dual gpu solution, I always stick with single high end gpu solution and overclock it to the max stable clock, and be happy with it. I have had my luck with dual gpu setup, I have never been satisfied with the quality of sli or crossfire, because I don't know if it is my luck, everytime I try to play a game that I like it doesn't scale at all, or has issues, and I don't like having to wait for driver updates for a game to work. I am sure some will serve their right to disagree with me, but single powerful gpu is always my preference. anyways I can't wait to see the reviews on this baby.

That's just icing on the cake, not to mention twice the space, twice the power consumption (if not, then almost), and twice the heat.

My motto is: "Just say no to SLI and Crossfire."
 
That's just icing on the cake, not to mention twice the space, twice the power consumption (if not, then almost), and twice the heat.

My motto is: "Just say no to SLI and Crossfire."

Twice the space: I don't care.
Twice the power consumption: I don't care.
Twice the heat: I don't care (as long as things don't OVERHEAT).

So for me there are only benefits with SLI/CrossFire (like 80% better performance).
 
Twice the space: I don't care.
Twice the power consumption: I don't care.
Twice the heat: I don't care (as long as things don't OVERHEAT).

So for me there are only benefits with SLI/CrossFire (like 80% better performance).

QFT :p
 