4870x2 XSP Sideport feature poised to counter rumored 280gx2

Guys he says it exists and it matters, just accept it. Who cares if you can't see it, it's there and you need to be extremely worried or the evil Microstutter fairy will get you in your sleep! :rolleyes:

A n00b with nothing but fallacies :rolleyes:
 
LOL. Seriously, if your eyes can't see it you just HAVE to get a utility to "help" you see microstuttering.
 
Who really cares if microstuttering exists? Those same people are the ones who complain about input lag and ghosting on LCD monitors. It exists there too, but guess what? The world keeps spinning and people keep buying. I don't have dual GPUs, but I plan on getting the 4870x2. If I'm not satisfied with the card I'll return it and explore different options.
 
Using a game like Crysis to show micro-stuttering probably isn't the best idea, especially on an ATI card. We all know that 1) it's an Nvidia-sponsored game and 2) the game is optimized for Nvidia cards.

Once the drivers are set for XSP, that extra bandwidth is going to turn into better performance, in all probability knocking the possible 280gx2 on its fat butt. Since the 280gx2 is probably going to be two GT200bs "glued" together, just like the 9800gx2 was two 8800GTS G92s "glued" together, the 4870x2 is going to have a definitive edge with the XSP feature. ATI is saving this feature so that it can trump Nvidia's upcoming offerings, the GT200b and the highly unlikely GT200bgx2.
 
ATI is saving this feature so that it can trump Nvidia's upcoming offerings
If that were true, how would you feel as an X2 owner knowing your card is being artificially held back in its performance?
 
The ATI team hasn't been able to write drivers good enough to take advantage of the sideport yet.
 
The ATI team hasn't been able to write drivers good enough to take advantage of the sideport yet.
Yes, that's what I think also. Not them lying in wait for nVidia's response to the X2.
 
If that were true, how would you feel as an X2 owner knowing your card is being artificially held back in its performance?

I understand what you're saying but personally I wouldn't care because I would already have the fastest video card and would continue to have the fastest video card.

As for microstuttering...how about we just never use Crysis for any benchmark ever again? I don't see microstuttering on my 4870 Crossfire setup. Didn't see it on my X1950 Pro Crossfire or my 3870 Crossfire.

Does that mean it is not there? No.
Does it mean it is there? No.
If it is there and I don't see it do I care? NO.
In the end the eye of the beholder is all that matters.
 
there's very little to begin with. i haven't even heard of micro stuttering till a few months ago... and i've been into computers since the radeon 9600 series and a little while before
...oohhhhhh, that's a long time. :D

I've been watching polygons myself starting back on my S3 Trio64V and Diamond Monster 3D card.



As for microstuttering...how about we just never use Crysis for any benchmark ever again? I don't see microstuttering on my 4870 Crossfire setup. Didn't see it on my X1950 Pro Crossfire or my 3870 Crossfire.

My thoughts exactly. I had this micro stuttering on my 8800GTS and it went away when I upgraded my system from my X2 4200 to my E8400. Seems to me this micro stuttering could just as easily be a CPU problem as a GPU/driver problem.
 
but personally I wouldn't care
I would care knowing performance was held back. Most would care. It's why the theory of it being a trump card held in check doesn't sound like something ATi would do.
Once it shows benefit to the user, ATi will enable it.
 
If that were true, how would you feel as an X2 owner knowing your card is being artificially held back in its performance?

I can't feel too bad if it's the best performing card out there regardless. I might feel good that there is some extra performance inside the beast :)
 
Well, that's where we differ. ATi is not holding it back until they see what nVidia does.
They will enable it once it shows improvement.
 
I can't feel too bad if it's the best performing card out there regardless. I might feel good that there is some extra performance inside the beast :)

i have a feeling drivers alone will show us that, i think there is a lot of available speed on this crossfire setup that isn't used just yet :)
 
i have a feeling drivers alone will show us that, i think there is a lot of available speed on this crossfire setup that isn't used just yet :)
I agree that the drivers will only improve performance.





P.S. For heaven's sake, CanadaOwnsJoo! Get a 30in for those cards! :D
 
I agree that the drivers will only improve performance.





P.S. For heaven's sake, CanadaOwnsJoo! Get a 30in for those cards! :D

had one, felt like i was sitting in the front row of a movie theater, didn't like it much... maybe at the new house if i have more room to move it back :)
 
LOL. Seriously, if your eyes can't see it you just HAVE to get a utility to "help" you see microstuttering.


Well, the thing is that the effect of microstuttering can be a perceived framerate lower than what FRAPS is displaying. Thus a card which is microstuttering may look better in benchmarks but be worse when you're actually playing. You need a benchmark to tell you if cards are within 10% of each other in performance; likewise, you need another program to tell you if a card is microstuttering.
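To put a number on that, here's a toy sketch in Python (invented frame times, not measured data) of how an average-fps counter can overstate smoothness when AFR alternates short and long frames:

frame_times_ms = [10, 40] * 10  # invented AFR-style alternation

avg_ms = sum(frame_times_ms) / len(frame_times_ms)  # 25 ms
reported_fps = 1000.0 / avg_ms                      # what a FRAPS-style counter shows
perceived_fps = 1000.0 / max(frame_times_ms)        # cadence set by the long frames

print(f"reported: {reported_fps:.0f} fps")          # 40 fps
print(f"feels closer to: {perceived_fps:.0f} fps")  # 25 fps

The counter happily reports 40fps, but the motion only advances at the pace of the 40ms frames.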

A quick play-through of Crysis:

Frame, Time (ms)
1, 0.000
2, 49.646
3, 102.039
4, 155.181
5, 202.662
6, 253.703
7, 324.830
8, 375.051
9, 422.995
10, 470.629
11, 521.823
12, 573.016
13, 622.636
14, 676.204
15, 765.499
16, 815.874
17, 864.960
18, 914.017
19, 964.047
20, 1014.496
21, 1067.056
22, 1118.742
23, 1167.424
24, 1219.598
25, 1272.663
26, 1324.092
27, 1372.593
28, 1421.731
29, 1474.627
30, 1525.802
31, 1572.716
32, 1620.740
33, 1666.353
34, 1714.091
35, 1796.860
36, 1844.414
37, 1896.561
38, 1950.143
39, 2003.122
40, 2051.616
41, 2098.554
42, 2151.660
43, 2203.281
44, 2253.847
45, 2299.740
46, 2345.863
47, 2393.667
48, 2440.401
49, 2487.175
50, 2533.083


I put this through Excel and came up with the rendering times between frames:

49.65
52.39
53.14
47.48
51.04
71.13
50.22
47.94
47.63
51.19
51.19
49.62
53.57
89.3
50.38
49.09
49.06
50.03
50.45
52.56
51.69
48.68
52.17
53.07
51.43
48.5
49.14
52.9
51.18
46.91
48.02
45.61
47.74
82.77
47.55
52.15
53.58
52.98
48.49
46.94
53.11
51.62
50.57
45.89
46.12
47.8
46.73
46.77
45.91

I found the average... FRAPS would have been telling you 19fps at the moment you took that. That doesn't appear to show any problems with AFR, though there are a couple of frames in the 80ms range (equivalent to 12.5fps, while the game will be TELLING you 19fps).
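For anyone who'd rather not do the Excel step by hand, here's a rough Python sketch of the same analysis; the file name and the 1.5x spike cutoff are just my assumptions:

import csv

# Read a FRAPS frametimes log: cumulative "Frame, Time (ms)" rows.
with open("frametimes.csv") as f:
    rows = csv.reader(f)
    next(rows)  # skip the header line
    times = [float(r[1]) for r in rows]

# Frame-to-frame rendering times, same as the Excel column above.
deltas = [b - a for a, b in zip(times, times[1:])]
avg = sum(deltas) / len(deltas)
print(f"average: {avg:.2f} ms (~{1000 / avg:.0f} fps)")

# Flag frames that took noticeably longer than average (arbitrary 1.5x cutoff).
for n, d in enumerate(deltas, start=2):
    if d > 1.5 * avg:
        print(f"frame {n}: {d:.2f} ms (~{1000 / d:.1f} fps equivalent)")

On the Crysis log above it would report a ~51.7ms average (~19fps) and flag the two frames up near 83-89ms.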

What I'd like to see is more testing like this, for several different cards and SLI/Crossfire setups and in different games. I've seen from one review of crossfired 4870s that there was no microstutter in GRID, but another review did show microstutter in other games (can't remember which ones).
 
Many people didn't notice it... until someone made them aware of the issue... and then they couldn't ignore it.
Same thing with cue marks.
Until made aware of them, you don't "perceive" them... but they are there.

I notice the "burn" (cue) marks practically every time they show up when watching a movie at the theater.

But if I suggested to someone to avoid the theaters over this issue and just wait until the movie ends up on Blu-ray, I'm pretty confident they wouldn't even know what a "burn" mark is. If I then took them to a theater and forced them to watch the top-right corner, just waiting for it to appear so I could go "OHOH! There it is! There it is!", I'm pretty confident even my most mild-mannered friends would go "do I know you?" and the lesser-mannered ones would tell me to "STFU" already...
 
That could just be because of the tri-SLI; it is not a good configuration, and it's known to actually lower performance in several games. I have been told that it plays pretty well on regular SLI, though I have not seen it myself.

And it's still Crysis.
 
I hear there are black holes out in the universe, but no one has seen them with their naked eye. So they exist, AND YET THEY DON'T AFFECT ME IN ANY WAY, SHAPE OR FORM. Stop being a douchebag; just because some program counts a nanosecond difference between frames being displayed doesn't mean that anyone sees it. How many people use multi-card setups? Very few. How many of those people ever see a stutter? Even fewer. Scientifically it MIGHT exist. But maybe 1% of the people who have multi-card setups might see it, and they're only around maybe 5% of gamers to begin with.

STOP BEING A DOUCHE!

Also, I have had 6 different multi-card setups in the last year, and I never PERCEIVE any stutter, thus it does not affect me.
 
GTX 280 Tri-SLI produced high framerates while it was sluttering, which means it does skip some frames. So this is called microsluttering.

I really want to come up with a witty definition of microsluttering, but my creativity fails me.
 
I hear there are black holes out in the universe, but no one has seen them with their naked eye. So they exist, AND YET THEY DON'T AFFECT ME IN ANY WAY, SHAPE OR FORM.

For those of us looking into purchasing a Crossfire setup, 4870X2 or 4850X2, it will affect us IF it exists. I'd like to see more testing and graphs before I decide whether or not it exists, which is why I'm pushing the point, so more people decide to test the frame rendering times of their systems for me to look at. I'd like to think I'm knowledgeable enough to interpret the data; however, the data currently available does not seem trustworthy nor thorough enough to come to a conclusion.

If there were more data from sources I trusted, we could conclude whether or not it exists. If it exists, it is something that'll affect new card buyers: even though it may not produce perceivable stuttering, it may very well skew all the benchmarks done for the 4870x2 and SLI/Crossfire setups.

So...

STOP BEING A DOUCHE!
 
World of Warcraft on my 4870x2

Frame, Time (ms)
1, 0.000
2, 53.738
3, 71.316
4, 89.064
5, 106.887
6, 123.378
7, 140.344
8, 158.099
9, 175.299
10, 191.949
11, 209.121
12, 227.120
13, 244.200
14, 264.539
15, 282.668
16, 299.962
17, 316.151
18, 333.682
19, 351.922
20, 369.009
21, 386.053
22, 402.664
23, 419.759
24, 437.548
25, 461.131
26, 490.602
27, 507.578
28, 524.762
29, 541.364
30, 558.998
31, 575.850
32, 593.213
33, 611.371
34, 647.136
35, 665.598
36, 681.796
37, 699.093
38, 716.519
39, 734.342
40, 751.384
41, 768.837
42, 785.543
43, 801.909
44, 818.937
45, 835.704
46, 853.136
47, 870.413
48, 887.144
49, 904.822
50, 922.038
 
Last I heard on Anandtech, the feature's broken and gains are minimal at best. G'day, hype machine...
 
Yeah, that's what I want to see... In that set of data (it's about 1 sec, as it's running at an average of 53fps over 50 frames) most frames are rendered in ~17ms. Again, it doesn't show the AFR-style microstutter seen in other reviews: all frames are rendered in pretty much the same time except a few, and it's certainly not every alternating frame like some reviews have shown.

This is why I say there is a lack of data to decide.

Those results show ~17ms per frame (59fps) over a time period of ~1 second; however, there are 3 frames which take ~30ms (equivalent to 33fps).

I'd like to see similar data for a few single-GPU setups before I come to any conclusions :) I'm just trying to interpret the data so other people can see what it means.
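If anyone wants to check their own logs for the alternating long/short pattern the AFR reviews showed (as opposed to the occasional one-off spike like in these two logs), here's a crude sketch; the 1.3 cutoff is purely my guess, not a figure from any review:

def afr_ratio(deltas):
    # Average the even-position and odd-position frame intervals separately;
    # ~1.0 means even pacing, well above 1.0 means alternating cadence.
    even, odd = deltas[0::2], deltas[1::2]
    a, b = sum(even) / len(even), sum(odd) / len(odd)
    return max(a, b) / min(a, b)

print(afr_ratio([10, 40] * 10) > 1.3)  # True: invented AFR-style alternation
# run afr_ratio(deltas) on real deltas from the earlier sketch

Neither the Crysis nor the WoW log above shows that signature; their spikes are isolated frames, not every other frame.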
 
Last I heard on Anandtech, the feature's broken and gains are minimal at best. G'day, hype machine...

Well, according to techpowerup, the official reason is:

"The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."

I think ATI is downplaying it a bit. My guess is that once drivers enable it, it will bring a nice hefty performance increase. Perhaps they are waiting until the end of the month? Remember how Nvidia came out with PhysX when the 4870x2 launched? Except that was a bit of a failure. Only time will tell.
 
Well, according to techpowerup, the official reason is:

"The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."

I think ATI is downplaying it a bit. My guess is that once drivers enable it, it will bring a nice hefty performance increase. Perhaps they are waiting until the end of the month? Remember how Nvidia came out with PhysX when the 4870x2 launched? Except that was a bit of a failure. Only time will tell.

I fail to see what PhysX has to do with sideport; why are you mentioning it?

Coming from the comments in Anand's review, I seriously doubt you will ever see sideport working on the 4870X2. If it were coming, why is it not enabled today, after a number of sites chose to include it in their preview and review articles? I would look to the successor of the 4870X2 to perhaps use this feature. It looks like it could be great, but it is a no-show currently; I certainly wouldn't be counting on some magical driver release to activate it.

Is XSP even advertised on the boxes of 4870X2s?



 
Well, according to techpowerup, the official reason is:

"The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."

I think ATI is downplaying it a bit. My guess is that once drivers enable it, it will bring a nice hefty performance increase. Perhaps they are waiting until the end of the month? Remember how Nvidia came out with PhysX when the 4870x2 launched? Except that was a bit of a failure. Only time will tell.

It's been 1.5 months; how can any reasonable person come to the conclusion that PhysX is a failure, in view of the time frames developers will need to incorporate this new technology?!
 
Coming from the comments in Anand's review, I seriously doubt you will ever see sideport working on the 4870X2. If it were coming, why is it not enabled today, after a number of sites chose to include it in their preview and review articles?

"...because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."

- ATI's official response, as stated on techpowerup
 
ummm it's called micro-stuttering and some people don't see it 'cause it's "micro", as in smaller than mini or small or tiny :p
 
It's been 1.5 months; how can any reasonable person come to the conclusion that PhysX is a failure, in view of the time frames developers will need to incorporate this new technology?!

PhysX is not new technology. It's failed technology that Nvidia wants to push.
 
"...because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."

- ATI's official response, as stated on techpowerup

http://www.anandtech.com/video/showdoc.aspx?i=3372&p=3

Here is another, more detailed, so-called "official" response, seeing as you seem to have selective reading. So you tell me: from this response, how do you deduce some massive performance increase if ATI has stated it would only affect minimum frames? I haven't read about some huge minimum frame rate problem on the 4870X2s; are you able to alert us to one?

"According to AMD the performance impact is negligible, while average frame rates don't see a gain every now and then you'll see a boost in minimum frame rates. There's also an issue where power consumption could go up enough that you'd run out of power on the two PCIe power connectors on the board. Board manufacturers also have to lay out the additional lanes on the graphics card connecting the two GPUs, which does increase board costs (although ever so slightly). AMD decided that since there's relatively no performance increase yet there's an increase in power consumption and board costs that it would make more sense to leave the feature disabled. "
 
http://www.anandtech.com/video/showdoc.aspx?i=3372&p=3


"According to AMD the performance impact is negligible, while average frame rates don't see a gain every now and then you'll see a boost in minimum frame rates. There's also an issue where power consumption could go up enough that you'd run out of power on the two PCIe power connectors on the board. Board manufacturers also have to lay out the additional lanes on the graphics card connecting the two GPUs, which does increase board costs (although ever so slightly). AMD decided that since there's relatively no performance increase yet there's an increase in power consumption and board costs that it would make more sense to leave the feature disabled. "

Well, that quote is certainly different from techpowerup's quote below. It's good to get them side by side, as one of them is wrong. Anandtech did make the mistake of publishing a review stating the 4870x2 had 512MB x 2 memory... not to mention that their reviews are pretty vague. I think it's safe to say techpowerup has shown itself to be more accurate, so I would go with the quote below regarding sideport: it's just a matter of drivers utilizing the feature once the bandwidth is needed.

"One novel feature of the HD 4870 X2 is that it has an additional direct-GPU-to-GPU interconnect called CrossFire Sideport (XSP). The XSP offers an additional 5 GB/s interlink bandwidth between the GPUs but is not enabled at this time. Yes, you read correctly. The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update.

One major advantage of the XSP that I see is that transfers between the GPUs would have a lower latency. The Gen2 PCI-E bridge will certainly be fast, but it will take a short time to process the incoming data and send it out to the other GPU (<140 ns). With the XSP's point-to-point interlink this delay is eliminated. So my speculation is that AMD is working on using the XSP feature but the driver support simply doesn't work as intended at this time."

This statement given to techpowerup totally contradicts what was said on Anandtech. Anandtech has made plenty of mistakes before, so perhaps they made another one here.
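For scale, a quick back-of-envelope on the numbers in the techpowerup quote; the 1920x1200 32-bit frame is my assumed example, and only the 5 GB/s and <140 ns figures come from the quote:

frame_bytes = 1920 * 1200 * 4    # assumed 32-bit frame, ~9.2 MB
xsp_bytes_per_s = 5e9            # 5 GB/s interlink (from the quote)
bridge_latency_s = 140e-9        # <140 ns bridge hop (from the quote)

transfer_s = frame_bytes / xsp_bytes_per_s
print(f"one full frame over XSP: {transfer_s * 1000:.2f} ms")          # ~1.84 ms
print(f"bridge hop vs transfer: {bridge_latency_s / transfer_s:.4%}")  # ~0.008%

If that's in the right ballpark, the 140ns hop is vanishingly small next to any frame-sized transfer, so a latency win from XSP would only add up across lots of small GPU-to-GPU transfers, which fits the speculation that the holdup is driver support rather than raw bandwidth.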
 
All good points, but I'm a skeptic. If it works, awesome. If it doesn't, I called it anyway.
 
Originally Posted by Forceman
I really want to come up with a witty definition of microsluttering, but my creativity fails me.

A midget with a speech impediment?

2 midgets in a 3-legged race that can't get their legs synchronized. Win.
 
PhysX is not new technology. It's failed technology that Nvidia wants to push.

It's new technology in the sense that physics can now be calculated on a GPU without the need for a dedicated and costly PPU.

It failed in the form conceived by Ageia because no one was prepared to fork out money for another add-in card when there were no killer apps taking advantage of it, the paradox being that developers were not prepared to develop a killer app because people weren't buying PPUs.

What Nvidia has done has changed the landscape for PhysX: mass adoption of PPUs is no longer a barrier to developing games which harness its power, because anyone who has a modern Nvidia GPU can take advantage of PhysX. The fact that quite a few new games are implementing PhysX is a testament to this.

To make a blanket statement that PhysX is dead technology is short-sighted and misconceived.
 