4870 X2 XSP Sideport feature poised to counter rumored GTX 280 GX2

But if he doesn't notice it, it doesn't matter, does it?

You come flaming everyone here with your 'evidence' in the form of some graphs, but what's the relevance of those if you're measuring something that's not visually perceivable?

I have a 9800GX2, so I should have micro-stutter too, but so far I haven't managed to notice anything. I even tried a few tests and paid extra attention, but found none whatsoever.

If it's technically measurable like in your stupid little graphs, I sure believe you. It's there. But I cannot see it, so I don't care. This has nothing to do with ignorance whatsoever. Why should I change my hardware for something I cannot see? Why do these micro-stutter crusaders keep trying to tell everyone not to buy hardware setups that include micro-stutter, when it's actually of no relevance?

QFT, as I have seen a few die-hard users going on and on and on about their little micro-stutter problem. I have a 9800GX2, one of the cards "crippled" by this disease. Yet I am perfectly happy with my card, because I don't notice any negative effect from this problem. I'm not denying it exists, but I agree with the above: if I (and apparently many others) cannot notice it without deliberately searching for it, then it has no negative effect on our gameplay, and therefore it is acceptable not to care! We might be able to get higher/smoother frames without it, but again, if we're not getting owned because of it, and we're still able to ENJOY OUR GAMES, then who the F cares?

Let's say there's a rendering defect in a game that causes a character to have an extra triangle in his/her foot, but because of the positioning of the erroneous polygon, one can only see this defect by pausing the game and activating NOCLIP (or using the level editor). Is it there? YES. Does it matter? NO, if you don't notice it.

Now, if you're one of those people who is offended simply by knowing about a mistake, okay, but please, for the love of God, stop flaming everyone who doesn't see, doesn't know about, or doesn't care about your little micro-stutter!

To recap, so the micro-stutter freaks who only read summaries and see what they want to see can't argue with me:
- Is "microstutter" present in some way that can be measured? YES.
- Is "microstutter" something that every single person perceives? NO.
- Can you accurately measure microstutter with your eyes? NO...but if you cannot detect microstutter, it therefore follows that it does not matter.
 
You miss the entire point.
For now, we can still enjoy stutter-free games on a single GPU.
But AMD is looking at multiple cores as the future... and if NVIDIA follows, I want to be sure that I don't get a DOWNGRADE in my visual experience...

Just like with shimmering, we need to speak up so the issue doesn't get ignored... "thanks" to people like you.
 
Dude, lay off the micro-stuttering horse. You are welcome to come to my place and I will run more than 30 games in front of you on my 4870X2, and if you can show me micro-stuttering I will eat the card myself with the mustard :D It might be there under 30fps, but then again even your single GPU will stutter at low FPS; once you lock it at 60fps it's just butter-smooth frames. I've been running SLI, GX2s and Crossfire for more than 4 years now, and even the original 3dfx SLI, and suddenly now we have this "micro-stuttering". For the last 10 years millions didn't see it, and now we have you to prove to us it's there.

Just relax and let us enjoy our micro-stuttering 8xAA 300fps, and you enjoy your 60fps, stutter-free single GPUs.
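For anyone curious what locking the frame rate actually does for pacing, here's a rough sketch of a fixed 60fps limiter. This is purely my own illustration (nothing from any driver or game), and render_frame is a made-up placeholder; the idea is just that sleeping off leftover time makes frames come out on an even ~16.7 ms cadence instead of whenever each GPU happens to finish.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame


def render_frame():
    """Hypothetical placeholder for the game's render/update work."""
    pass


def run_capped(num_frames=600):
    """Present frames on a roughly even ~16.7 ms cadence by sleeping
    off any time left over after rendering, which is the uneven spacing
    people describe as micro-stutter."""
    next_deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(num_frames):
        render_frame()
        # If we finished early, wait until the next 1/60 s deadline.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        next_deadline += FRAME_BUDGET


if __name__ == "__main__":
    run_capped()
```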
 
It's new technology in the sense that physics can now be calculated on a GPU without the need for a dedicated and costly PPU.

It failed in the form conceived by Ageia because no one was prepared to fork out money for another add-in card when there were no killer apps which took advantage of it, the paradox being that developers were not prepared to develop a killer app because people weren't buying PPUs.

What NVIDIA has done has changed the landscape for PhysX: mass adoption of PPUs is no longer a barrier to developing games which harness its power, because anyone who has a modern NVIDIA GPU can take advantage of PhysX. The fact that quite a few new games are implementing PhysX is testament to this.

To make a blanket statement that PhysX is dead technology is short-sighted and misconceived.

Yeah, but since NVIDIA is the only one with it, will AMD/ATI have to come out with their own version or license it from NVIDIA (if that is even possible)?
 
- Is "microstutter" present in some way that can be measured? YES.
- Is "microstutter" something that every single person perceives? NO.
- Can you accurately measure microstutter with your eyes? NO...but if you cannot detect microstutter, it therefore follows that it does not matter.


See, this is my contention: yes, microstuttering can be measured, therefore it should be measured accurately and compared to single-GPU systems. If this were done accurately, I'd be more than happy to say microstuttering doesn't matter.

To say it doesn't matter because you can't accurately measure it with your eyes is just silly. Can anyone here honestly say they can tell the difference between 35fps and 40fps if the game weren't telling them the framerate? Seriously? I sure as hell couldn't, and I doubt many people could. HOWEVER!! Would you buy the card which is consistently 15% better than the other card?? Of course you would. But wait, didn't I just say you can't tell the difference between 35fps and 40fps? That's around a 15% difference... so logically, by your argument, it doesn't matter and should not affect our choice.

There are HEAPS of things that cannot be measured with our eyes, yet we still use them to decide whether to buy one product or another: we take the information given to us by monitoring programs, weigh it up, and decide which product we want.

I'm still not saying microstuttering is an issue. I'm saying I'd like comprehensive and trustworthy testing done on it. If you were all willing to send me your computers I'd be happy to test it. :p Or pay for the flight to get to your house and test it for you. :p I'd be happy to see comprehensive testing showing that it doesn't exist... that'd strengthen my desire to get a 4850X2 when it comes out. I'm just not willing to call subjective opinions a conclusion on the matter. If subjective opinions counted as conclusions, science and engineering would never exist. :p
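For what it's worth, the testing I have in mind mostly comes down to comparing frame-time logs rather than average FPS. Here's a rough sketch of the kind of thing I'd run; the metric and the file names are just my own choices (not something from any review), and it assumes a FRAPS-style dump with one frame time in milliseconds per line.

```python
import statistics


def load_frame_times_ms(path):
    """Read one frame time (in milliseconds) per line from a log file."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]


def summarize(frame_times_ms):
    """Return average FPS plus two pacing metrics: the 99th-percentile
    frame time, and the mean ratio between consecutive frame times
    (near 1.0 means even pacing; well above 1.0 means alternating
    short/long frames, i.e. micro-stutter)."""
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]
    ratios = [max(a, b) / min(a, b)
              for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return avg_fps, p99, statistics.mean(ratios)


if __name__ == "__main__":
    # Hypothetical log files: one from a single-GPU run, one from dual GPU.
    for label, path in [("single GPU", "single_gpu_frametimes.txt"),
                        ("dual GPU", "dual_gpu_frametimes.txt")]:
        fps, p99, ratio = summarize(load_frame_times_ms(path))
        print(f"{label}: {fps:.1f} avg fps, {p99:.1f} ms 99th percentile, "
              f"{ratio:.2f} consecutive-frame ratio")
```

Two cards could post the same average FPS but very different consecutive-frame ratios, which is exactly the comparison I'd want to see published.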
 
So ATI can use their cards to help render PhysX? Sweet!

I heard NV is offering help to AMD to do this but AMD wasn't that interested. It's a competing tech after all.

But since RV770 pushes more FLOPS, it would be damn funny if PhysX ran faster on AMD cards. Just like that review round-up where the best laptop for Vista turned out to be a MacBook.
 
Yup...and notice how the anti-PhysX crowd suddenly likes PhysX...because now it's green :rolleyes:
It's actually because hardware PhysX is a useful thing now that NVIDIA's able to accelerate it via CUDA. So, we actually have cost-effective options for hardware physics now when we didn't before.

Many NVIDIA owners have gotten hardware physics for "free", which I think is something to get excited about, don't you think?
 
Maybe you're right, maybe PhysX is something to get excited about. Isn't PhysX supposed to be used on ATI cards soon? It would seem to me that PhysX needs to be supported by all video cards in order for it to be successful... I think that's why NVIDIA is helping ATI get it on their cards.
 
It's actually because hardware PhysX is a useful thing now that NVIDIA's able to accelerate it via CUDA. So, we actually have cost-effective options for hardware physics now when we didn't before.

Many NVIDIA owners have gotten hardware physics for "free", which I think is something to get excited about, don't you think?

I do! :D

I would just like to see it get used more...

I hope both sides can run it.
 
I will be impressed with PhysX as soon as I really see it do something. And this is coming from an NVIDIA card owner; in fact, the last 4 cards I have had were all NVIDIA.
 
PhysX still is nothing to really boast about, as games aren't utilizing it right now... and who knows if most games will even implement it. It's not like you're going to see higher fps because of it... or a notable performance increase. Now, the XSP Sideport feature on the 4870X2... that's extra bandwidth which is going to translate into better gaming performance. That is something to really get excited about.
 
Maybe it is and maybe it isn't, at least regarding "better performance". It sounds like it'll have the capability to eliminate MS, which is pretty nice in my book (get on the ball, NVIDIA), but AMD's already commented that any performance advantage would be pretty minimal.

And, yeah, last I heard, NVIDIA's still helping AMD with PhysX support, but not directly. I don't doubt that we'll see PhysX-enabled AMD drivers "soon", which makes hardware physics really exciting.
 
Please stop the micro-stuttering references. 90% of people didn't even notice it until someone pointed it out. I was once in a micro-stuttering thread and a poster asked, "What should I look for to tell if it's going on?"

As for PhysX, it won't go anywhere until both ATI and NV support it, because game developers would otherwise have to make the game compatible with so many different setups that there would be almost no gain in performance. That is why Xbox 360 and PS3 games still look good today: developers only need to code for one hardware scenario.
 
Haven't heard any complaints about micro-stuttering from actual 4870X2 users. Anyhow, I think the XSP feature could quite possibly be the rabbit in AMD's hat; it's just a matter of when AMD wants to pull the rabbit out. I think it just might come out to stay ahead of NVIDIA. We'll soon see, I guess.
 
Didn't AMD say that the feature would add cost and power usage to the boards, and that's why it's not enabled? If that's the case, I doubt the current batch of boards will have the traces for XSP, or they are just smoke-screening. I know it would piss me off if I had paid $500+ for a top-end card that was intentionally crippled.
 
So we have a GTX 280 that the 4870X2 can beat,
and NVIDIA is working on taking the crown back with their next card...
So ATI has Sideport to counter the new card,
but NVIDIA has Big Bang II to counter Sideport...
I can't wait to see what's next! :p

I think we have a little 'Smoke and Mirrors' action going on from both camps right now! ;)
 
http://www.anandtech.com/video/showdoc.aspx?i=3372&p=3

I pushed AMD for a firm commitment on how it was going to handle future support for Sideport and honestly, right now, it's looking like the feature will never get enabled. AMD should have never mentioned that it ever existed, especially if there was a good chance that it wouldn't be enabled. AMD (or more specifically ATI) does have a history of making a big deal of GPU features that never get used (Truform anyone?), so it's not too unexpected but still annoying.
 
Anandtech also said in their preview, after working with the card, that the 4870X2 was going to have 512MB x 2 of memory. I don't think they are on the ball over there. ATI might just be putting up a little smoke screen to make it seem like they aren't holding anything back. Only time will really tell. ;)
 
I think an NVIDIA GTX 280 X2 will easily outperform the Radeon 4870 X2, but then that's just my opinion, especially since the G280 is coming out with GDDR5. Anyway, no upgrade for 12 months for me since we are going to Iraq for a year. By the time we come back, I'll have to build a whole new rig once again.
 
So we have a GTX 280 that the 4870X2 can beat,
and NVIDIA is working on taking the crown back with their next card...
So ATI has Sideport to counter the new card,
but NVIDIA has Big Bang II to counter Sideport...
I can't wait to see what's next! :p

I think we have a little 'Smoke and Mirrors' action going on from both camps right now! ;)

I love that phrase, 'Smoke and Mirrors'.

And I love competition... who's got bigger ballZ?
 
I think an NVIDIA GTX 280 X2 will easily outperform the Radeon 4870 X2, but then that's just my opinion, especially since the G280 is coming out with GDDR5. Anyway, no upgrade for 12 months for me since we are going to Iraq for a year. By the time we come back, I'll have to build a whole new rig once again.

If the clocks are high enough, yeah, a GX2 should perform better, assuming just a die shrink.
 
I think an NVIDIA GTX 280 X2 will easily outperform the Radeon 4870 X2, but then that's just my opinion, especially since the G280 is coming out with GDDR5. Anyway, no upgrade for 12 months for me since we are going to Iraq for a year. By the time we come back, I'll have to build a whole new rig once again.

I don't think so. If you look at the 9800GTX+... you really didn't get a performance improvement over the standard 9800GTX... and the 4850 still owned it. The GTX 280b/GTX 280+ will be the same thing... the only difference is that it will have better yields, run cooler, and NVIDIA will be able to sell it at a better cost.

If it's a GX2...then hey, I would think it would perform better than a 4870x2. By then ATI will turn on XSP to maintain the lead.
 
And how do you know that?
You have no idea if it will improve performance.
IMO, if NVIDIA makes a GTX 280 GX2 (which is just a fanboy's stupid wet dream),
then ATI will have lost the performance lead, but no doubt it will still be competitive, because I doubt NVIDIA could keep GX2 prices low or supplies high.
Fanboy thread is dumb thread.
 
ATI has stated that Sideport isn't enabled because they weren't able to get better performance out of it. If/when they do figure out how to take advantage of it, they'll enable it. Considering this is a feature that was built into RV770 from day 1, I really doubt they're trying to keep it as some sort of secret weapon against nVidia (otherwise, they certainly wouldn't have said that they weren't able to get better performance out of it and they'd be playing up the "potential benefits").
 
"The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."

- techpowerup

Anandtech seems to be about as accurate as Fud lately.
 
It's not even confirmed the current cards have the traces for XSP, so there may be no "enabling" it with a driver update anyway. I'm sure if it can be enabled with a simple driver update, and the cards do have the traces, we won't have to wait for ATi to enable it as some enthusiast will hack the drivers for us.

I also don't see how they can say that the bandwidth is not needed, when it clearly is, unless it's broken right now.
 
"The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."

I also don't see how they can say that the bandwidth is not needed, when it clearly is, unless it's broken right now.

Yeah, since when can you have too much bandwidth?
 
My thoughts exactly. I had this micro-stuttering on my 8800GTS and it went away when I upgraded my system from my X2 4200 to my E8400. Seems to me this micro-stuttering could just as easily be a CPU problem as a GPU/driver problem.

Lol, I think that was just your CPU causing minimum framerates that were visibly below "smooth" (30fps).

FiringSquad had a CPU scaling test for Crysis (on DX9 High, 1920x1200, no AA/AF) when the GeForce 8800GT came out. The average framerate was maybe 4-5 fps apart between the Q6600 and the AMD 3800+... but the minimum framerate differed by 12 fps (8fps vs 20fps)... and with motion blur on, 17+ fps still looked pretty smooth.

That's far more noticeable than what this "microstutter" is (as is screen tearing), since to "see" microstutter would actually require you to be able to mentally compute visual cues within 1/10 of a second...

In normal play, a stutter from 60fps to 40fps over a 1/15-second interval is not going to be noticeable, considering the latency that LCDs have (the average latency usually being 3 to 5 times worse than the manufacturer spec).
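To put that swing in frame-time terms, here's some back-of-the-envelope arithmetic of my own (not from any review): 60fps is about 16.7 ms per frame and 40fps is 25 ms, so the difference being described is only on the order of 8 ms between consecutive frames.

```python
def frame_time_ms(fps):
    """Convert a frame rate to how long one frame spends on screen."""
    return 1000.0 / fps


if __name__ == "__main__":
    for fps in (60, 40, 30):
        print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
    # The "60fps to 40fps" swing above is roughly 25.0 - 16.7 ms:
    print(f"swing: {frame_time_ms(40) - frame_time_ms(60):.1f} ms")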
 
Watch the video and look closely at Maxishine's GTX 280 Tri-SLI.

http://www.youtube.com/watch?v=8Orsk65ib5c&feature=related

The GTX 280 Tri-SLI setup produces high framerates while it is stuttering, which means it skips some frames. This is what's called micro-stuttering.

It was reporting 45fps and the game literally stopped for a second. That can't possibly be micro-stuttering; that would just be a catastrophic failure of SLI, more like a halting stop. For a game to drop down to ZERO fps momentarily either requires area caching or the GPUs taking a crap.

So that can't possibly prove micro-stuttering; it would only prove that the game hangs for entire seconds in SLI mode.
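One way to see that difference in the numbers is a rough sketch like the following (my own illustration, with an invented frame-time trace and arbitrary thresholds): a one-second hang shows up as a single huge frame time, while micro-stutter shows up as alternating short/long frames around an otherwise normal average.

```python
def classify(frame_times_ms, hang_threshold_ms=500.0, stutter_ratio=1.5):
    """Label a frame-time trace (milliseconds per frame).

    A frame taking longer than hang_threshold_ms is treated as a hang
    (the game visibly freezing), while consecutive frames whose times
    differ by more than stutter_ratio are treated as micro-stutter."""
    hangs = [t for t in frame_times_ms if t >= hang_threshold_ms]
    stutter_pairs = [
        (a, b) for a, b in zip(frame_times_ms, frame_times_ms[1:])
        if max(a, b) < hang_threshold_ms
        and max(a, b) / min(a, b) >= stutter_ratio
    ]
    return hangs, stutter_pairs


if __name__ == "__main__":
    # Invented example: ~45 fps on average, but one frame hangs for a second.
    trace = [22.0, 22.0, 23.0, 1000.0, 22.0, 23.0, 22.0]
    hangs, stutter = classify(trace)
    print("hangs:", hangs)                  # [1000.0] -> a stall, not micro-stutter
    print("stutter pairs:", len(stutter))   # 0 -> pacing is otherwise even
```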
 