I could never detect CRT scan lines, but can easily see them thru my video cam........so where does that leave me?
On a sidetrack...
But if he doesn't notice it, it doesn't matter, does it?
You come flaming everyone here with your 'evidence' in the form of some graphs, but what's the relevance of those if you're measuring something that's not visually perceivable?
I have a 9800GX2, I should have micro-stutters too, but so far I haven't managed to notice anything, I even tried a few tests and paid extra attention, but none whatsoever.
If it's technically measurable like in your stupid little graphs, I sure believe you. It's there. But I cannot see it, so I don't care. This has nothing to do with ignorance whatsoever. Why should I change my hardware for something I cannot see? Why do these micro-stutter crusaders keep trying to tell everyone not to buy hardware setups that include micro-stutter, when it's actually of no relevance?
QFT, as I have seen a few die-hard users going on and on and on about their little micro-stutter problem. I have a 9800GX2, one of the cards "crippled" by this disease. Yet, I am perfectly happy with my card, because I don't notice any negative effect from this problem. I'm not denying it exists, but I agree with the above that if I (and apparently many others) cannot notice it without deliberately searching for it, it thus does not have a negative effect on our gameplay, and therefore it is acceptable not to care! We might be able to get higher/smoother frames without it, but again, if we're not getting owned because of it, and we're still able to ENJOY OUR GAMES, then who the F cares?

Let's say there's a rendering defect in a game that causes a character to have an extra triangle in his/her foot. But because of the positioning of the erroneous polygon, only by pausing the game and activating NOCLIP (or using the level editor) can one see this defect. Is it there? YES. Does it matter? NO, if you don't notice it.
Now, if you're one of those people who is offended simply by knowing a mistake exists, okay, but please, for the love of God, stop flaming everyone who doesn't see/doesn't know about/doesn't care about your little micro-stutter!
To recap, so the micro-stutter freaks who only read summaries and see what they want to see can't argue with me:
- Is "microstutter" present in some way that can be measured? YES.
- Is "microstutter" something that every single person perceives? NO.
- Can you accurately measure microstutter with your eyes? NO...but if you cannot detect microstutter, it therefore follows that it does not matter.
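For what it's worth, "measurable" here just means looking at the spacing between consecutive frames instead of the average FPS number. A minimal sketch of the idea, with made-up frame timestamps (not measurements from any real card) showing the short/long pairing that AFR dual-GPU setups can produce:

```python
# Hypothetical frame timestamps in milliseconds. An AFR dual-GPU setup
# can emit frames in short/long pairs even when average FPS looks fine.
timestamps = [0.0, 5.0, 33.0, 38.0, 66.0, 71.0, 99.0, 104.0]

# Frame-to-frame intervals (frame times).
frame_times = [b - a for a, b in zip(timestamps, timestamps[1:])]

avg = sum(frame_times) / len(frame_times)
avg_fps = 1000.0 / avg

# A simple micro-stutter indicator: how much consecutive frame times
# deviate from each other, relative to the average frame time.
jitter = sum(abs(b - a) for a, b in zip(frame_times, frame_times[1:])) / (len(frame_times) - 1)
ratio = jitter / avg

print(f"average FPS: {avg_fps:.1f}, frame-time jitter: {jitter:.1f} ms ({ratio:.0%} of avg)")
```

The FPS counter reports a healthy average, while the jitter number is what those graphs are actually plotting; whether a given pair of eyes can see it is a separate question.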
You miss the entire point.
For now, we can still enjoy stutter free games on a single GPU.
But AMD is looking at multiple cores as the future...and if NVIDIA follows, I want to be sure that I don't get a DOWNGRADE in my visual experience...
Just like with shimmering, we need to speak up so the issue doesn't get ignored..."thanks" to people like you.
It's new technology in the sense that physics can now be calculated on a GPU without the need for a dedicated and costly PPU.
It failed in the form conceived by Ageia because no one was prepared to fork out money for another add-in card when there were no killer apps which took advantage of it, the paradox being that developers were not prepared to develop a killer app because people weren't buying PPUs.
What nvidia has done has now changed the landscape for PhysX: mass adoption of PPUs is no longer a barrier to developing games which harness its power, because anyone who has a modern nvidia GPU can take advantage of PhysX. The fact that quite a few new games are implementing PhysX is a testament to this.
To make a blanket statement that Physx is dead technology is short sighted and misconceived.
Yeah but since Nvidia is the only one with it, will AMD/ATI have to come out with their own version or license it from Nvidia(if that is even possible)?
The API is free...
So ATI can use their cards to help render physX? Sweet!
http://www.tomshardware.com/news/nvidia-physx-ati,5764.html
http://www.bit-tech.net/news/2008/07/09/nvidia-helping-to-bring-physx-to-ati-cards/1
Yup...and notice how the anti-PhysX crowd suddenly likes PhysX...because now it's green
It's actually because hardware PhysX is a useful thing now that NVIDIA's able to accelerate it via CUDA. So, we actually have cost-effective options for hardware physics now when we didn't before.
Many NVIDIA owners have gotten hardware physics for "free", which is something to get excited about, don't you think?
I pushed AMD for a firm commitment on how it was going to handle future support for Sideport and honestly, right now, it's looking like the feature will never get enabled. AMD should have never mentioned that it ever existed, especially if there was a good chance that it wouldn't be enabled. AMD (or more specifically ATI) does have a history of making a big deal of GPU features that never get used (Truform anyone?), so it's not too unexpected but still annoying.
So we have a 280, that the 4870x2 can beat,
and Nvidia is working on taking the crown back, with their next card...
So ATI has Sideport to counter the new card,
But Nvidia has Big Bang II to counter Sideport...
I can't wait to see what's next!
I think we have a little 'Smoke and Mirrors' action going on from both camps right now!
I think an nVIDIA G280 X2 will easily outperform the Radeon 4870 X2, but then that's just my opinion. Especially since the G280 is coming out with GDDR5. Anyway, no upgrade for 12 months for me since we are going to Iraq for a year. By the time we come back, I'll have to build a whole new rig once again.
If it's a GX2...then hey, I would think it would perform better than a 4870x2. By then ATI will turn on XSP to maintain the lead
Your talk is dumb talk. Fanboy thread is dumb thread.
"The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."
I also don't see how they can say that the bandwidth is not needed, when it clearly is, unless the feature is simply broken right now.
My thoughts exactly, I had this micro stuttering on my 8800GTS and it went away when I upgraded my system to my E8400 from my X2 4200. Seems to me, this micro stuttering could as easily be a CPU problem as well as a GPU / driver problem.
Watch the video and look closely - Maxishine's GTX 280 Tri-SLI.
http://www.youtube.com/watch?v=8Orsk65ib5c&feature=related
The GTX 280 Tri-SLI produces high framerates while stuttering, which means it does skip some frames. This is what's called micro-stuttering.
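A quick way to see how a Tri-SLI rig can report high framerates and still look like it skips frames: the FPS counter just counts frames per second, while perceived smoothness is set by the longest gaps between frames. A sketch with invented frame times (purely illustrative, not measured from that video):

```python
# Invented frame times in ms for illustration: bursts of quick frames
# followed by long gaps, the kind of pattern AFR setups can produce.
frame_times = [4.0, 4.0, 40.0, 4.0, 4.0, 40.0, 4.0, 4.0, 40.0]

# What an FPS counter reports: total frames over total time.
avg_fps = 1000.0 * len(frame_times) / sum(frame_times)

# What the eye tracks: the long gaps. If every frame took as long as
# the worst one, this is the framerate you would effectively see.
worst = max(frame_times)
worst_case_fps = 1000.0 / worst

print(f"FPS counter says ~{avg_fps:.0f}, but the long gaps feel more like ~{worst_case_fps:.0f}")
```

So a counter reading in the 60s can coexist with motion that looks like 25 fps with dropped frames, which is exactly the high-framerate-yet-stuttering effect described above.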
"The official reason why the XSP is disabled at this time is because that much bandwidth is not required with current applications and it will be enabled at some point in the future via driver update."
- techpowerup
"When we asked why Sideport was disabled, we were told that there was virtually no difference in performance (in current games at least) between it being enabled and not."
http://www.bit-tech.net/hardware/2008/08/13/amd-ati-radeon-hd-4870-x2/2