Search results

  1. What happened to the LG W3000H?

    I agree that the ZR30W is a fantastic screen (I own one too) :) It's looking like I'm going to have to sell it though, as it won't talk to my other two screens in an nvidia surround setup (Hazro HZ30Wi - also with a low input lag at 14ms). In theory I could sell the two Hazros, and replace...
  2. Sync polarity - does anyone understand this beast? (Re: nvidia surround problems)

    I'm having problems trying to switch the sync polarity of one of my screens... I have an HP ZR30W (30" 2560*1600 screen), and recently bought two Hazro HZ30Wi (same size and res) to work with my pair of GTX480s in an nvidia surround setup. Surround would not configure, giving me a message...
  3. Microstutter in latest-generation cards (GTX480 examples inside)

    To briefly address your points: First, as has been stated many times, under CPU-limited circumstances you will see very little microstutter, as the GPU output syncs to the regular output of the CPU. I don't have civ 5 to test, but I can't imagine it's GPU intensive enough to stress a 5970...
  4. NV GTX 460 1GB SLI vs. ATI HD 5870 CFX Redux @ [H]

    Read the thread I have on the subject. Like I said, it isn't a "noticeable" phenomenon, in the sense that you'd look at a game scene and say "hey, this is stuttering". It just reduces the apparent framerate below what the FPS counter says (see the sketch appended after this list). In other words, 60fps with microstutter is not the...
  5. NV GTX 460 1GB SLI vs. ATI HD 5870 CFX Redux @ [H]

    Any chance you can consider framerate irregularity (aka "microstutter") when comparing the two setups? This phenomenon has a real impact on the performance of a setup, which does not show up in raw-FPS benchmarks. If you're interested, I wrote a program to quantify the apparent performance...
  6. Microstutter in latest-generation cards (GTX480 examples inside)

    Um... :confused: The name is not representative, I agree, and micro-fluctuations is a better way of describing the effect. I said in my original post that microstutter was a crappy name for the effect, as it is not really stuttering. But micro-fluctuation is precisely what I am measuring...
  7. Microstutter in latest-generation cards (GTX480 examples inside)

    Yes, you should :) As NKDietrich pointed out, adding a second GPU still improves real-world performance in almost all circumstances. First, consider GPU-limited circumstances (which will be most of the time in demanding games at 2560 res). Adding a second GPU will generally net you a 70 to...
  8. Valve surprises

    I'm hoping / expecting that HL3 is one of them. I think Valve realised at some stage that the idea of episodic content wasn't really compatible with their "flagship" Half-Life title, since it takes so much time to refine and polish a high-quality FPS like that. It seems natural that episode 3...
  9. Microstutter in latest-generation cards (GTX480 examples inside)

    Thanks for the reply, Argh - it's good to see someone else trying to draw attention to the microstutter problem! I completely agree that the lack of attention from review sites is contributing to the issue being ignored or underestimated in the community. I think that if just one major review...
  10. Microstutter in latest-generation cards (GTX480 examples inside)

    As I mentioned before, it's not something you can really "just notice". The difference between microstutter and regular drop in FPS is, for the most part, indistinguishable. It's only when the framerate drops to a crawl (say <25fps), when you can catch individual frames being output, that you...
  11. Microstutter in latest-generation cards (GTX480 examples inside)

    Thanks for the support, butterflysrpretty :) Perhaps I am overreacting... I don't really believe that there is some big conspiracy not to address microstutter. But, I do believe that it will take more awareness before anything is done about it - particularly because addressing it will require...
  12. Microstutter in latest-generation cards (GTX480 examples inside)

    The problem is, anything that can be done to reduce microstutter will mean at least a small reduction in average framerate (either switching to a "multiple GPUs, single frame" rendering method, or delaying output of frames in AFR mode). Since it's the average framerate that sells cards (via...
  13. Microstutter in latest-generation cards (GTX480 examples inside)

    I agree completely. I dislike the idea of AFR - irregular frame output is inevitable. Unfortunately, it's the mode which requires the least communication between GPUs, and so offers the best multi-GPU scaling in average FPS benchmarks. Since this is what the performance of the cards is judged...
  14. Microstutter in latest-generation cards (GTX480 examples inside)

    Ignore the graphs - they are just for illustration. In the program I posted above, I have simply applied well-known statistics to the data, in order to produce a non-dimensional quantification of the framerate variation away from a local mean (see the sketch appended after this list). This gives a perception-independent...
  15. Microstutter in latest-generation cards (GTX480 examples inside)

    Thanks :) I ran that frametime log in the program, and it came out with an index of ~18 - similar to the GTX480s under full load in Crysis: Worryingly however, looking closer at the frametime log there seem to be long periods of virtually no microstutter, mixed in with some periods of...
  16. Microstutter in latest-generation cards (GTX480 examples inside)

    The effect can be quantified - I did so in the program I posted above. Clearly it varies depending on the game, and the hardware (what doesn't?), and also the GPU load, but the effect on the person is not going to change. As I stated earlier, it isn't something you either notice or don't - it's...
  17. Microstutter in latest-generation cards (GTX480 examples inside)

    Well, it's not something that you can "just notice". It's not like tearing, or visual artifacts. Without two game scenes running side-by-side at the same framerate, you don't know it's there. It just reduces the effective framerate of the game. That is, the framerate your FPS counter is showing...
  18. Microstutter in latest-generation cards (GTX480 examples inside)

    Okay, I posted this over at XS-forums as well, but I'd love to generate some awareness with the [H] community as well. It seems to me that microstutter is a massively overlooked feature of multi-GPU setups, which really makes a difference to gameplay. Since [H]ardOCP have always been at the...
  19. HD58xx is already out

    A paper launch is fine by me. I'm not going to be able to get a card until they're actually available, and that isn't going to happen until there are enough cards actually being produced. If I can get an idea of the performance in the meantime, then all the better. I don't really see the...
  20. Whats the deal with 4870 and vsync

    The control-panel option will enable 'regular' double-buffered vsync in D3D or OGL, but will only enable triple buffering in OGL. This is quite annoying, since triple-buffered vsync is clearly superior to double-buffered! (See the sketch appended after this list for why.)
  21. DX9L

    This seems logical. After all, DX10 is one of MS's main selling points for Vista. Why would they offer it freely on XP, and so dissuade people from upgrading? It makes no business sense...
  22. G80 Pr0n

    Brilliant... :D
  23. ... How do i enable tr aa on my 7800gt?

    Don't bother with 8xS AA. The reason you're experiencing delays is most likely paging between video and system memory. When using super-sampling AA, as 8xS does, you effectively have to evaluate a higher-resolution version of the scene before sampling it down (see the memory arithmetic sketched after this list). As well as being very...
  24. "Quad" SLI = Bad math?

    Scalable Link Interface http://en.wikipedia.org/wiki/Scalable_Link_Interface (sorry :o )
  25. Random reboot problems... (minidumps attached)

    Thank you :) The odd thing is, though, that it will run memtest all night with no problems (it's built into my DFI BIOS). Could it be a cache memory issue with the CPU?
  26. A64 OC Data

    Got an Opteron 146 today (1MB cache, 2.0GHz stock). Running at 2.9GHz at the moment, on a stock FX55 cooler (I broke the opty's stock cooler when installing it :o ). 3GHz is pretty unstable. Prime fails after about a minute, and 8M super-pi fails, but at 2.9GHz it's rock solid. I might see...
  27. Random reboot problems... (minidumps attached)

    Ah - thank you! Yes - I can run dumpchk now :) Now, I just need to learn to make sense of this info :p I think that being able to read minidumps would be a useful tool for the future though, so it's worth doing.
  28. Do I need a new CPU to match my nVidia 7800GTX?

    You'll be fine. First up, CPU limitation is a game-by-game, scene-by-scene thing. You will be CPU limited in certain parts of some games, whereas in others (FEAR for example) you'll still be very much GPU limited throughout. I'm running an Opteron 144 (effectively a 3000+ with 1MB cache)...
  29. Random reboot problems... (minidumps attached)

    I've been having problems with random reboots. This problem has persisted over many recent hardware changes, and a format, so I'm having difficulty pinning it down. I generally get reboots roughly every couple of hours, but after one it seems much more likely that I will get another soon...
  30. not exact SLI?

    Not quite. You need the same BIOS version on both cards. You can flash the BIOS of one of the cards, if they're not compatible.
  31. XFX 7800GTX OC wow...

    Well that's what I'm doing. The thing is though, in the computer hardware world you can sit around forever waiting for prices to drop and new hardware to come out. There's no fixed date on when the new cards will actually be available from ATI and nvidia, although I would hope they're both...
  32. ATI a takeover target?

    Then buy some cheap stock! :D
  33. XFX 7800GTX OC wow...

    Heh :p well, I've got a pair of 6800GTs at the moment, so I'm wanting to see a reasonable performance increase next time I upgrade. At the moment I'm getting similar performance to a single 7800GTX, but a 32-pipe card should allow me to go back to a single card, and still see a performance...
  34. XFX 7800GTX OC wow...

    I'm hoping that either the 'ultra' or one of the R520 variants will be a 32-pipe card. If not, I think I'll be disappointed.
  35. R520 Delayed?

    It's the only logical explanation.
  36. Nvidia GeForce 7800GTX SLI scores 22,000 in 3dmark05

    Bear in mind that those who have an SLI system are more likely to have other high-end hardware to go with it, and be more enthusiastic about or skilled in clocking / optimisation. A lot of the single GT scores on that chart are run at stock speed, which brings down the average somewhat. None of...
  37. Nvidia GeForce 7800GTX SLI scores 22,000 in 3dmark05

    We don't see a 100% increase from SLI, even in an idealised environment like 3dmark05. I get around 6,000 with a single 6800GT, and 10,500 maximum with them in SLI (see the scaling arithmetic appended after this list). You would also expect the percentage increase from adding a second card to decrease as you move towards a higher fill rate, and...
  38. Nvidia GeForce 7800GTX SLI scores 22,000 in 3dmark05

    The number of pipelines is the number of pixel-shader units running in parallel. If you have two 16-pipe cards running in parallel (with SLI or crossfire), then you have 32 parallel pipelines. Simple as that. You could be thinking of memory. Since each card has to store the entire texture set...
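
A few of the microstutter posts above (results 5, 14 and 15) describe a program that reduces a frametime log to a single non-dimensional index of variation away from a local mean. The original program and its exact statistic aren't shown in these snippets, so the following is only a minimal sketch of one such measure; it assumes a plain-text log with one frame time in milliseconds per line, and the window size and percentage scaling are illustrative choices, not the original tool's.

```python
# A minimal sketch (not the original program): quantify frame-time variation
# away from a local mean, as a non-dimensional percentage.
# Assumes a plain-text log with one frame time in milliseconds per line;
# the real tool's input format and exact statistic are not shown in the posts.

import sys
from statistics import mean

def microstutter_index(frame_times_ms, window=30):
    """Mean absolute deviation from a local (rolling) mean, as a % of that mean."""
    deviations = []
    for i in range(len(frame_times_ms)):
        lo = max(0, i - window // 2)
        hi = min(len(frame_times_ms), i + window // 2 + 1)
        local_mean = mean(frame_times_ms[lo:hi])
        if local_mean > 0:
            deviations.append(abs(frame_times_ms[i] - local_mean) / local_mean)
    return 100.0 * mean(deviations) if deviations else 0.0

if __name__ == "__main__":
    times = [float(line) for line in open(sys.argv[1]) if line.strip()]
    print(f"frames: {len(times)}, index: {microstutter_index(times):.1f}")
```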
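Results 4, 10 and 17 argue that 60fps with microstutter does not feel like 60fps with even frame pacing. The toy calculation below uses made-up frame times, and its "effective fps" definition (the longer frame of each AFR pair) is chosen purely for illustration; it simply shows how a counter can read ~60 while the output feels closer to 40fps.

```python
# Illustration (assumed definitions, not the original analysis): why an FPS
# counter can read 60 while irregular AFR pacing feels slower.

def fps_from_times(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

even_pacing = [16.7, 16.7] * 300           # 60 fps, perfectly regular
afr_pacing  = [8.0, 25.4] * 300            # also ~60 fps on the counter

print(f"counter fps (even): {fps_from_times(even_pacing):.0f}")
print(f"counter fps (AFR):  {fps_from_times(afr_pacing):.0f}")
# The longest frame in each pair dominates what you perceive:
print(f"'effective' fps (AFR, worst frame of each pair): {1000.0 / 25.4:.0f}")
```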
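Result 20 claims triple-buffered vsync is clearly superior to double-buffered vsync. The sketch below is a simplified model of the usual reasoning (it ignores input lag and driver-specific behaviour): with double buffering the GPU stalls until the next refresh, quantising output to 60/30/20/15fps on a 60Hz screen, while triple buffering only caps output at the refresh rate.

```python
# Simplified model (an illustration, not a driver-accurate simulation) of why
# triple-buffered vsync beats double-buffered vsync when the GPU can't hit
# the full refresh rate.

import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ

def double_buffered_fps(render_ms):
    # GPU stalls until the next vblank, so each frame occupies a whole
    # number of refresh intervals: 60, 30, 20, 15 fps, ...
    return REFRESH_HZ / math.ceil(render_ms / REFRESH_MS)

def triple_buffered_fps(render_ms):
    # GPU keeps rendering into a third buffer, so the output rate is only
    # capped by the refresh rate.
    return min(1000.0 / render_ms, REFRESH_HZ)

for render_ms in (10, 20, 25, 40):
    print(f"{render_ms:>3} ms/frame: double {double_buffered_fps(render_ms):5.1f} fps, "
          f"triple {triple_buffered_fps(render_ms):5.1f} fps")
```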
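Result 37 quotes roughly 6,000 3DMark05 points with a single 6800GT and 10,500 with two in SLI; the scaling percentage follows directly from those two figures:

```python
# Scaling arithmetic for the 3DMark05 scores quoted in result 37.
single, sli = 6000, 10500
scaling = (sli - single) / single * 100
print(f"SLI gain over a single card: {scaling:.0f}%")   # 75%, well short of 100%
```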
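Result 23 attributes the 8xS slowdown to paging between video and system memory. The rough arithmetic below illustrates how render-target memory grows with the number of samples per pixel; the resolution, the per-sample byte counts and the 256MB card size are assumptions made for illustration, not figures from the post.

```python
# Rough arithmetic (illustrative assumptions only) for why super-sampling can
# push a card into paging: render-target memory grows with the sampled area.
# Assumes 32-bit colour + 32-bit depth/stencil per sample; real drivers differ.

BYTES_PER_SAMPLE = 4 + 4   # colour + depth/stencil

def render_target_mb(width, height, samples):
    return width * height * samples * BYTES_PER_SAMPLE / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"1600x1200 @ {samples}x samples: {render_target_mb(1600, 1200, samples):.0f} MB")
# On a 256 MB card of that era, 8x sampled targets plus the texture set can
# exceed local memory, forcing data out to system RAM.
```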