True. I guess that's part of the nostalgia, if you happened to be in the mood for playing around with those settings. If you weren't, it could be frustrating. Then again, around that time I was fascinated by pre-1.0 Linux, dual-booting, and recompiling kernels, too.
I have a PC and a 360. Lately, I've been playing the PC a lot more, but that'll probably go in phases. It depends on tastes, but I've played the following by myself on the 360:
- Halo 3 campaign
- Beautiful Katamari
- Guitar Hero 2/3
- Dance Dance Revolution Universe 2
- Eternal Sonata
-...
Of the three cards considered, I'd suggest going with either the GTX 260 or the 4870, depending on the particular games played. G92 SLI (9800GX2 included) has a tendency not to play as smoothly as the average framerate would suggest. You'd be happier with a single more powerful GPU.
Some of Nvidia's mobile G84/86 GPUs were bad. That much is known.
However, there is no evidence to substantiate the FUD that G80/G92/G92b cards are bad. I had two 8800GT cards and both worked perfectly well. Others' G92 cards are still kicking right along without issues.
The GTX 200 series...
Just curious: where did you hear that? I was under the impression that both the GTX 260 and 280 were always going to be 10.5" long. (The 8800GT and GTS are 9", and the 9800GTX(+), 9800GX2 and GTX 260/280 are all 10.5".)
Yup, I just turned 24 and still remember having to do that with my 386. That, and editing the FILES=XX, BUFFERS=YY, and STACKS=ZZZZ commands in config.sys.
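For anyone who never had to do this: those lines lived in DOS's config.sys, and you'd bump them up when a game or TSR complained. Something like the following (illustrative values only, from memory, not a tuning recommendation):

```
FILES=40
BUFFERS=30
STACKS=9,256
```

Every game manual seemed to demand different numbers, which is why multi-config boot menus became a thing.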
Yes, more expensive, but I'd consider it if budget permits. Going SLI with G92 could be adequate, but it could also show its age sooner than the OP would like, especially with the fall wave of games. This is particularly true if the OP likes AA.
Yes. An 8800GT has a TDP of 112W.
He stated...
Basically, it's when the average framerate is reported as, say, 45, but gameplay feels perceptibly worse than 45 -- more like sub-30.
It won't occur in every scenario, and not everyone is sensitive to it.
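To make that concrete, here's a quick sketch (my own illustration, not output from any benchmark tool) of how an alternating short/long frame-time pattern -- the classic AFR microstutter signature -- can report a healthy average while the long gaps are what you actually perceive:

```python
# Hypothetical frame times: frames alternate between arriving 10 ms and
# 34 ms apart, as can happen when AFR pacing between two GPUs drifts.
frame_times_ms = [10, 34] * 45  # 90 frames over ~2 seconds

# The benchmark-style number: total frames over total time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# What your eye "latches" onto: the longest gaps between frames.
worst_gap_fps = 1000 / max(frame_times_ms)

print(round(avg_fps, 1))       # ~45.5 -- looks fine on paper
print(round(worst_gap_fps, 1)) # ~29.4 -- what the stutter feels like
```

Same total frame count, two very different experiences -- which is why minimum framerate and frame-time plots tell you more than the average.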
No, different speeds and manufacturers are ok. Faster cards will downclock to the speed of...
I've owned 8800GT, SLI 8800GT, GTX 280, and (dual) SLI GTX 280. For me at least (at 1080p), SLI 8800GT microstuttered badly. One thing to keep in mind is that both the 8800GT and 9800GX2 use G92 cores, and most 9800GX2 owners who switched to the GTX 200 series were much happier with the latter...
One last thing: Are you counting the length of the PCI-E connector that fits into the end of the card? The 8800GTS is itself 9", but you really need at least another inch or so to fit the connector there. The GTX 200 series is 10.5" long, but takes PCI-E connectors on the top of the card, so you...
I have a 750i (Asus P5N-D). Other than voltage tinkering, the only issue that I've had with it is that the SATA RAID driver does not like the presence of SATA optical drives (i.e., invariably bluescreens on startup), even if RAID isn't active. This happened with both XP Pro 32-bit and Vista 64...
My guess is that the Northbridge or FSB/VTT voltage needs to be bumped up a bit. I have a 750i board, and what was stable for me with 2 sticks had to be adjusted a bit for four sticks.
It could be your PSU, but I'd fiddle around with voltages a bit first. Nvidia chipsets seem to be less...
A good 800W might cut it. That said, I've had PSU issues in the past -- most recently a Cooler Master Real Power Pro 750W that didn't like even one GTX 280 (high-pitched screeching, and occasional lock-ups if I let it go on for a few minutes), let alone two.
So I...
Visiontek XG6. Most important reason: Three 4-way control switches on it allow me to play most FPS games one-handed; I don't have enough dexterity in my left hand to do anything gaming-related with it.
It's not his fan that's whining. It's a capacitor somewhere.
OP, see whether you can determine where exactly the high-pitched noise is coming from, and post your system specs. There is a possibility that the PSU is overworked, although that hasn't always been the case with these problems.
That's probably not the fan, but rather coil or capacitor whine. It seems to be a fairly common issue with the power-hungry GTX 200 series. Depending on configuration, the whine could be coming from the PSU, video card, or motherboard. In my case, I actually resolved it by upgrading my PSU.
As far as I know, the recent Nvidia chipsets (6x0i and 7x0i) are in general harder to FSB-overclock than Intel chipsets, all other things equal. That said, it is possible to get a decent overclock (I have my Q6600 at ~3.4GHz), but you'll likely have to live with higher voltages.
Just a guess, but probably because the logic is simpler if everything is a power of two. A factor of 3 in the dimensions would complicate the algorithms.
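A quick sketch of what I mean (my own guess at the reasoning, not anything from vendor docs): mipmap generation repeatedly halves the texture, and a power of two halves exactly all the way down to 1, while a 3x factor leaves an odd size partway through that needs extra rounding rules:

```python
def mip_chain(size):
    """Successive mipmap level sizes down to 1x1."""
    levels = [size]
    while size > 1:
        size //= 2  # for a power of two, this is an exact single bit shift
        levels.append(size)
    return levels

print(mip_chain(256))  # [256, 128, 64, 32, 16, 8, 4, 2, 1] -- every halve is exact
print(mip_chain(96))   # 96 = 3 * 32: [96, 48, 24, 12, 6, 3, 1] -- hits an odd 3
```

Once you hit that odd 3, "half" is ambiguous (1? 2? filter weights change), which is the kind of special-casing hardware designers presumably wanted to avoid.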
A single GTX 280 should serve you well at 1920*1200. Get one, and decide whether you actually need another. If you do, chances are that you're into Crysis or Age of Conan. ;)
720W for two GTX 280 SLI would scare me. You're going to need at least about 40A on your PCI-E connectors. You might...
A 9800GX2 benchmarks well, especially in average framerate, but the vast majority of previous owners of the 9800GX2 and other SLI G92 setups are much happier with a single GTX 200 series card, which outclasses the 9800GX2 in minimum framerate -- the number that actually matters for playability...
Well, if it's any consolation, I think that it's gotten better with the current generation of cards. I was very vocal here about my displeasure with the microstuttering that I saw in SLI 8800GT, but I'm very happy with my SLI GTX 280.
Pretty much. The problem is that many people's eyes will "latch" onto the longest gap between frames and perceive that as the actual framerate for the purposes of smoothness and playability. The term "microstuttering" properly denotes uneven framerates due to AFR timing issues...
EVGA. Worthwhile factory overclocks on the high end (I like the guarantee of a non-trivial overclock), and step-up.
That said, I've bought BFG and XFX in the past, too, and haven't been disappointed.
The problem is that the next frame could take significantly longer or shorter than the first, depending on what's different. I remember reading somewhere that Crysis is/was notorious for "breaking" AFR prediction heuristics in Nvidia's drivers.
Back when I had SLI 8800GT, I actually used a...
I noticed it fairly badly on my SLI 8800GT setup. With SLI GTX 280, it's still there, but much less frequent and much more tolerable when it does occur.
Currently in the middle of:
Oblivion
BioShock
Crysis
Halo (1) PC
Nothing earth-shattering here. I've also been thinking about resurrecting dormant saves for Puzzle Quest and Eternal Sonata on my Xbox 360.
I did a clean install of Vista 64-bit Ultimate OEM (with SP1 *not* initially installed) with 8GB RAM on an Asus P5N-D (750i)-based system. No memory-related problems. (I did have a known issue with part of the Nvidia SATA driver not playing nice with my SATA optical drives, but that's unrelated...
SLI (and a bunch of other things) were reportedly broken on Crysis before 1.2. If you haven't patched to 1.2.1, do so first. I would guess that you should expect something like 40-50% scaling in Crysis at 1680*1050 with SLI 8800GT over a single card.
Same. I didn't much like 8800GT SLI with the 174/175 series drivers -- which is primarily why I was so quick to get a GTX 280. I later tried the waters again with a second GTX 280. Much better. Still some microstuttering, but never as unbearable as it was (for me) with G92.
Two factors common to...
"At what settings?" is the other, implicit half of that question, although if someone's looking for something better than pseudo-Very High custom configs, my guess is the next step would be stock Very High. ;) At 1680*1050 0xAA 16xAF, one GTX 280 would probably be a bit choppy in parts. Two...
Earlier versions of Crysis may not have, but, aside from a few glitchy sections where changing AA resolved the issue, I've never seen less than ~50% scaling in two-way SLI with either 8800GT or GTX 280, at 1080p.
I'd personally go with the GTX 260. Yes, the 9800GX2 may have a higher average framerate, but the 260 should have higher minimum framerates. Plus, if you like AA / high resolutions, the frame buffer and memory bus are larger. And you don't have to worry about whether a game scales in SLI...