It feels like the creation of the Perfect Storm of PC problems: Vista 64, Nvidia drivers, and a mix of cards. All I need is some dramatic music as I BSOD, and the nightmare would be complete.
Short story:
My current setup consists of an MSI 975X board, an E6600, 4GB of RAM, and two video cards: an 8800GTS and an 8500GT.
I want to run 3 monitors, and the 3 monitors are:
Main: Dell 3007, 2560x1600
Secondary (attached to 8800GTS): Viewsonic 22", 1600x1200
Tertiary (attached to 8500GT): Viewsonic 22", 1600x1200.
The good news? Each card runs fine when the other card is disabled/not in the system. Hell, the 8500GT can run the Dell monitor, and it's quite amusing to do so.
However, as soon as I attempt to start the machine with both cards in/enabled, disaster strikes.
Generally, the system gets to the point right before I'd press "ctrl-alt-delete" (domain ftw) to log in, and the screen just stays black. The Dell monitor stays powered, as in, it doesn't "go to sleep", but just remains black.
After a few minutes (literally, minutes: I can take a shower and come back, and it's still doing it) it BSODs. No driver is named in the BSOD, but the stop code 0x00000124 is. Googling this brings up... well, not much of use.
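In case it helps anyone's Google-fu later: on Vista, stop 0x124 is WHEA_UNCORRECTABLE_ERROR, i.e. a hardware-reported fault (often something on the PCIe bus) rather than a specific driver crash, which would explain why no driver gets named on the screen. If I get ambitious, the plan is to load a minidump in WinDbg and see what the error record says; roughly like this (the dump filename below is just an example, yours will differ):

```
rem Open a crash dump from the usual Vista minidump folder (example filename)
windbg -z C:\Windows\Minidump\Mini070707-01.dmp

rem Then, in the debugger command window:
!analyze -v     <- dumps the analysis, including the WHEA error record
lm              <- lists loaded modules, in case a driver is implicated after all
```

No promises that it points at anything more useful than "your hardware said ow", but it beats staring at the hex code.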
I've tried every driver I know of: the official Nvidia releases, the betas, all of them, with the same problem.
The odd thing is, I've gone through a few generations of Nvidia cards, and the 8 series pairing behaves the "worst", I guess you'd call it. I had a 6200 in the system before, and while the drivers loaded fine, it BSOD'd/crashed hard whenever I went to enable the monitor in Display Properties->Settings. That is, the system saw the video card, saw that it had two monitor connections, but crashed the moment I tried to enable it.
The same thing happened when I swapped the 6200 for a 7600GS: drivers installed fine, crash on enable.
Then I went to Circuit City (god...) and bought the 8500GT, since I wanted to try it. Ironically enough, it rang up wrong, and I'll be getting it for ~90 bucks. Still too much for a piss poor card like this, but hey, it's a third-monitor card. But I digress.
So, any ideas, suggestions, mockeries? I'm trying to do everything right here: I went out and bought a "LOL OMG NEXT GEN!" video card to run with my "LOL OMG NEXT GEN" video card that is actually good, and I'm not trying to jury rig a GeForce4 to run with my 8800. This shouldn't be this hard. It simply shouldn't be. I feel like I'm missing something.
Please help me Obi-Wan, you're my only hope. Or something.
And for the record, I did "report a bug" with Nvidia, which I'm sure will be about as useful as pissing into the wind and trying to stay dry. The bug-report link, for reference: http://www.nvidia.com/object/vistaqualityassurance.html