AGP vs. PCI (voltage question)

TFR
Modern AGP slots deliver 1.5 volts, right? I was wondering how much juice PCI slots can deliver. I ask because I can't seem to get my Apple 20" LCD to run off my father's PCI FX5200, yet it runs just fine off my Radeon 9500 AGP card. When it's connected to the PCI card, the monitor's power button glows and then turns off after about 2 seconds, never showing anything on the screen. I am using the external DVI-ADC connector. My best guess is that the PCI card doesn't supply enough juice (volts, amps, wattage, take your pick) to power the monitor. Your thoughts are appreciated.

-TFR
 
A PCI slot will put out 3.3 to 5 volts, depending on which PCI version it is.
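
For what it's worth, the limit that actually matters here is wattage, not voltage, and I'm going from memory on the exact figures: a plain PCI slot is specced for roughly 25 W (5 V at up to about 5 A), and an FX5200 is supposed to draw somewhere around 20 W, so the slot itself should have enough headroom for the card. And if that's Apple's powered DVI-to-ADC adapter you're using, the display should be getting its power from the adapter's own brick rather than from the video card anyway.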
 
Okay...I guess it's not an undervoltage problem. Any other thoughts as to why the Apple LCD won't work with the PCI FX5200?
 
Other than the fact that the FX5200 is a steamy pile of crap?

Make sure that the card has the DVI port enabled. You can access these settings through the display properties in Windows.
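
If his ForceWare drivers are laid out anything like the versions I've used, it's roughly: right-click the desktop > Properties > Settings tab > Advanced > the tab named after the card (something like "GeForce FX 5200") > nView Display Settings, and pick the digital (DVI) output there. The exact wording moves around between driver versions, so poke around that tab if it doesn't match.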
 
Other than the fact that the FX5200 is a steamy pile of crap?

Yeah, it's not a [H]ard card, but not everyone needs a 6800 or an X800.
 
Agreed, but even for low end, it sucks.

Nvidia took a chipset that struggled to do anything to begin with and dumbed it down further. Take something that's already slow, cut away half of it, and what's left is the equivalent of the FX5200.

Anyway, did you check to make sure that the DVI port was enabled?
 
Ok...here's the situation. I happened to see on the FX5200's box that a 250W PSU is required. After a little snooping, I found out that my father's PSU was only 220W. Luckily, I had a 250W PSU in my parts bin and I swapped them. After an hour and a few reboots, I found out that the Apple monitor will power up only if I have another monitor connected to the VGA port at the same time. Here's the sequence of events:

1) Boot up the computer. POST shows on the VGA-port monitor and Windows XP starts loading.
2) Right before the Welcome message, the VGA-port monitor loses its video signal and the Apple monitor powers up and displays the Windows XP desktop. From that point I can keep using the computer on the Apple display, even if I disconnect the other monitor.
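
On the wattage, my math is rough since I'm only guessing at the exact draws: a CPU, motherboard, RAM, drives, and fans from that era can easily pull somewhere in the 120 to 150 W range on their own, and the FX5200 adds something like another 20 to 25 W, so a 220 W supply doesn't leave much headroom. I'm guessing that's why Nvidia asks for 250 W on the box.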

I did check whether the Nvidia drivers had DVI enabled. The best I could find was an option to switch between analog, digital, or both; I selected digital and the Apple LCD worked.
The only problem now is that whenever I reboot, the Apple LCD won't display anything unless I have another monitor plugged into the VGA port. Heck, the Apple display won't even power on. Am I forced to keep two monitors connected to the video card just to make the Apple LCD work? All I want is to boot up using only the Apple display.

-TFR

P.S. I could leave the computer on 24/7 and just turn off the Apple monitor. The only problem is that the brightness and power buttons don't respond when the display is on, so there's no way to turn the damn thing off!
 
Oh and I did try the monitor with my system (see sig) and the Apple LCD works just fine.
 
TekieB said:
I was right about to do that
 