So, I didn't want to put my monitor's model number in the title, because I don't want this thread to be viewed as "yet another..."
I'm dealing with a Gateway XHD3000, though, and it's gotten itself into a somewhat interesting predicament.
At first I thought, oh god, it's failed, like the whining guy on YouTube with the green lines, but I don't think that's the case, given that it "works" fine.
Basically what happens is, after a random period of time (sometimes right when it's turned on, sometimes after it's been on for a while), it starts to VERY mildly flicker. It seems related to temperature more than anything else: it shows up if the monitor gets fairly warm, or if the room is fairly cool (we're talking like 60°F, not 20°F; yes, I realize it has a minimum operating temperature). I've got an ancient 17" LCD that does the same thing when the room gets cold enough, and it stops once it warms up. The Gateway does this most of the time if it's started when the room is cold, or after it's been shut off for a while, although sometimes the flicker just comes and goes.
So I was thinking, well, the power brick is external, so I can rip it open and have a peek. With a friend and a decent multimeter, we opened it up, and roughly what we found:
all caps and transformers are working
nothing is burned/charred
12V output rail (it has a 12V and a 24V output) is driving about 12.2-12.3V unloaded
24V output rail is driving about 0V
However, the monitor still powers on and works just fine, apparently in spite of that dead 24V rail, even though that rail is supposed to supply something like 70% of its rated power. We didn't leave it hooked up on a bench for hours to see if the rail comes and goes, but in the hour or two we had it open, it never gave us any power.
So I'm thinking, well damn, the PSU is borked, but that can be replaced.
Here's where it gets WEIRD:
So I turned it on tonight, after three straight days of ZERO flickering, and the flicker starts up. I wasn't in the mood to ignore it, since I had work to do and didn't feel like switching to my laptop or another monitor, so I thought, hmmm, what happens if I drop the resolution? It's normally 2560x1600; I put it to 1920x1200 and BAM -> ZERO FLICKER, ZERO ISSUES.
So I started thinking: what have I been doing with it for the last three days that avoided the flicker? Mostly watching TV or movies, which means 720p or 1080p via HDMI and YPbPr.
The only other thing it's done is Fallout 3 @ 1600p.
So what I'm wondering is: why is there zero flicker when the resolution goes down a notch, and why do certain applications seem to "cure" the flicker? Also, the flicker will sometimes only present with specific applications on the desktop at full res: it seems to HATE Pidgin (and anything else using GTK), somewhat dislikes Opera, and has zero problems with Outlook.
And yeah, I considered that the flicker might just be hard to see with motion on the screen, but if you feed it static images (like a solid white background) @ 1600p, you get some very mild flicker (fullscreen Pidgin -> flicker; grey/white background -> not so much). At 720/1080/1200p it's impossible to reproduce, and same in FO3 (whether in the main menus, the config screens, or with the Pip-Boy (in-game menu) open).
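For what it's worth, the link-bandwidth gap between those two modes is big. Here's a rough back-of-the-envelope pixel-clock estimate; the frame totals are CVT reduced-blanking values, which is my assumption (the XHD3000's actual EDID timings may differ slightly), but the conclusion holds either way: 1920x1200@60 fits in single-link DVI (under the 165 MHz TMDS limit), while 2560x1600@60 forces dual-link operation.

```python
# Rough pixel-clock estimate for the two modes, using CVT reduced-blanking
# frame totals (my assumption; the monitor's real EDID timings may differ).
def pixel_clock_mhz(h_total, v_total, refresh_hz=60):
    """Pixel clock = total pixels per frame * frames per second, in MHz."""
    return h_total * v_total * refresh_hz / 1e6

# mode name -> (horizontal total, vertical total) per CVT-RB
modes = {
    "2560x1600": (2720, 1646),  # full native res
    "1920x1200": (2080, 1235),  # one notch down
}

for name, (h_tot, v_tot) in modes.items():
    clk = pixel_clock_mhz(h_tot, v_tot)
    # Single-link DVI tops out at a 165 MHz TMDS clock
    link = "dual-link" if clk > 165 else "single-link"
    print(f"{name}@60Hz ~ {clk:.1f} MHz -> {link} DVI")
```

So if anything in the chain is marginal at the higher link rate (cable, TMDS receiver, or the panel electronics limping along on the half-dead supply), it could plausibly explain why the flicker only shows at full res. That's speculation on my part, not a diagnosis.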
tl;dr version:
why has my monitor broken itself to be more power efficient, and is it gonna progress to arcs and sparks?
and
why does my monitor hate certain applications, or why do certain applications hate my monitor?
Also, yes, I've already tried the alternate DVI frequency and reduced DVI frequency settings in the ATI Control Panel; all four combinations seem to have no effect on the monitor at all.
anyone?