Windows 7 CRT support sucks? Can't even get max res

shurcooL

[H]ard|Gawd
It seems to be a given that 7 has crappy support for refresh rates other than 60 or 75 Hz.

Obviously none of the old, proven apps like ReForce work around it, so you're pretty much stuck at 75 Hz for every resolution.

In addition to that, I can't even get to the highest resolution on my 21" CRT.

I can get to 2048x1536@60Hz, 1920x1440@72Hz, and 1920x1200@85Hz with little trouble on XP (same machine).

But I've had absolutely no luck in being able to select anything higher than 1600x1200 on the CRT in 7.

Does anyone know any way to force-enable higher resolutions? They're not listed under "All Modes" either.

Specs:
Windows 7 RC 32-bit
8800 GTX with latest drivers for 32-bit Windows 7 (185.85)
24" LG L246WP connected via DVI
21" Sun CRT (essentially a rebranded Sony Trinitron; I got it for free) connected via VGA

I had the LCD disabled and was using only the CRT when trying this stuff. I know that on XP I can't get past 1920x1200 on the CRT when both monitors are enabled (weird, huh).

It's a shame, because I was really liking 7 otherwise, but this is a huge turn-off for me. I know I wouldn't be using those resolutions every day, but I do want to mess around with them every now and then (especially for games).
 
If it makes you feel any better, 1600x1200 is the typical resolution for a CRT of that size. Compare it to any conventional 20" LCD monitor; they're either 1600x1200 or 1680x1050.
 
Thanks, but unfortunately that doesn't make me feel better. I would like to be able to get the most out of my hardware, so when a newer OS doesn't allow that, I tend to get disappointed.

Besides, I really like high-density resolutions, where the pixels become very small. In any case, I would only need to actually use that resolution <5% of the time, when I'm bored or curious to see how something would look at that res, etc.
 
Most likely the monitor doesn't have the technology that allows the OS to detect the modes it can support. I forget the acronym for it.

I would try to find out if there is a monitor driver (yes, there is such a thing as a monitor driver ;)) that tells the OS what modes the monitor supports. "Default Monitor" only has a subset of the most-used modes and refresh frequencies. Hell, even at 1600x1200 I bet it won't let you past 60 Hz by default, when your monitor probably supports 75 or 85 Hz even at 1920x1200.
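
If you want to see exactly which modes and refresh rates the OS is currently willing to offer, here's a rough little Win32 sketch (untested on the 7 RC; link against user32, and NULL as the device name just means the primary display) that walks the mode list:

/* Rough sketch: list every mode Windows currently offers for a display,
   so you can see whether 2048x1536 (or anything above 60/75 Hz) shows up
   at all. Compile with a Windows C compiler and link user32. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* NULL = the primary display. Pass a name like "\\.\DISPLAY2"
       (from EnumDisplayDevices) to query a specific output instead. */
    while (EnumDisplaySettingsExA(NULL, i, &dm, 0)) {
        printf("%4lux%-4lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
        i++;
    }
    printf("%lu modes offered in total\n", i);
    return 0;
}

If 2048x1536 never shows up in that list, the filtering is happening before the display control panel even gets involved.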
 
Most likely the monitor doesn't have the technology that allows the OS to detect the modes it can support. I forget the acronym for it.

EDID
 
That sort of makes sense about EDID. Yet it works fine in XP as a "Default Analog Monitor".

Edit: I think unchecking the "Hide modes that this monitor cannot display" checkbox is what unlocks the highest resolutions on XP. That, plus some ReForce registry hacks or whatever it does. But on Windows 7 that same checkbox is forced on! It's checked and grayed out, preventing me from unchecking it. Why...!

Any suggestions as to what monitor driver to download (and where to get it)?

I tried "Updating Driver" from Default PnP Monitor to a Sony CPD-G520 (I selected from the list of all the available monitor drivers), which is another 21" CRT that supports up to 2048x1568, but absolutely nothing changed other than that it was showing up as a G520. Rebooting didn't help.

You said you disabled the LCD, but have you tried completely unplugging it?
I suppose I should try this... But it'd be useless if it only worked when 1 monitor is plugged in.
 
I was having multiple CRT problems with Win 7 and the 185 drivers. Dropping back to 182.50 fixed them. It's worth a try in your case if nothing else works.

My problem was that Win 7 with the 185 drivers seemed to think my Mitsubishi 2070SB needed to be run at 1600x1200 all the time. I could run 85 Hz OK. I'd tell it to run 1280x960, and at one point I'd get that resolution, but it would look "scaled", not sharp at all. Checking the monitor's OSD would show that it was actually still running at 1600x1200. At some point this scaling went away and I'd just get a smaller 1280x960 window in the middle of the display. Things run perfectly with the 182.50 drivers. Your problem doesn't seem to be the same, but it's worth a try.
 
You can also edit the registry to add the modes supported by your monitor. If XP can properly detect the monitor but 7 cannot, it is most likely a faulty driver.
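
One route I've seen documented (an EDID override, which I believe Windows has supported since Vista; it's normally installed through a monitor INF rather than written by hand) stores replacement EDID blocks under the monitor's Device Parameters key. Very rough sketch below; the key path and the EDID bytes are placeholders you'd have to fill in from your own machine, and a direct write may well be refused on the protected Enum branch, in which case the monitor-INF route is the proper way.

/* Very rough sketch, not a drop-in fix: write one 128-byte EDID block as
   binary value "0" under the monitor's EDID_OVERRIDE key. The key path is a
   placeholder - look up the real monitor ID and instance in regedit under
   HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. The EDID bytes here are zeros;
   you'd paste in a real dump (e.g. captured under XP) with a valid checksum.
   Link against advapi32 and run elevated. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Placeholder path - replace the DUMMY parts with your real values. */
    const char *path =
        "SYSTEM\\CurrentControlSet\\Enum\\DISPLAY\\DUMMY123\\"
        "DUMMY_INSTANCE\\Device Parameters\\EDID_OVERRIDE";

    BYTE edid[128] = {0};   /* placeholder: real 128-byte EDID goes here */
    HKEY key;
    LONG r;

    r = RegCreateKeyExA(HKEY_LOCAL_MACHINE, path, 0, NULL, 0,
                        KEY_SET_VALUE, NULL, &key, NULL);
    if (r != ERROR_SUCCESS) {
        printf("Could not open/create key (error %ld)\n", r);
        return 1;
    }

    r = RegSetValueExA(key, "0", 0, REG_BINARY, edid, sizeof(edid));
    if (r == ERROR_SUCCESS)
        printf("EDID block written\n");
    else
        printf("Write failed (error %ld)\n", r);

    RegCloseKey(key);
    return 0;
}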
 
Windows 7 defaulted to my CRT's max resolution: Link

The problem I'm having is that some games are STUCK in this resolution no matter what I do. I can change the desktop resolution no problem, but if I try to change the resolution in Far Cry 2, for example, it will say the resolution has changed while my monitor stays at 2304x1440 regardless. Any fixes for this?

Thanks.

PC: Win 7 RC build 7100 / latest Win 7 drivers from Nvidia (185.85) / EVGA GTX 260.
 
I'm curious how this works out also... I've got my FW900 working fine most of the time (EDIT: on Vista... I still have issues with a few DirectX games... can't get Civ4 to run at 85 Hz even with ATI Tray Tools / RefreshLock / etc.)... need to know that decent CRT support will still exist.
 
Subscribed. Although Vista is working fine for me, Win 7 had better be able to support my FW900 without issue before I even think about switching...
 
I think he's saying: "get an LCD." But I dunno, I like my CRT; it's trouble-free and works (in Vista), so I hope this gets sorted out, because I noticed I also couldn't get 100 Hz in Win 7. I could get 1280x1024@85, which is what I like these days anyway, so no biggie.


I know, I was just being an ass back... he should be banned for threadcrapping.
 
I have 2 x 21" CRT's running each at 1600x1200@85Hz with no problems in Win7 /shrug

I have a 22" Mitsubishi Diamond Pro 2060u CRT running 1600x1200 @ 85Hz. Problem is, it should be running at 100Hz. Had no problems running @ 100 on XP. Vista 64 & Win7 64 cap it at 85. I'm assuming it's a driver issue as my driver hasn't been updated in 3-4 years. Oh well.
 
Instead of running at 1600x1200, try running at something different like 1280x960. With the 185/186 drivers you'll likely get "scaled" 1280x960, where it looks like fuzzy 1280x960 but the monitor is still actually running at 1600x1200. Sometimes instead of scaled you'll just get the 1280x960 unscaled in the middle of the 1600x1200. The 182.50 drivers work ok. This doesn't happen in Vista64 with any of the current drivers, just Win 7.
 
Anyone find a solution to this? I'm having trouble getting a 24" LCD to work above 1600x1200 off the VGA port on my laptop. Nvidia custom resolutions don't work. Nothing works. Grrrr.
 
bradsh, shurcool, can you run dispdiag from the command line and host the output? I will take a look at your modes and see if anything is getting pruned that shouldn't be.
 
Umm, it has been quite a long time, so I'm not sure if I can do it now. But I'll try to do it later.
 
Sorry about that, didn't realize that it was raised via necro post. Does the issue still occur?
 
Well, I've kinda given up on Windows 7 RC on my desktop (but I do use 7 on my laptops), so I'm still using XP there. I haven't had a lot of time to play any games lately, so the CRT isn't being used much anyway.

In other words, no, I haven't been able to get my CRT to work as smoothly under 7 RC as it does under XP.
 
It seems to be a given that 7 has crappy support for refresh rates other than 60 or 75 Hz.

Interpreting the data from the monitor is the responsibility of your video driver and video card. The video driver supplied by your vid card manufacturer sucks.
 
The video driver supplied by your vid card manufacturer sucks.
But the same vid card manufacturer manages to make decent drivers for XP that do support the higher resolutions properly (at least, I was able to get it to work, and it was much less of a hassle).

I'd say Windows 7 plays at least some role in this, even if indirectly.
 
Interpreting the data from the monitor is the responsibility of your video driver and video card. The video driver supplied by your vid card manufacturer sucks.
You know what, on second thought, I don't think that's right.

At least, that's only half of the ball game.

The other half is the monitor driver that says what the monitor supports.

The video card driver can only go with what *both* the video card and the monitor support. Even if the video card supports 160 Hz, it can't do it unless the monitor can accept it.

So the problem is likely in the Windows 7 *default* CRT monitor driver. It plays it safe and limits everything to 60/75 Hz.

Whereas the default CRT monitor driver in XP allows you to do whatever the hell you want.
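
To put rough numbers on the "the monitor has to be able to accept it" part (back-of-the-envelope only; the ~5% vertical blanking and the 121 kHz scan limit are just assumptions, check your monitor's spec sheet or EDID range descriptor for the real figure):

/* Back-of-the-envelope check: does a given mode fit inside the monitor's
   advertised horizontal scan range? Assumes roughly 5% vertical blanking,
   which is only an approximation of real GTF/DMT timings. */
#include <stdio.h>

/* Horizontal scan frequency in kHz needed for a mode, approximately:
   refresh rate * total lines (active lines + ~5% blanking). */
static double hfreq_khz(int active_lines, double refresh_hz)
{
    double total_lines = active_lines * 1.05;   /* assumption: ~5% blanking */
    return refresh_hz * total_lines / 1000.0;
}

int main(void)
{
    /* Example limit only: many 21" CRTs top out somewhere around
       115-140 kHz; the real number is on the spec sheet. */
    double monitor_max_khz = 121.0;

    struct { int w, h; double hz; } modes[] = {
        {2048, 1536, 60.0}, {2048, 1536, 75.0},
        {1920, 1440, 72.0}, {1600, 1200, 100.0},
    };

    for (int i = 0; i < 4; i++) {
        double need = hfreq_khz(modes[i].h, modes[i].hz);
        printf("%dx%d@%.0fHz needs ~%.0f kHz -> %s\n",
               modes[i].w, modes[i].h, modes[i].hz, need,
               need <= monitor_max_khz ? "OK" : "monitor can't scan that fast");
    }
    return 0;
}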
 
You're partially correct: TMM in Vista and CCD in Win7 handle the OS side of mode pruning. In XP, the driver was entirely responsible. A combination of factors goes into the pruning and display of modes. You can search the web for it (try, for example, "wddm mode pruning" without the quotes) and find a number of posts about how it works.
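
A quick way to actually see the pruning (rough sketch only; what EDS_RAWMODE returns depends on the adapter driver): enumerate the mode list twice, once normally and once with the raw flag, and compare the counts.

/* Rough sketch: count the modes the OS offers after pruning vs. the raw
   modes the adapter driver reports (EDS_RAWMODE). Link against user32. */
#include <windows.h>
#include <stdio.h>

#ifndef EDS_RAWMODE
#define EDS_RAWMODE 0x00000002
#endif

static DWORD count_modes(DWORD flags)
{
    DEVMODEA dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    while (EnumDisplaySettingsExA(NULL, i, &dm, flags))  /* NULL = primary */
        i++;
    return i;
}

int main(void)
{
    DWORD pruned = count_modes(0);
    DWORD raw    = count_modes(EDS_RAWMODE);

    printf("modes offered after pruning: %lu\n", pruned);
    printf("raw modes from the driver:   %lu\n", raw);
    printf("pruned away:                 %lu\n", raw > pruned ? raw - pruned : 0);
    return 0;
}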

Did you ever check with your monitor's OSD in XP to see if your monitor was actually at 100Hz?
 
You're partially correct: TMM in Vista and CCD in Win7 handle the OS side of mode pruning.
I see. That makes sense.

Did you ever check with your monitor's OSD in XP to see if your monitor was actually at 100Hz?
Of course! I've done much more than that. :p

The max I can get in XP using my CRT is 150 Hz, and it's very noticeable. When moving the mouse around, the distance between two successive mouse pointer positions becomes much less. But it's even easier to see the effect of 150 Hz vs. 60 Hz when you fire up any first person shooter that runs at 150 fps and try looking around. It is sooo much more smooth at 150! :D
 
When moving the mouse around, the distance between two successive mouse pointer positions becomes much less.

How in the world would increasing the monitor's refresh rate decrease the amount the pointer moves? They're completely independent actions.

As for the FPS thing, of course: more frames can be displayed by the monitor for an action sequence, and thus it would appear smoother.
 
But it's even easier to see the effect of 150 Hz vs. 60 Hz when you fire up any first person shooter that runs at 150 fps and try looking around. It is sooo much more smooth at 150! :D

I'd have to test this assertion in a double-blind fashion: randomly have your program choose between 60 Hz, 75 Hz, and 150 Hz rates and different framerates, labeling each with "A", "B", "C", etc. Compare any two, rank them as better, worse, or no difference. Repeat numerous times and see if there's a statistical correlation.

I also have to question what you're saying because 1) many or most FPSes cap out below 150 fps, 2) most people play FPSes with vsync off, so there will be tearing, and 3) mouse distance is such that even if you could draw 150 fps, the pointer would still skip across the screen faster than could be tracked; traversing my screen(s) means traversing > 2000 pixels, and it's done in a fraction of a second.
 
How in the world would increasing the monitor's refresh rate decrease the amount the pointer moves?
It doesn't make the pointer move slower. It just moves less between successive frames.

At 60 Hz, the time between two successive frames is 1/60 of a second, so the pointer covers a fair distance in that time.

At 150 Hz, the time between two successive frames is much shorter, so the pointer moves only a little in that time, and the next frame shows the cursor much closer to where it was in the previous frame.

So it just makes the cursor motion look much smoother and more consistent.
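
Just to put numbers on it (quick sketch; the 1800 px/s cursor speed is an arbitrary assumption):

/* Quick arithmetic: how far a cursor moving at a fixed speed travels
   between two successive frames at different refresh rates. */
#include <stdio.h>

int main(void)
{
    double speed_px_per_s = 1800.0;          /* assumed cursor speed */
    double rates[] = {60.0, 75.0, 150.0};

    for (int i = 0; i < 3; i++) {
        double frame_ms = 1000.0 / rates[i];
        double step_px  = speed_px_per_s / rates[i];
        printf("%5.0f Hz: %.1f ms between frames, cursor jumps ~%.0f px per frame\n",
               rates[i], frame_ms, step_px);
    }
    return 0;
}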
 