Windows 7 won't handle two displays correctly

Jim S.

n00b
Joined
Dec 23, 2005
Messages
59
I have an ATI X1300 with a Gateway 1680x1050 monitor attached to the DVI port and a Samsung 1920x1080 TV attached to the VGA port. Under Windows XP, both displays worked fine at their native resolutions (and defaulted to them, too). After installing 7, the Gateway worked fine, but the Samsung came up with "mode not supported". I can set it as high as 1600x1200 (which I wouldn't think would work, because it doesn't have 1200 lines!) and it works, but 1920x1080 doesn't.

On a whim, I tried installing the Catalyst Control Center. During the install process, it set the TV to a working 1920x1080. Great, I thought. But when I rebooted, I found that the TV now came up in 1024x768, and any attempt to set it back to 1920x1080 fails. What the hell is going on? This is the exact same hardware that worked fine with the OUTDATED OS, so shouldn't it work fine with the new one?

Does anybody know how I can force the TV to the correct resolution? Since it even handles being fed more lines than it has, I thought that maybe it was just getting the wrong refresh rate, but unchecking the automatic option in CCC and setting it to 1920x1080 @ 60 still gives the "mode not supported" screen.
 
You could switch display outputs: run VGA to the monitor and DVI/HDMI to the TV.

If not, you might want to try reduced blanking on the 1080p VGA signal to the TV.
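To see why reduced blanking might get the TV past "mode not supported", compare the timing totals for a normal 1920x1080 mode against the reduced-blanking version. The numbers below are the standard CVT and CVT-RB modelines for 1920x1080 (the same values the X.Org `cvt` tool prints); this is an illustrative sketch, not anything read from the OP's card or TV. The point is that shrinking the blanking intervals cuts the pixel clock from ~173 MHz to ~138.5 MHz at effectively the same refresh rate, which is much friendlier to a marginal analog VGA link:

```python
# Sketch: why reduced blanking can help a 1080p-over-VGA connection.
# Timing values are the standard CVT and CVT reduced-blanking (CVT-RB)
# modelines for 1920x1080 -- illustrative assumptions, not the OP's EDID.

def refresh_hz(pixel_clock_mhz, h_total, v_total):
    """Vertical refresh = pixel clock / total pixels per frame."""
    return pixel_clock_mhz * 1e6 / (h_total * v_total)

# (name, pixel clock in MHz, total horizontal pixels, total vertical lines)
modes = [
    ("1920x1080 CVT (normal blanking)",    173.00, 2576, 1120),
    ("1920x1080 CVT-RB (reduced blanking)", 138.50, 2080, 1111),
]

for name, clk, h_total, v_total in modes:
    hz = refresh_hz(clk, h_total, v_total)
    print(f"{name}: {clk:.2f} MHz pixel clock, ~{hz:.2f} Hz refresh")
```

Both modes land at roughly 60 Hz; the reduced-blanking one just spends far fewer clock cycles on invisible border pixels, so the DAC and cable carry a ~20% lower pixel clock for the same picture.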
 
this is a driver issue on an old card. i know for a fact that win7 can handle it just fine, as i run 1600x1200 on one monitor and 1680x1050 on my other. im willing to bet that a fix is a long way off due to AMD dropping WDDM 1.1 support from all non-DX10+ cards.
 
If not, you might want to try reduced blanking on the 1080p VGA signal to the TV.

Where is that option? I remember hearing the term, but I can't find it in the controls anymore.

I saw that they weren't supporting this card anymore when I went to download CCC, but it's infuriating that it worked with an OLDER version of Windows. I shouldn't have to upgrade to a fancy gaming card when I don't game! All I need is dual outputs; the only reason I went as high as the X1300 is that my motherboard only had PCI Express.
 
no need for a fancy gaming card at all.
http://www.newegg.com/Product/Produ...ption=&Ntk=&CFG=&SpeTabStoreType=&srchInDesc=

any of those $25-$50 cards will work fine with everything. hardware can't be supported forever, or we'd never get past any limitations.

I would avoid any of those cards that don't have dual DVI. VGA is pretty much obsolete.
Also, VGA doesn't support 1080i AFAIK. I have heard that it supports 1080p, but I wouldn't even bother. Just go DVI.
 
I would avoid any of those cards that don't have dual DVI. VGA is pretty much obsolete.
Also, VGA doesn't support 1080i AFAIK. I have heard that it supports 1080p, but I wouldn't even bother. Just go DVI.

while this is not bad advice, VGA will work fine for his needs, and as long as the TV gets the DVI output off the card, I don't see a problem with it.
 
Have you tried using the latest Vista driver from ATI?

Though it is true that cards up through the X1900 series have been dropped from Windows 7 driver releases, ATI has noted that support will continue in their Windows Vista drivers, which work just fine in Windows 7.
 