I have an ATI X1300 with a Gateway 1680x1050 monitor attached to the DVI port and a Samsung 1920x1080 TV attached to the VGA port. Under Windows XP, both displays worked fine at their native resolutions. (And defaulted to them, too.)

After installing Windows 7, the Gateway worked fine, but the Samsung came up with "mode not supported". I can set it as high as 1600x1200 (which I wouldn't expect to work, since it doesn't have 1200 lines!) and it displays, but going to 1920x1080 doesn't work.

On a whim, I tried installing the Catalyst Control Center. During the install process, it set the TV to a working 1920x1080. Great, I thought. But when I rebooted, I found that the TV now came up at 1024x768, and any attempt to set it back to 1920x1080 fails.

What the hell is going on? This is the exact same hardware that worked fine with the outdated OS, so shouldn't it work fine with the new one? Does anybody know how I can force the TV to the correct resolution? Since it even handles being fed more lines than it has, I thought maybe it was just getting the wrong refresh rate, but unchecking the automatic option in CCC and setting it to 1920x1080 @ 60 still gives the "mode not supported" screen.
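
If nobody has a driver-side fix, I'm half tempted to force the mode programmatically. Here's a rough, untested sketch using the Win32 ChangeDisplaySettingsEx API; note that the TV enumerating as display device index 1 is my assumption and would need checking against the printed device name:

    /* Rough sketch (untested): try to force 1920x1080 @ 60 Hz on the second
       display via ChangeDisplaySettingsEx. Assumes the VGA-attached TV
       enumerates as device index 1; verify with the device name printed. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DISPLAY_DEVICE dd = { .cb = sizeof(dd) };
        if (!EnumDisplayDevices(NULL, 1, &dd, 0)) {  /* index 1 = assumption */
            fprintf(stderr, "No second display device found\n");
            return 1;
        }
        printf("Targeting device: %s\n", dd.DeviceName);

        DEVMODE dm = { .dmSize = sizeof(dm) };
        dm.dmPelsWidth        = 1920;
        dm.dmPelsHeight       = 1080;
        dm.dmDisplayFrequency = 60;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

        LONG r = ChangeDisplaySettingsEx(dd.DeviceName, &dm, NULL,
                                         CDS_UPDATEREGISTRY, NULL);
        printf("ChangeDisplaySettingsEx returned %ld%s\n", r,
               r == DISP_CHANGE_SUCCESSFUL ? " (success)" : " (failed)");
        return 0;
    }

My thinking: if that call comes back DISP_CHANGE_BADMODE, it means the driver itself is rejecting the mode, which would point at a bad or missing EDID coming over the VGA connection rather than the TV refusing the signal.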