Please help! Non-native resolutions on an LCD.

Keetha

I've heard things about games and desktops not looking as good on LCDs when displayed at a non-native resolution. Is this true? If so, what exactly is wrong with them? I ask because I am pretty much decided on the 2005FPW, but I was wondering what I am going to do when newer games come out. I like to play games with graphics maxed, or near maxed, but with these newer games and my older hardware, I would have to lower the details. I just want to lower the resolution and keep the details, but I don't know if this is possible, or, if it is, what side effects there will be. Any help will be appreciated.
 
Sounds like you may need a new video card as well. If you lower the res on the 2005, you'll probably have to have it stretch to fit the screen or have black bars on the sides. I've noticed that running an LCD out of its native res is like having very old hardware anyway, with all the jerking and stuttering you'll get. I'm in the same boat, though; I want that monitor bad, but I don't know if my 6600GT can handle it the way I'd want. But that's why there's always bigger and better, so you have something to shoot for. Maybe the 6800GT will come down in cost someday soon :eek:
 
Non-native resolutions generally don't look that great on LCDs. I believe the quality of the interpolation varies among models, but in general they are designed to be used at native resolutions.
 
My brother has a Hyundai L90D+ 19" LCD and with his 9800PRO flashed to XT he plays CS: S at 1280x1024 (native res) with no framerate slowdowns.
 
Part of having a fancy LCD is keeping your hardware current so you can play at those high resolutions. I've heard interpolation isn't so bad these days; you notice it mostly in text, not so much in games. But that's only secondhand info from my gaming friends; I've never compared the quality of interpolated resolutions to native myself.
 
Well, I have a 6800GT now (410/1100), so I should be good for current games, but I was just thinking about next year when they release Unreal 3, or whenever they release it. I want everything to be maxed, and then I'd just lower my resolution in the game only. I have one semi-vote that it isn't too noticeable in games; anyone else have an opinion?

And what exactly is interpolation?
 
winston856 said:
My brother has a Hyundai L90D+ 19" LCD and with his 9800PRO flashed to XT he plays CS: S at 1280x1024 (native res) with no framerate slowdowns.

That monitor is good for gaming, BAD for watching movies.
 
Well, I can't see any reason for me to be watching DVDs on my comp so I'm ok there.
 
So can anyone explain this whole interpolation-in-games thing to me, please?
 
Well, as far as I know, LCD monitors are manufactured at a set resolution; their screen is exactly as many pixels wide and tall as their 'native' resolution. When you select a resolution below this, the screen has to readjust and stretch fewer pixels across the same area. This makes the desktop look worse, so people don't usually do it unless they're in a pinch and don't have a graphics card that can cope with the native resolution.
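Here's a rough sketch of the stretching idea in Python (purely illustrative; a real monitor's scaler is dedicated hardware, not code like this). It stretches a 4-pixel row onto a 7-pixel "panel" with nearest-neighbor sampling, and you can see that some source pixels get duplicated while others don't; that uneven stretching is the quality loss people are talking about:

```python
# Toy nearest-neighbor scaler (illustration only, not real scaler hardware):
# stretch a 1-D row of pixel values onto a wider "panel" by copying, for each
# physical pixel, whichever source pixel is closest.

def nearest_neighbor_scale(row, native_width):
    src_width = len(row)
    return [row[x * src_width // native_width] for x in range(native_width)]

# A 4-pixel source row stretched onto a 7-pixel panel:
print(nearest_neighbor_scale([10, 20, 30, 40], 7))
# -> [10, 10, 20, 20, 30, 30, 40]
# Pixels 10, 20, and 30 each get doubled, but 40 doesn't: the stretching is
# uneven, which is why non-native resolutions look blocky or blurry.
```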

Hope this helps. :)
 
Thanks for the input, winston. That's pretty much what I've gathered so far. My question is this: how much "less quality" does it get? Is it barely noticeable, noticeable, horrible, ugly, run-away-screaming, or what?
 
Keetha said:
Thanks for the input, winston. That's pretty much what I've gathered so far. My question is this: how much "less quality" does it get? Is it barely noticeable, noticeable, horrible, ugly, run-away-screaming, or what?

Well, on the first LCDs that were out, it was like night and day. But now that LCD technology has improved so much, it's really not that noticeable at all. You still do notice it, but it's not *that* bad. :)
 
The reason for a native resolution is that an LCD panel is manufactured with a specific number of pixels. Obviously the display looks its best when the input signal is for exactly that many pixels. Interpolation is what happens when the input signal's resolution doesn't match the monitor's native resolution, so the monitor basically has to guess (interpolate) which color to make each pixel so the image looks its best. For example, say I have a black screen with a white line running down the middle of it, but because I'm not running the native resolution, the white line technically falls on the boundary between two columns of pixels. The monitor has to decide what to do with that signal. Should it display just one column as white? If so, which one? Should it display them both as white? Or maybe neither?

Obviously this has to be done dynamically by some form of control system in the monitor. Some manufacturers have very good control systems that handle interpolation well; others have crappy ones, and their monitors interpolate noticeably worse.
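To make the white-line example concrete, here's a toy Python version of it. I'm assuming the scaler uses simple linear filtering, which is just one common approach; actual monitor control systems vary. A single white pixel in a 5-pixel row gets stretched onto an 8-pixel panel, and instead of one crisp white column you get several dimmer gray ones:

```python
# Toy linear-interpolation scaler (one common filtering approach; actual
# monitor control systems vary): sample the source row at fractional
# positions and blend the two nearest source pixels.

def linear_sample(row, x):
    left = int(x)
    right = min(left + 1, len(row) - 1)
    frac = x - left
    return row[left] * (1 - frac) + row[right] * frac

src = [0, 0, 255, 0, 0]          # black row with one crisp white pixel
native_width = 8                 # stretch 5 source pixels onto 8 physical ones
step = (len(src) - 1) / (native_width - 1)
panel = [round(linear_sample(src, x * step)) for x in range(native_width)]
print(panel)
# -> [0, 0, 36, 182, 182, 36, 0, 0]
# The single white pixel (255) has been smeared across four columns of
# varying gray; that's the softness you see at non-native resolutions.
```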
 
Like people have said, it depends on the brand, but generally it's pretty bad. Very noticeable to anyone who looks at a native res, then at a screen that's at a non-native res. Even with the best interpolation tech in a monitor, it's gonna be noticeable.

According to a lot of people on these forums, Dells usually do it best. I use Dell UltraSharp 17" screens in the computer lab at my college, and they're set at 1024x768 (God only knows why); it looks horrible IMHO. I set them to native res and it looks 10x better.
 
Darth Bagel said:
The reason for a native resolution is that an LCD panel is manufactured with a specific number of pixels. Obviously the display looks its best when the input signal is for exactly that many pixels. Interpolation is what happens when the input signal's resolution doesn't match the monitor's native resolution, so the monitor basically has to guess (interpolate) which color to make each pixel so the image looks its best. For example, say I have a black screen with a white line running down the middle of it, but because I'm not running the native resolution, the white line technically falls on the boundary between two columns of pixels. The monitor has to decide what to do with that signal. Should it display just one column as white? If so, which one? Should it display them both as white? Or maybe neither?

Obviously this has to be done dynamically by some form of control system in the monitor. Some manufacturers have very good control systems that handle interpolation well; others have crappy ones, and their monitors interpolate noticeably worse.

Thank you for clarifying what I was trying to say :p
 
In my experience, text looks like crap on an LCD not running at its native res (or an even divisor of its native res). Games don't suffer as much, unless they have lots of text or lines (Diablo II, RPGs with inventory screens). FPS/action games seem to suffer the least from interpolation (but again, you'll still see it in any text). I imagine that the higher the native res on the LCD, the better the interpolation will look. Dell's seems pretty good to me (on the 2001FP, anyway).

If you're worried about future games, do what I did: get a big LCD and either run at the LCD's native res (1600x1200) with less eye candy, or at half of it (800x600) with all the eye candy. :D
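For what it's worth, the reason the half-res trick can look clean (assuming the monitor's scaler does simple pixel doubling, which not every scaler actually does) is that each 800x600 pixel maps to an exact 2x2 block of physical pixels, so nothing has to be blended. A quick Python illustration:

```python
# Toy integer (2x) scaler: each source pixel becomes a clean 2x2 block of
# physical pixels, so no fractional blending (and no blur) is needed.
# (Assumption: the monitor's scaler really does plain pixel replication.)

def double_pixels(image):
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]  # each pixel twice
        out.append(doubled)
        out.append(list(doubled))                     # each row twice
    return out

src = [[0, 255],
       [255, 0]]
for row in double_pixels(src):
    print(row)
# [0, 0, 255, 255]
# [0, 0, 255, 255]
# [255, 255, 0, 0]
# [255, 255, 0, 0]
```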
 